Sample records for describing function analysis

  1. Closed-loop, pilot/vehicle analysis of the approach and landing task

    NASA Technical Reports Server (NTRS)

    Anderson, M. R.; Schmidt, D. K.

    1986-01-01

    In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith are also presented.
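
    For orientation, a representative quadratic objective of the kind used in optimal-control pilot models is shown below; the specific outputs and weights used in the paper are not reproduced here, so this is only an illustrative sketch.

      J = E\left\{ \lim_{T \to \infty} \frac{1}{T} \int_0^T \left( y^{\top} Q\, y + u^{\top} R\, u + \dot{u}^{\top} G\, \dot{u} \right) dt \right\}

    Here y collects the vehicle outputs the pilot observes, u is the pilot's control input, and Q, R, G are task-dependent weighting matrices; in this framework the pilot describing functions follow from the resulting optimal-control (LQG) solution together with assumed observation noise, motor noise, and time delay.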

  2. Describing Function Techniques for the Non-Linear Analysis of the Dynamics of a Rail Vehicle Wheelset

    DOT National Transportation Integrated Search

    1975-07-01

    The describing function method of analysis is applied to investigate the influence of parametric variations on wheelset critical velocity. In addition, the relationship between the amplitude of sustained lateral oscillations and critical speed is der...

  3. A non-linear regression analysis program for describing electrophysiological data with multiple functions using Microsoft Excel.

    PubMed

    Brown, Angus M

    2006-04-01

    The objective of this present study was to demonstrate a method for fitting complex electrophysiological data with multiple functions using the SOLVER add-in of the ubiquitous spreadsheet Microsoft Excel. SOLVER minimizes the sum of the squared differences between the data to be fit and the function(s) describing the data using an iterative generalized reduced gradient method. While it is a straightforward procedure to fit data with linear functions, and we have previously demonstrated a method of non-linear regression analysis of experimental data based upon a single function, it is more complex to fit data with multiple functions, usually requiring specialized expensive computer software. In this paper we describe an easily understood program for fitting experimentally acquired data, in this case the stimulus-evoked compound action potential from the mouse optic nerve, with multiple Gaussian functions. The program is flexible and can be applied to describe data with a wide variety of user-input functions.
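
    A minimal sketch of the same fitting idea outside Excel, assuming Python with NumPy/SciPy in place of the SOLVER add-in; the synthetic data, the two-component model, and the initial guesses are invented for illustration only.

      import numpy as np
      from scipy.optimize import curve_fit

      def sum_of_gaussians(x, *params):
          """Sum of Gaussians; params = (amp1, mu1, sigma1, amp2, mu2, sigma2, ...)."""
          y = np.zeros_like(x, dtype=float)
          for amp, mu, sigma in zip(params[0::3], params[1::3], params[2::3]):
              y += amp * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))
          return y

      # Synthetic two-peak waveform standing in for a compound action potential, plus noise.
      x = np.linspace(0.0, 10.0, 500)
      y = sum_of_gaussians(x, 1.0, 3.0, 0.5, 0.6, 5.5, 1.0)
      y += 0.02 * np.random.default_rng(0).normal(size=x.size)

      p0 = [1.0, 3.0, 1.0, 0.5, 6.0, 1.0]          # one (amp, mu, sigma) guess per component
      popt, pcov = curve_fit(sum_of_gaussians, x, y, p0=p0)
      print(np.round(popt, 3))                     # fitted parameters, one triplet per Gaussian

    As in the spreadsheet version, the quality of the fit depends strongly on sensible starting values for each component.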

  4. Mission analysis for cross-site transfer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riesenweber, S.D.; Fritz, R.L.; Shipley, L.E.

    1995-11-01

    The Mission Analysis Report describes the requirements and constraints associated with the Transfer Waste Function as necessary to support the Manage Tank Waste, Retrieve Waste, and Process Tank Waste Functions described in WHC-SD-WM-FRD-020, Tank Waste Remediation System (TWRS) Functions and Requirements Document and DOE/RL-92-60, Revision 1, TWRS Functions and Requirements Document, March 1994. It further assesses the ability of the "initial state" (or current cross-site transfer system) to meet the requirements and constraints.

  5. Some computational techniques for estimating human operator describing functions

    NASA Technical Reports Server (NTRS)

    Levison, W. H.

    1986-01-01

    Computational procedures for improving the reliability of human operator describing functions are described. Special attention is given to the estimation of standard errors associated with mean operator gain and phase shift as computed from an ensemble of experimental trials. This analysis pertains to experiments using sum-of-sines forcing functions. Both open-loop and closed-loop measurement environments are considered.
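
    The sketch below (not Levison's code; the sample rate, forcing frequencies, and the toy "operator" are assumptions) illustrates one way to estimate gain and phase at sum-of-sines forcing frequencies for each trial and attach standard errors to the ensemble means.

      import numpy as np

      fs = 100.0                                          # sample rate in Hz (assumed)
      t = np.arange(0.0, 60.0, 1.0 / fs)                  # one 60-second trial
      forcing_freqs = np.array([0.15, 0.45, 1.05, 2.25])  # assumed sum-of-sines frequencies (Hz)

      def response_at(x, y, freqs, fs):
          """Complex ratio of output to input Fourier components at the given frequencies."""
          f = np.fft.rfftfreq(len(x), 1.0 / fs)
          X, Y = np.fft.rfft(x), np.fft.rfft(y)
          idx = [int(np.argmin(np.abs(f - fk))) for fk in freqs]
          return Y[idx] / X[idx]

      rng = np.random.default_rng(1)
      H = []
      for _ in range(8):                                  # ensemble of 8 simulated trials
          phases = rng.uniform(0.0, 2.0 * np.pi, size=forcing_freqs.size)
          x = sum(np.sin(2.0 * np.pi * fk * t + ph) for fk, ph in zip(forcing_freqs, phases))
          y = 2.0 * x + 0.2 * rng.normal(size=t.size)     # toy "operator": pure gain of 2 plus remnant noise
          H.append(response_at(x, y, forcing_freqs, fs))
      H = np.array(H)                                     # trials x frequencies

      gain_db = 20.0 * np.log10(np.abs(H))
      phase_deg = np.degrees(np.angle(H))
      n = H.shape[0]
      for k, fk in enumerate(forcing_freqs):
          print(f"{fk:5.2f} Hz  gain {gain_db[:, k].mean():6.2f} dB (SE {gain_db[:, k].std(ddof=1) / np.sqrt(n):.2f})  "
                f"phase {phase_deg[:, k].mean():6.1f} deg (SE {phase_deg[:, k].std(ddof=1) / np.sqrt(n):.2f})")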

  6. The Identification of Software Failure Regions

    DTIC Science & Technology

    1990-06-01

    …be used to detect non-obviously redundant test cases. A preliminary examination of the manual analysis method is performed with a set of programs… Failure regions are defined and a method of failure region analysis is described in detail. The thesis describes how this analysis may be used to detect… [A failure] is the termination of the ability of a functional unit to perform its required function. (Glossary, 1983) The presence of faults in program code…

  7. Geometric Analysis of Wing Sections

    DOT National Transportation Integrated Search

    1995-04-01

    This paper describes a new geometric analysis procedure for wing sections. This procedure is based on the normal mode analysis for continuous functions. A set of special shape functions is introduced to represent the geometry of the wing section. The...

  8. Random harmonic analysis program, L221 (TEV156). Volume 1: Engineering and usage

    NASA Technical Reports Server (NTRS)

    Miller, R. D.; Graham, M. L.

    1979-01-01

    A digital computer program capable of calculating steady state solutions for linear second order differential equations due to sinusoidal forcing functions is described. The field of application of the program, the analysis of airplane response and loads due to continuous random air turbulence, is discussed. Optional capabilities including frequency dependent input matrices, feedback damping, gradual gust penetration, multiple excitation forcing functions, and a static elastic solution are described. Program usage and a description of the analysis used are presented.
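
    As a worked illustration of the kind of solution such a program produces (not a description of its actual algorithm), the steady-state response of a single second-order equation m\ddot{x} + c\dot{x} + kx = F_0 \sin(\omega t) is

      x_{ss}(t) = X \sin(\omega t - \phi), \qquad X = \frac{F_0}{\sqrt{(k - m\omega^2)^2 + (c\omega)^2}}, \qquad \phi = \operatorname{atan2}(c\omega,\; k - m\omega^2),

    and for a linear system the response to continuous random turbulence is then characterized by weighting the squared frequency response at each frequency by the turbulence power spectrum.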

  9. SASS wind ambiguity removal by direct minimization. II - Use of smoothness and dynamical constraints

    NASA Technical Reports Server (NTRS)

    Hoffman, R. N.

    1984-01-01

    A variational analysis method (VAM) is used to remove the ambiguity of the Seasat-A Satellite Scatterometer (SASS) winds. The VAM yields the best fit to the data by minimizing an objective function S which is a measure of the lack of fit. The SASS data are described and the function S and the analysis procedure are defined. Analyses of a single ship report which are analogous to Green's functions are presented. The analysis procedure is tuned and its sensitivity is described using the QE II storm. The procedure is then applied to a case study of September 6, 1978, south of Japan.

  10. Functional Behavioral Assessment: A School Based Model.

    ERIC Educational Resources Information Center

    Asmus, Jennifer M.; Vollmer, Timothy R.; Borrero, John C.

    2002-01-01

    This article begins by discussing requirements for functional behavioral assessment under the Individuals with Disabilities Education Act and then describes a comprehensive model for the application of behavior analysis in the schools. The model includes descriptive assessment, functional analysis, and intervention and involves the participation…

  11. Production of Printed Indexes of Chemical Reactions. I. Analysis of Functional Group Interconversions

    ERIC Educational Resources Information Center

    Clinging, R.; Lynch, M. F.

    1973-01-01

    A program is described which identifies functional group interconversion reactions, hydrogenations, and dehydrogenations in a data base containing structures encoded as Wiswesser Line Notations. Production of the data base is briefly described. (17 references) (Authors)

  12. Research study on stabilization and control: Modern sampled-data control theory. Continuous and discrete describing function analysis of the LST system. [with emphasis on the control moment gyroscope control loop]

    NASA Technical Reports Server (NTRS)

    Kuo, B. C.; Singh, G.

    1974-01-01

    The dynamics of the Large Space Telescope (LST) control system were studied in order to arrive at a simplified model for computer simulation without loss of accuracy. The frictional nonlinearity of the Control Moment Gyroscope (CMG) Control Loop was analyzed in a model to obtain data for the following: (1) a continuous describing function for the gimbal friction nonlinearity; (2) a describing function of the CMG nonlinearity using an analytical torque equation; and (3) the discrete describing function and describing-function plots for the CMG frictional nonlinearity. Preliminary computer simulations are shown for the simplified LST system, first without, and then with analytical torque expressions. Transfer functions of the sampled-data LST system are also described. A final computer simulation is presented which uses elements of the simplified sampled-data LST system with analytical CMG frictional torque expressions.
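
    As a generic illustration of the continuous describing function idea (not the report's detailed CMG friction model), an ideal relay of output level ±F, a common idealization of Coulomb friction torque, driven by x(t) = A \sin(\omega t) has the describing function

      N(A) = \frac{4F}{\pi A},

    obtained by retaining only the fundamental component of the square-wave output; a limit cycle is then predicted at any point where the linear part of the loop satisfies G(j\omega) = -1/N(A).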

  13. Functional Relationships and Regression Analysis.

    ERIC Educational Resources Information Center

    Preece, Peter F. W.

    1978-01-01

    Using a degenerate multivariate normal model for the distribution of organismic variables, the form of least-squares regression analysis required to estimate a linear functional relationship between variables is derived. It is suggested that the two conventional regression lines may be considered to describe functional, not merely statistical,…

  14. A Top Level Analysis of Training Management Functions.

    ERIC Educational Resources Information Center

    Ackerson, Jack

    1995-01-01

    Discusses how to conduct a top-level analysis of training management functions to identify problems within a training system resulting from rapid growth, the acquisition of new departments, or mergers. The data gathering process and analyses are explained, training management functions and activities are described, and root causes and solutions…

  15. Classroom-Based Strategies to Incorporate Hypothesis Testing in Functional Behavior Assessments

    ERIC Educational Resources Information Center

    Lloyd, Blair P.; Weaver, Emily S.; Staubitz, Johanna L.

    2017-01-01

    When results of descriptive functional behavior assessments are unclear, hypothesis testing can help school teams understand how the classroom environment affects a student's challenging behavior. This article describes two hypothesis testing strategies that can be used in classroom settings: structural analysis and functional analysis. For each…

  16. Manipulations of Cartesian Graphs: A First Introduction to Analysis.

    ERIC Educational Resources Information Center

    Lowenthal, Francis; Vandeputte, Christiane

    1989-01-01

    Introduces an introductory module for analysis. Describes stock of basic functions and their graphs as part one and three methods as part two: transformations of simple graphs, the sum of stock functions, and upper and lower bounds. (YP)

  17. The Function sin x/x.

    ERIC Educational Resources Information Center

    Gearhart, William B.; Shultz, Harris S.

    1990-01-01

    Presents some examples from geometry: area of a circle; centroid of a sector; Buffon's needle problem; and expression for pi. Describes several roles of the trigonometric function in mathematics and applications, including Fourier analysis, spectral theory, approximation theory, and numerical analysis. (YP)

  18. Network analysis of mesoscale optical recordings to assess regional, functional connectivity.

    PubMed

    Lim, Diana H; LeDue, Jeffrey M; Murphy, Timothy H

    2015-10-01

    With modern optical imaging methods, it is possible to map structural and functional connectivity. Optical imaging studies that aim to describe large-scale neural connectivity often need to handle large and complex datasets. In order to interpret these datasets, new methods for analyzing structural and functional connectivity are being developed. Recently, network analysis, based on graph theory, has been used to describe and quantify brain connectivity in both experimental and clinical studies. We outline how to apply regional, functional network analysis to mesoscale optical imaging using voltage-sensitive-dye imaging and channelrhodopsin-2 stimulation in a mouse model. We include links to sample datasets and an analysis script. The analyses we employ can be applied to other types of fluorescence wide-field imaging, including genetically encoded calcium indicators, to assess network properties. We discuss the benefits and limitations of using network analysis for interpreting optical imaging data and define network properties that may be used to compare across preparations or other manipulations such as animal models of disease.
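
    A toy sketch of the basic workflow, using networkx and synthetic data rather than the authors' voltage-sensitive-dye recordings or analysis script: correlate regional time series, threshold the correlations into an adjacency matrix, and compute a few common graph metrics.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      activity = rng.normal(size=(20, 1000))          # 20 regions x 1000 time points (synthetic)

      corr = np.corrcoef(activity)                    # region-by-region functional connectivity
      np.fill_diagonal(corr, 0.0)
      adjacency = (np.abs(corr) > 0.05).astype(int)   # arbitrary threshold, for illustration only

      G = nx.from_numpy_array(adjacency)
      print("mean degree:", np.mean([d for _, d in G.degree()]))
      print("mean clustering coefficient:", nx.average_clustering(G))
      if nx.is_connected(G):                          # path length is only defined on a connected graph
          print("characteristic path length:", nx.average_shortest_path_length(G))

    In practice the threshold (or a weighted-graph alternative) and the choice of metrics should be justified against the imaging data rather than fixed arbitrarily as here.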

  19. HAL/SM system functional design specification. [systems analysis and design analysis of central processing units]

    NASA Technical Reports Server (NTRS)

    Ross, C.; Williams, G. P. W., Jr.

    1975-01-01

    The functional design of a preprocessor and subsystems is described. A structure chart and a data flow diagram are included for each subsystem. Also a group of intermodule interface definitions (one definition per module) is included immediately following the structure chart and data flow for a particular subsystem. Each of these intermodule interface definitions consists of the identification of the module, the function the module is to perform, the identification and definition of parameter interfaces to the module, and any design notes associated with the module. Also described are compilers and computer libraries.

  20. IDIMS/GEOPAK: Users manual for a geophysical data display and analysis system

    NASA Technical Reports Server (NTRS)

    Libert, J. M.

    1982-01-01

    The application of an existing image analysis system to the display and analysis of geophysical data is described, the potential for expanding the capabilities of such a system toward more advanced computer analytic and modeling functions is investigated. The major features of the IDIMS (Interactive Display and Image Manipulation System) and its applicability for image type analysis of geophysical data are described. Development of a basic geophysical data processing system to permit the image representation, coloring, interdisplay and comparison of geophysical data sets using existing IDIMS functions and to provide for the production of hard copies of processed images was described. An instruction manual and documentation for the GEOPAK subsystem was produced. A training course for personnel in the use of the IDIMS/GEOPAK was conducted. The effectiveness of the current IDIMS/GEOPAK system for geophysical data analysis was evaluated.

  1. Functional Analysis of All Salmonid Genomes (FAASG): an international initiative supporting future salmonid research, conservation and aquaculture

    USDA-ARS's Scientific Manuscript database

    We describe an emerging initiative - the 'Functional Analysis of All Salmonid Genomes' (FAASG), which will leverage the extensive trait diversity that has evolved since a whole genome duplication event in the salmonid ancestor, to develop an integrative understanding of the functional genomic basis ...

  2. Graph theory analysis of complex brain networks: new concepts in brain mapping applied to neurosurgery.

    PubMed

    Hart, Michael G; Ypma, Rolf J F; Romero-Garcia, Rafael; Price, Stephen J; Suckling, John

    2016-06-01

    Neuroanatomy has entered a new era, culminating in the search for the connectome, otherwise known as the brain's wiring diagram. While this approach has led to landmark discoveries in neuroscience, potential neurosurgical applications and collaborations have been lagging. In this article, the authors describe the ideas and concepts behind the connectome and its analysis with graph theory. Following this they then describe how to form a connectome using resting state functional MRI data as an example. Next they highlight selected insights into healthy brain function that have been derived from connectome analysis and illustrate how studies into normal development, cognitive function, and the effects of synthetic lesioning can be relevant to neurosurgery. Finally, they provide a précis of early applications of the connectome and related techniques to traumatic brain injury, functional neurosurgery, and neurooncology.

  3. Visually enhanced CCTV digital surveillance utilizing Intranet and Internet.

    PubMed

    Ozaki, Nobuyuki

    2002-07-01

    This paper describes a solution for integrated plant supervision utilizing closed circuit television (CCTV) digital surveillance. Three basic requirements are first addressed as the platform of the system, with discussion on the suitable video compression. The system configuration is described in blocks. The system provides surveillance functionality (real-time monitoring) and process analysis functionality (a troubleshooting tool). This paper describes the formulation of practical performance design for determining various encoder parameters. It also introduces image processing techniques for enhancing the original CCTV digital image to lessen the burden on operators. Some screenshots are listed for the surveillance functionality. For the process analysis, an image searching filter supported by image processing techniques is explained with screenshots. Multimedia surveillance, which is the merger with process data surveillance, or the SCADA system, is also explained.

  4. The analysis of mathematics teachers' learning on algebra function limit material based on teaching experience difference

    NASA Astrophysics Data System (ADS)

    Ma'rufi; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    The aim of this study was to describe mathematics teachers' learning (classroom instruction) on the limit of algebraic functions in terms of differences in teaching experience. The analysis focused on the teachers' Pedagogical Content Knowledge (PCK), specifically its pedagogical component: PCK for the limit of algebraic functions is the specialized knowledge of how to teach this topic so that students can understand it. The subjects were two high school mathematics teachers with different levels of teaching experience, one Novice Teacher (NT) and one Experienced Teacher (ET). Data were collected through classroom observation and video recordings of lessons and were analyzed qualitatively. Pedagogical knowledge is defined here as a teacher's knowledge and understanding of planning and organizing instruction and of applying learning strategies. The results showed that, in lessons on the limit of algebraic functions, subject NT tended to explain procedurally without giving reasons for the steps used, asked questions that were monotonous rather than guiding or probing, and varied little in the learning strategies used, whereas subject ET gave students limited guidance and the opportunity to find answers on their own, exploited students' potential to answer questions, provided opportunities for students to interact and work in groups, and tended to combine conceptual and procedural explanations.

  5. A Scalable Nonuniform Pointer Analysis for Embedded Programs

    NASA Technical Reports Server (NTRS)

    Venet, Arnaud

    2004-01-01

    In this paper we present a scalable pointer analysis for embedded applications that is able to distinguish between instances of recursively defined data structures and elements of arrays. The main contribution consists of an efficient yet precise algorithm that can handle multithreaded programs. We first perform an inexpensive flow-sensitive analysis of each function in the program that generates semantic equations describing the effect of the function on the memory graph. These equations bear numerical constraints that describe nonuniform points-to relationships. We then iteratively solve these equations in order to obtain an abstract storage graph that describes the shape of data structures at every point of the program for all possible thread interleavings. We bring experimental evidence that this approach is tractable and precise for real-size embedded applications.

  6. Small Oscillations via Conservation of Energy

    ERIC Educational Resources Information Center

    Troy, Tia; Reiner, Megan; Haugen, Andrew J.; Moore, Nathan T.

    2017-01-01

    The work describes an analogy-based small oscillations analysis of a standard static equilibrium lab problem. In addition to force analysis, a potential energy function for the system is developed, and by drawing out mathematical similarities to the simple harmonic oscillator, we are able to describe (and experimentally verify) the period of small…

  7. The structure and function of fungal cells

    NASA Technical Reports Server (NTRS)

    Nozawa, Y.

    1984-01-01

    The structure and function of fungal cell walls were studied with particular emphasis on dermatophytes. Extraction, isolation, analysis, and observation of the cell wall structure and function were performed. The structure is described microscopically and chemically.

  8. Project FAST: [Functional Analysis Systems Training]: Adopter/Facilitator Information.

    ERIC Educational Resources Information Center

    Essexville-Hampton Public Schools, MI.

    Presented is adopter/facilitator information of Project FAST (Functional Analysis Systems Training) to provide educational and support services to learning disordered children and their regular elementary teachers. Briefly described are the three schools in the Essexville-Hampton (Michigan) school district; objectives of the program; program…

  9. Advanced Connectivity Analysis (ACA): a Large Scale Functional Connectivity Data Mining Environment.

    PubMed

    Chen, Rong; Nixon, Erika; Herskovits, Edward

    2016-04-01

    Using resting-state functional magnetic resonance imaging (rs-fMRI) to study functional connectivity is of great importance to understand normal development and function as well as a host of neurological and psychiatric disorders. Seed-based analysis is one of the most widely used rs-fMRI analysis methods. Here we describe a freely available large scale functional connectivity data mining software package called Advanced Connectivity Analysis (ACA). ACA enables large-scale seed-based analysis and brain-behavior analysis. It can seamlessly examine a large number of seed regions with minimal user input. ACA has a brain-behavior analysis component to delineate associations among imaging biomarkers and one or more behavioral variables. We demonstrate applications of ACA to rs-fMRI data sets from a study of autism.

  10. Graphic design of pinhole cameras

    NASA Technical Reports Server (NTRS)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.

  11. A Factor Analysis of Functional Independence and Functional Assessment Measure Scores Among Focal and Diffuse Brain Injury Patients: The Importance of Bifactor Models.

    PubMed

    Gunn, Sarah; Burgess, Gerald H; Maltby, John

    2018-04-30

    Objective: To explore the factor structure of the UK Functional Independence Measure and Functional Assessment Measure (FIM+FAM) among focal and diffuse acquired brain injury patients. Design: Criterion standard. Setting: A National Health Service acute acquired brain injury inpatient rehabilitation hospital. Participants: Referred sample of N=447 adults admitted for inpatient treatment following an acquired brain injury significant enough to justify intensive inpatient neurorehabilitation. Intervention: Not applicable. Main outcome measures: Functional Independence Measure and Functional Assessment Measure. Results: Exploratory factor analysis suggested a 2-factor structure to FIM+FAM scores, among both focal-proximate and diffuse-proximate acquired brain injury aetiologies. Confirmatory factor analysis suggested that a 3-factor bifactor structure presented the best fit of the FIM+FAM score data across both aetiologies. However, across both analyses, a convergence was found towards a general factor, demonstrated by high correlations between factors in the exploratory factor analysis, and by a general factor explaining the majority of the variance in scores on confirmatory factor analysis. Conclusions: Our findings suggested that although factors describing specific functional domains can be derived from FIM+FAM item scores, there is a convergence towards a single factor describing overall functioning. This single factor informs the specific group factors (eg, motor, psychosocial, and communication function) after brain injury. Further research into the comparative value of the general and group factors as evaluative/prognostic measures is indicated. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  12. Program design by a multidisciplinary team. [for structural finite element analysis on STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Voigt, S.

    1975-01-01

    The use of software engineering aids in the design of a structural finite-element analysis computer program for the STAR-100 computer is described. Nested functional diagrams to aid in communication among design team members were used, and a standardized specification format to describe modules designed by various members was adopted. This is a report of current work in which use of the functional diagrams provided continuity and helped resolve some of the problems arising in this long-running part-time project.

  13. Automation Applications in an Advance Air Traffic Management System : Volume IIB : Functional Analysis of Air Traffic Management (Cont'd)

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...

  14. Automated Applications in an Advanced Air Traffic Management System : Volume 2B. Functional Analysis of Air Traffic Management (Cont'd.)

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...

  15. Automation Applications in an Advanced Air Traffic Management System : Volume 2A. Functional Analysis of Air Traffic Management.

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...

  16. Automation Applications in an Advanced Air Traffic Management System : Volume 2C. Functional Analysis of Air Traffic Management (Cont.'d)

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...

  17. INFANT SIGN TRAINING AND FUNCTIONAL ANALYSIS

    PubMed Central

    Normand, Matthew P; Machado, Mychal A; Hustyi, Kristin M; Morley, Allison J

    2011-01-01

    We taught manual signs to typically developing infants using a reversal design and caregiver-nominated stimuli. We delivered the stimuli on a time-based schedule during baseline. During the intervention, we used progressive prompting and reinforcement, described by Thompson et al. (2004, 2007), to establish mands. Following sign training, we conducted functional analyses and verified that the signs functioned as mands. These results provide preliminary validation for the verbal behavior functional analysis methodology and further evidence of the functional independence of verbal operants. PMID:21709786

  18. A basis for a visual language for describing, archiving and analyzing functional models of complex biological systems

    PubMed Central

    Cook, Daniel L; Farley, Joel F; Tapscott, Stephen J

    2001-01-01

    Background: We propose that a computerized, internet-based graphical description language for systems biology will be essential for describing, archiving and analyzing complex problems of biological function in health and disease. Results: We outline here a conceptual basis for designing such a language and describe BioD, a prototype language that we have used to explore the utility and feasibility of this approach to functional biology. Using example models, we demonstrate that a rather limited lexicon of icons and arrows suffices to describe complex cell-biological systems as discrete models that can be posted and linked on the internet. Conclusions: Given available computer and internet technology, BioD may be implemented as an extensible, multidisciplinary language that can be used to archive functional systems knowledge and be extended to support both qualitative and quantitative functional analysis. PMID:11305940

  19. Detailed requirements document for the integrated structural analysis system, phase B

    NASA Technical Reports Server (NTRS)

    Rainey, J. A.

    1976-01-01

    The requirements are defined for a software system entitled integrated Structural Analysis System (ISAS) Phase B which is being developed to provide the user with a tool by which a complete and detailed analysis of a complex structural system can be performed. This software system will allow for automated interface with numerous structural analysis batch programs and for user interaction in the creation, selection, and validation of data. This system will include modifications to the 4 functions developed for ISAS, and the development of 25 new functions. The new functions are described.

  20. Tomato functional genomics database (TFGD): a comprehensive collection and analysis package for tomato functional genomics

    USDA-ARS's Scientific Manuscript database

    Tomato Functional Genomics Database (TFGD; http://ted.bti.cornell.edu) provides a comprehensive systems biology resource to store, mine, analyze, visualize and integrate large-scale tomato functional genomics datasets. The database is expanded from the previously described Tomato Expression Database...

  1. Functional connectomics from resting-state fMRI

    PubMed Central

    Smith, Stephen M; Vidaurre, Diego; Beckmann, Christian F; Glasser, Matthew F; Jenkinson, Mark; Miller, Karla L; Nichols, Thomas E; Robinson, Emma; Salimi-Khorshidi, Gholamreza; Woolrich, Mark W; Barch, Deanna M; Uğurbil, Kamil; Van Essen, David C

    2014-01-01

    Spontaneous fluctuations in activity in different parts of the brain can be used to study functional brain networks. We review the use of resting-state functional MRI for the purpose of mapping the macroscopic functional connectome. After describing MRI acquisition and image processing methods commonly used to generate data in a form amenable to connectomics network analysis, we discuss different approaches for estimating network structure from that data. Finally, we describe new possibilities resulting from the high-quality rfMRI data being generated by the Human Connectome Project, and highlight some upcoming challenges in functional connectomics. PMID:24238796

  2. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    PubMed Central

    Seeley, Matthew K.; Francom, Devin; Reese, C. Shane; Hopkins, J. Ty

    2017-01-01

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function. PMID:29339984

  3. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach.

    PubMed

    Park, Jihong; Seeley, Matthew K; Francom, Devin; Reese, C Shane; Hopkins, J Ty

    2017-12-01

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.
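
    The contrast between the two approaches can be sketched as follows (synthetic curves and a plain pointwise paired t-test standing in for the mixed-model ANOVA and FANOVA actually used in the study).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n_subj, n_pts = 19, 101
      grid = np.linspace(0.0, 100.0, n_pts)             # percent of stance phase

      base = 10.0 * np.sin(np.pi * grid / 100.0)        # toy "joint angle" waveform
      control = base + rng.normal(0.0, 1.0, (n_subj, n_pts))
      effusion = base + rng.normal(0.0, 1.0, (n_subj, n_pts))
      effusion[:, 40:70] += 2.0                         # condition effect confined to mid-stance

      # Traditional discrete approach: compare only one summary value (the peak) per curve.
      t_peak, p_peak = stats.ttest_rel(control.max(axis=1), effusion.max(axis=1))

      # Functional (pointwise) approach: compare whole curves and locate where they differ.
      t_fun, p_fun = stats.ttest_rel(control, effusion, axis=0)
      different = grid[p_fun < 0.05]                    # no multiplicity correction in this toy example

      print(f"peak-only comparison: p = {p_peak:.3f}")
      print("pointwise p < 0.05 at stance percentages:", np.round(different).astype(int))

    The pointwise output shows not only that the conditions differ but also where in the stance phase the difference occurs, which is the practical advantage argued for above.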

  4. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jihong; Seeley, Matthew K.; Francom, Devin

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. Thus when using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  5. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    DOE PAGES

    Park, Jihong; Seeley, Matthew K.; Francom, Devin; ...

    2017-12-28

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. Thus when using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  6. Ozone data and mission sampling analysis

    NASA Technical Reports Server (NTRS)

    Robbins, J. L.

    1980-01-01

    A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
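
    A minimal sketch of an empirical orthogonal function (EOF) decomposition of a space-time data matrix via the singular value decomposition, in the spirit of the variance analysis described above; the data here are synthetic and the grid is abstract.

      import numpy as np

      rng = np.random.default_rng(0)
      data = rng.normal(size=(120, 500))           # 120 time samples x 500 grid points (synthetic)
      anom = data - data.mean(axis=0)              # remove the time mean at each grid point

      U, s, Vt = np.linalg.svd(anom, full_matrices=False)
      explained = s**2 / np.sum(s**2)              # fraction of variance captured by each EOF mode
      eofs = Vt                                    # spatial patterns (modes x grid points)
      pcs = U * s                                  # principal-component time series

      print("variance explained by the first 3 EOFs:", np.round(explained[:3], 3))

    Missing observations would first have to be filled (for example using the autocorrelation-based fill mentioned above) or handled with a method tolerant of gaps before the decomposition is applied.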

  7. Self-Injurious Behavior and Functional Analysis: Where Are the Descriptions of Participant Protections?

    ERIC Educational Resources Information Center

    Weeden, Marc; Mahoney, Amanda; Poling, Alan

    2010-01-01

    This study examined the reporting of participant protections in studies involving functional analysis and self-injurious behavior and published from 1994 through 2008. Results indicated that session termination criteria were rarely reported and other specific participant safeguards were seldom described. The absence of such information in no way…

  8. Self-Directed Student Research through Analysis of Microarray Datasets: A Computer-Based Functional Genomics Practical Class for Masters-Level Students

    ERIC Educational Resources Information Center

    Grenville-Briggs, Laura J.; Stansfield, Ian

    2011-01-01

    This report describes a linked series of Masters-level computer practical workshops. They comprise an advanced functional genomics investigation, based upon analysis of a microarray dataset probing yeast DNA damage responses. The workshops require the students to analyse highly complex transcriptomics datasets, and were designed to stimulate…

  9. Assessment and Treatment of Stereotypic Vocalizations in a Taiwanese Adolescent with Autism: A Case Study

    ERIC Educational Resources Information Center

    Wu, Ya-Ping; Mirenda, Pat; Wang, Hwa-Pey; Chen, Ming-Chung

    2010-01-01

    This case study describes the processes of functional analysis and modality assessment that were utilized to design a communication intervention for an adolescent with autism who engaged in loud and disruptive vocalizations for most of the school day. The functional analysis suggested that the vocalizations served both tangible and escape…

  10. VESUVIO Data Analysis Goes MANTID

    NASA Astrophysics Data System (ADS)

    Jackson, S.; Krzystyniak, M.; Seel, A. G.; Gigg, M.; Richards, S. E.; Fernandez-Alonso, F.

    2014-12-01

    This paper describes ongoing efforts to implement the reduction and analysis of neutron Compton scattering data within the MANTID framework. Recently, extensive work has been carried out to integrate the bespoke data reduction and analysis routines written for VESUVIO with the MANTID framework. While the programs described in this document are designed to replicate the functionality of the Fortran and Genie routines already in use, most of them have been written from scratch and are not based on the original code base.

  11. Director Field Analysis (DFA): Exploring Local White Matter Geometric Structure in Diffusion MRI.

    PubMed

    Cheng, Jian; Basser, Peter J

    2018-01-01

    In Diffusion Tensor Imaging (DTI) or High Angular Resolution Diffusion Imaging (HARDI), a tensor field or a spherical function field (e.g., an orientation distribution function field), can be estimated from measured diffusion weighted images. In this paper, inspired by the microscopic theoretical treatment of phases in liquid crystals, we introduce a novel mathematical framework, called Director Field Analysis (DFA), to study local geometric structural information of white matter based on the reconstructed tensor field or spherical function field: (1) We propose a set of mathematical tools to process general director data, which consists of dyadic tensors that have orientations but no direction. (2) We propose Orientational Order (OO) and Orientational Dispersion (OD) indices to describe the degree of alignment and dispersion of a spherical function in a single voxel or in a region, respectively; (3) We also show how to construct a local orthogonal coordinate frame in each voxel exhibiting anisotropic diffusion; (4) Finally, we define three indices to describe three types of orientational distortion (splay, bend, and twist) in a local spatial neighborhood, and a total distortion index to describe distortions of all three types. To our knowledge, this is the first work to quantitatively describe orientational distortion (splay, bend, and twist) in general spherical function fields from DTI or HARDI data. The proposed DFA and its related mathematical tools can be used to process not only diffusion MRI data but also general director field data, and the proposed scalar indices are useful for detecting local geometric changes of white matter for voxel-based or tract-based analysis in both DTI and HARDI acquisitions. The related codes and a tutorial for DFA will be released in DMRITool. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. A computer-aided approach to nonlinear control synthesis

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Anthony, Tobin

    1988-01-01

    The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be obtained by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the approach taken in this study to meet these goals, including an introduction to the INteractive Controls Analysis (INCA) program, which was instrumental in meeting the study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed in this study, the results of this project provide a comprehensive tool for design and integration of nonlinear control systems.
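
    A minimal numerical sketch of a single-input describing function, here for a unit-slope saturation of level delta; this is a generic textbook example, not the INCA implementation.

      import numpy as np

      def sidf(f, A, n=4096):
          """Gain of the fundamental of f(A sin(theta)) relative to the input amplitude A."""
          theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
          return (2.0 / (n * A)) * np.sum(f(A * np.sin(theta)) * np.sin(theta))

      delta = 1.0
      saturation = lambda x: np.clip(x, -delta, delta)

      for A in (0.5, 1.0, 2.0, 5.0):
          numeric = sidf(saturation, A)
          if A <= delta:
              exact = 1.0
          else:
              r = delta / A
              exact = (2.0 / np.pi) * (np.arcsin(r) + r * np.sqrt(1.0 - r * r))
          print(f"A = {A:3.1f}   N(A) numeric = {numeric:.4f}   closed form = {exact:.4f}")

    The numerical quadrature reproduces the closed-form SIDF, which is the quantity a describing-function design tool trades off against the linear part of the loop.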

  13. Lognormal Distribution of Cellular Uptake of Radioactivity: Statistical Analysis of α-Particle Track Autoradiography

    PubMed Central

    Neti, Prasad V.S.V.; Howell, Roger W.

    2010-01-01

    Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049–1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these earlier data. Methods: The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
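
    An illustrative simulation of the model structure discussed above (parameters are assumptions, not the paper's data): cellular mean track numbers drawn from a lognormal distribution, with observed track counts Poisson-distributed about each cell's mean, i.e. a Poisson-lognormal mixture.

      import numpy as np

      rng = np.random.default_rng(0)
      n_cells = 10_000
      mu, sigma = 1.0, 0.8                              # lognormal shape parameters (assumed)
      mean_tracks = rng.lognormal(mu, sigma, n_cells)   # "true" mean alpha-particle tracks per cell
      observed = rng.poisson(mean_tracks)               # autoradiographic track counts per cell

      # At high mean track numbers the Poisson counting layer adds little spread and the
      # observed counts stay close to lognormal; at low means it visibly distorts the tail.
      print("mean tracks:", observed.mean(), " variance:", observed.var())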

  14. Trade-Off Analysis between Concerns Based on Aspect-Oriented Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Laurito, Abelyn Methanie R.; Takada, Shingo

    The identification of functional and non-functional concerns is an important activity during requirements analysis. However, there may be conflicts between the identified concerns, and they must be discovered and resolved through trade-off analysis. Aspect-Oriented Requirements Engineering (AORE) has trade-off analysis as one of its goals, but most AORE approaches do not actually offer support for trade-off analysis; they focus on describing concerns and generating their composition. This paper proposes an approach for trade-off analysis based on AORE using use cases and the Requirements Conflict Matrix (RCM) to represent compositions. RCM shows the positive or negative effect of non-functional concerns over use cases and other non-functional concerns. Our approach is implemented within a tool called E-UCEd (Extended Use Case Editor). We also show the results of evaluating our tool.

  15. Information Services at the Nuclear Safety Analysis Center.

    ERIC Educational Resources Information Center

    Simard, Ronald

    This paper describes the operations of the Nuclear Safety Analysis Center. Established soon after an accident at the Three Mile Island nuclear power plant near Harrisburg, Pennsylvania, its efforts were initially directed towards a detailed analysis of the accident. Continuing functions include: (1) the analysis of generic nuclear safety issues,…

  16. Describing the Elephant: Structure and Function in Multivariate Data.

    ERIC Educational Resources Information Center

    McDonald, Roderick P.

    1986-01-01

    There is a unity underlying the diversity of models for the analysis of multivariate data. Essentially, they constitute a family of models, most generally nonlinear, for structural/functional relations between variables drawn from a behavior domain. (Author)

  17. Comparing rainfall patterns between regions in Peninsular Malaysia via a functional data analysis technique

    NASA Astrophysics Data System (ADS)

    Suhaila, Jamaludin; Jemain, Abdul Aziz; Hamdan, Muhammad Fauzee; Wan Zin, Wan Zawiah

    2011-12-01

    Normally, rainfall data is collected on a daily, monthly or annual basis in the form of discrete observations. The aim of this study is to convert these rainfall values into a smooth curve or function which could be used to represent the continuous rainfall process at each region via a technique known as functional data analysis. Since rainfall data shows a periodic pattern in each region, the Fourier basis is introduced to capture these variations. Eleven basis functions with five harmonics are used to describe the unimodal rainfall pattern for stations in the East while five basis functions which represent two harmonics are needed to describe the rainfall pattern in the West. Based on the fitted smooth curve, the wet and dry periods as well as the maximum and minimum rainfall values could be determined. Different rainfall patterns are observed among the studied regions based on the smooth curve. Using the functional analysis of variance, the test results indicated that there exist significant differences in the functional means between each region. The largest differences in the functional means are found between the East and Northwest regions and these differences may probably be due to the effect of topography and geographical location, and are mostly influenced by the monsoons. Therefore, the same inputs or approaches might not be useful in modeling the hydrological process for different regions.
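
    A sketch, under stated assumptions, of the kind of basis expansion involved: a toy monthly rainfall series smoothed with a Fourier basis of a constant plus harmonics of the annual cycle (five basis functions correspond to two harmonics and eleven to five harmonics, as in the text).

      import numpy as np

      months = np.arange(12)
      rain = np.array([170, 90, 100, 110, 120, 100, 110, 120, 140, 200, 260, 240], float)  # toy data

      def fourier_basis(t, n_harmonics, period=12.0):
          cols = [np.ones_like(t, dtype=float)]
          for k in range(1, n_harmonics + 1):
              cols += [np.sin(2.0 * np.pi * k * t / period), np.cos(2.0 * np.pi * k * t / period)]
          return np.column_stack(cols)

      B = fourier_basis(months, n_harmonics=2)          # 5 basis functions (two harmonics)
      coef, *_ = np.linalg.lstsq(B, rain, rcond=None)   # least-squares basis coefficients
      smooth = B @ coef                                 # fitted smooth annual rainfall curve
      print(np.round(smooth, 1))

    In the study itself the smoothing is applied to much longer records and the number of harmonics is chosen per region, but the mechanics are the same.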

  18. Transfer function characteristics of super resolving systems

    NASA Technical Reports Server (NTRS)

    Milster, Tom D.; Curtis, Craig H.

    1992-01-01

    Signal quality in an optical storage device greatly depends on the optical system transfer function used to write and read data patterns. The problem is similar to analysis of scanning optical microscopes. Hopkins and Braat have analyzed write-once-read-many (WORM) optical data storage devices. Herein, transfer function analysis of magnetooptic (MO) data storage devices is discussed with respect to improving transfer-function characteristics. Several authors have described improving the transfer function as super resolution. However, none have thoroughly analyzed the MO optical system and effects of the medium. Both the optical system transfer function and effects of the medium of this development are discussed.

  19. Speckle tracking analysis: a new tool for left atrial function analysis in systemic hypertension: an overview.

    PubMed

    Cameli, Matteo; Ciccone, Marco M; Maiello, Maria; Modesti, Pietro A; Muiesan, Maria L; Scicchitano, Pietro; Novo, Salvatore; Palmiero, Pasquale; Saba, Pier S; Pedrinelli, Roberto

    2016-05-01

    Speckle tracking echocardiography (STE) is an imaging technique applied to the analysis of left atrial function. STE provides a non-Doppler, angle-independent and objective quantification of left atrial myocardial deformation. Data regarding feasibility, accuracy and clinical applications of left atrial strain are rapidly gathering. This review describes the fundamental concepts of left atrial STE, illustrates its pathophysiological background and discusses its emerging role in systemic arterial hypertension.

  20. Multivariate Analysis of Schools and Educational Policy.

    ERIC Educational Resources Information Center

    Kiesling, Herbert J.

    This report describes a multivariate analysis technique that approaches the problems of educational production function analysis by (1) using comparable measures of output across large experiments, (2) accounting systematically for differences in socioeconomic background, and (3) treating the school as a complete system in which different…

  1. Biosensors for Cell Analysis.

    PubMed

    Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander

    2015-01-01

    Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.

  2. Methodological guidelines for developing accident modification functions.

    PubMed

    Elvik, Rune

    2015-07-01

    This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines. An example is given of how to use the guidelines. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be strived for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. The Effect of Functional Flow Diagrams on Apprentice Aircraft Mechanics' Technical System Understanding.

    ERIC Educational Resources Information Center

    Johnson, Scott D.; Satchwell, Richard E.

    1993-01-01

    Describes an experimental study that tested the impact of a conceptual illustration on college students' understanding of the structure, function, and behavior of complex technical systems. The use of functional flow diagrams in aircraft mechanics' training is explained, a concept map analysis is discussed, and implications for technical training…

  4. A phylogenetic analysis of normal modes evolution in enzymes and its relationship to enzyme function

    PubMed Central

    Lai, Jason; Jin, Jing; Kubelka, Jan; Liberles, David A.

    2012-01-01

    Since the dynamic nature of protein structures is essential for enzymatic function, it is expected that the functional evolution can be inferred from the changes in the protein dynamics. However, dynamics can also diverge neutrally with sequence substitution between enzymes without changes of function. In this study, a phylogenetic approach is implemented to explore the relationship between enzyme dynamics and function through evolutionary history. Protein dynamics are described by normal mode analysis based on a simplified harmonic potential force field applied to the reduced Cα representation of the protein structure while enzymatic function is described by Enzyme Commission (EC) numbers. Similarity of the binding pocket dynamics at each branch of the protein family’s phylogeny was analyzed in two ways: 1) explicitly by quantifying the normal mode overlap calculated for the reconstructed ancestral proteins at each end and 2) implicitly using a diffusion model to obtain the reconstructed lineage-specific changes in the normal modes. Both explicit and implicit ancestral reconstruction identified generally faster rates of change in dynamics compared with the expected change from neutral evolution at the branches of potential functional divergences for the alpha-amylase, D-isomer specific 2-hydroxyacid dehydrogenase, and copper-containing amine oxidase protein families. Normal modes analysis added additional information over just comparing the RMSD of static structures. However, the branch-specific changes were not statistically significant compared to background function-independent neutral rates of change of dynamic properties and blind application of the analysis would not enable prediction of changes in enzyme specificity. PMID:22651983
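
    A rough sketch of the two ingredients named above, normal modes from a simplified harmonic (elastic-network) potential on reduced C-alpha coordinates and the overlap between modes of two structures; the structures here are random toy point sets, and the cutoff and spring constant are arbitrary assumptions.

      import numpy as np

      def anm_modes(coords, cutoff=15.0, gamma=1.0):
          """Anisotropic-network-model Hessian for an N x 3 coordinate array; returns eigenpairs."""
          n = len(coords)
          H = np.zeros((3 * n, 3 * n))
          for i in range(n):
              for j in range(i + 1, n):
                  d = coords[j] - coords[i]
                  r2 = d @ d
                  if r2 > cutoff ** 2:
                      continue
                  block = -gamma * np.outer(d, d) / r2
                  H[3*i:3*i+3, 3*j:3*j+3] = block
                  H[3*j:3*j+3, 3*i:3*i+3] = block
                  H[3*i:3*i+3, 3*i:3*i+3] -= block
                  H[3*j:3*j+3, 3*j:3*j+3] -= block
          vals, vecs = np.linalg.eigh(H)
          return vals[6:], vecs[:, 6:]                  # discard the six rigid-body modes

      rng = np.random.default_rng(0)
      coords_a = rng.normal(scale=4.0, size=(30, 3))                     # toy "ancestral" structure
      coords_b = coords_a + rng.normal(scale=0.3, size=coords_a.shape)   # slightly perturbed structure

      _, Va = anm_modes(coords_a)
      _, Vb = anm_modes(coords_b)
      overlap = abs(Va[:, 0] @ Vb[:, 0])                # overlap of the two lowest-frequency modes
      print("mode-1 overlap:", round(float(overlap), 3))

    Overlaps near 1 indicate conserved low-frequency dynamics; tracking how such overlaps change along branches of a phylogeny is the comparison the study formalizes.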

  5. A phylogenetic analysis of normal modes evolution in enzymes and its relationship to enzyme function.

    PubMed

    Lai, Jason; Jin, Jing; Kubelka, Jan; Liberles, David A

    2012-09-21

    Since the dynamic nature of protein structures is essential for enzymatic function, it is expected that functional evolution can be inferred from the changes in protein dynamics. However, dynamics can also diverge neutrally with sequence substitution between enzymes without changes of function. In this study, a phylogenetic approach is implemented to explore the relationship between enzyme dynamics and function through evolutionary history. Protein dynamics are described by normal mode analysis based on a simplified harmonic potential force field applied to the reduced C(α) representation of the protein structure while enzymatic function is described by Enzyme Commission numbers. Similarity of the binding pocket dynamics at each branch of the protein family's phylogeny was analyzed in two ways: (1) explicitly by quantifying the normal mode overlap calculated for the reconstructed ancestral proteins at each end and (2) implicitly using a diffusion model to obtain the reconstructed lineage-specific changes in the normal modes. Both explicit and implicit ancestral reconstruction identified generally faster rates of change in dynamics compared with the expected change from neutral evolution at the branches of potential functional divergences for the α-amylase, D-isomer-specific 2-hydroxyacid dehydrogenase, and copper-containing amine oxidase protein families. Normal mode analysis added additional information over just comparing the RMSD of static structures. However, the branch-specific changes were not statistically significant compared to background function-independent neutral rates of change of dynamic properties and blind application of the analysis would not enable prediction of changes in enzyme specificity. Copyright © 2012 Elsevier Ltd. All rights reserved.
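
    For readers unfamiliar with the quantities involved, a common way to formalize such a comparison (a generic elastic-network formulation, not necessarily the exact force field used by the authors) is:

```latex
% Simplified harmonic (elastic network) potential on the C-alpha coordinates,
% and the overlap used to compare normal modes of two structures:
V = \frac{\gamma}{2} \sum_{\substack{i<j \\ r_{ij}^{0} < r_c}} \left( r_{ij} - r_{ij}^{0} \right)^{2},
\qquad
O_{kl} = \frac{\left| \mathbf{u}_k \cdot \mathbf{v}_l \right|}
              {\lVert \mathbf{u}_k \rVert \, \lVert \mathbf{v}_l \rVert } .
```

    Here gamma is a uniform spring constant, r_c a distance cutoff, and u_k, v_l are eigenvectors (normal modes) of the Hessians of the two structures being compared; an overlap close to 1 indicates similar mode shapes.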

  6. [Hazard function and life table: an introduction to the failure time analysis].

    PubMed

    Matsushita, K; Inaba, H

    1987-04-01

    Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
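
    For reference, the relationships sketched in this abstract can be written compactly in standard notation (the paper's own symbols may differ):

```latex
% Hazard rate (force of mortality), its relation to the survival function,
% and the exponential and proportional-hazards special cases mentioned above:
h(t) = \frac{f(t)}{S(t)} = -\frac{d}{dt}\,\ln S(t),
\qquad
S(t) = \exp\!\left(-\int_{0}^{t} h(u)\,du\right).
% Exponential failure times: constant hazard
h(t) = \lambda .
% Proportional hazards with covariate vector x:
h(t \mid x) = h_{0}(t)\, e^{\beta' x}.
```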

  7. Instructional influences on analogue functional analysis outcomes.

    PubMed Central

    Northup, John; Kodak, Tiffany; Grow, Laura; Lee, Jennifer; Coyne, Amanda

    2004-01-01

    Analogue assessments were conducted with a common contingency (escape from tasks) that varied only by three different instructions describing the contingency. In one condition, the contingency was described as "taking a break," in another condition it was described as "time-out," and no description of the contingency was provided in a third condition. The participant was a typically developing 5-year-old child with a diagnosis of attention deficit hyperactivity disorder. Rates of inappropriate behavior varied substantially across the three conditions as an apparent effect of the prior instructions. Some implications for conducting functional analyses with verbal children are discussed. PMID:15669409

  8. An introduction to autonomous control systems

    NASA Technical Reports Server (NTRS)

    Antsaklis, Panos J.; Passino, Kevin M.; Wang, S. J.

    1991-01-01

    The functions, characteristics, and benefits of autonomous control are outlined. An autonomous control functional architecture for future space vehicles that incorporates the concepts and characteristics described is presented. The controller is hierarchical, with an execution level (the lowest level), coordination level (middle level), and management and organization level (highest level). The general characteristics of the overall architecture, including those of the three levels, are explained, and an example to illustrate their functions is given. Mathematical models for autonomous systems, including 'logical' discrete event system models, are discussed. An approach to the quantitative, systematic modeling, analysis, and design of autonomous controllers is also discussed. It is a hybrid approach since it uses conventional analysis techniques based on difference and differential equations and new techniques for the analysis of the systems described with a symbolic formalism such as finite automata. Some recent results from the areas of planning and expert systems, machine learning, artificial neural networks, and restructurable controls are briefly outlined.

  9. Analysis of Real Ship Rolling Dynamics under Wave Excitement Force Composed of Sums of Cosine Functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y. S.; Cai, F.; Xu, W. M.

    2011-09-28

    The ship motion equation with a cosine wave excitement force describes ship rolling in regular waves. A new kind of wave excitement force model, in the form of sums of cosine functions, was proposed to describe ship rolling in irregular waves. Ship rolling time series were obtained by solving the ship motion equation with the fourth-order Runge-Kutta method. These rolling time series were analyzed with methods of phase-space trajectory, power spectrum, principal component analysis, and the largest Lyapunov exponent. Simulation results show that ship rolling presents chaotic characteristics when the wave excitement force is applied as sums of cosine functions. The result helps explain the mechanism of chaotic ship rolling and is useful for ship hydrodynamics studies.
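
    A minimal numerical sketch of this idea (illustrative coefficients only, not the authors' model) is a single-degree-of-freedom rolling equation with nonlinear damping and restoring terms, excited by a sum of cosines and integrated with a fourth-order Runge-Kutta scheme:

```python
# Illustrative sketch (not the paper's exact model): nonlinear ship rolling
#   phi'' + 2*mu*phi' + b*phi'*|phi'| + w0^2*phi + alpha*phi^3 = F(t),
# with the wave excitation F(t) given as a sum of cosine components and the
# equation integrated by a hand-written 4th-order Runge-Kutta scheme.
# All parameter values are invented.
import numpy as np

mu, b, w0, alpha = 0.05, 0.10, 1.0, -0.30        # damping / restoring terms
amps   = np.array([0.20, 0.15, 0.10])            # cosine component amplitudes
freqs  = np.array([0.80, 1.00, 1.20])            # component frequencies (rad/s)
phases = np.array([0.00, 1.30, 2.10])            # component phases (rad)

def excitation(t):
    """Wave excitation force as a sum of cosine functions."""
    return np.sum(amps * np.cos(freqs * t + phases))

def deriv(t, y):
    phi, dphi = y
    ddphi = (excitation(t) - 2*mu*dphi - b*dphi*abs(dphi)
             - w0**2*phi - alpha*phi**3)
    return np.array([dphi, ddphi])

def rk4_step(t, y, h):
    k1 = deriv(t, y)
    k2 = deriv(t + h/2, y + h/2 * k1)
    k3 = deriv(t + h/2, y + h/2 * k2)
    k4 = deriv(t + h,   y + h * k3)
    return y + h/6 * (k1 + 2*k2 + 2*k3 + k4)

h, y = 0.01, np.array([0.1, 0.0])                # time step, initial roll state
rolling = []
for n in range(60000):                           # 600 s of simulated rolling
    y = rk4_step(n*h, y, h)
    rolling.append(y[0])
# 'rolling' is the time series that would feed the phase-space, spectral and
# largest-Lyapunov-exponent analyses mentioned in the abstract.
```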

  10. Silicon Drift Detector response function for PIXE spectra fitting

    NASA Astrophysics Data System (ADS)

    Calzolai, G.; Tapinassi, S.; Chiari, M.; Giannoni, M.; Nava, S.; Pazzi, G.; Lucarelli, F.

    2018-02-01

    The correct determination of the X-ray peak areas in PIXE spectra by fitting with a computer program depends crucially on accurate parameterization of the detector peak response function. In the Guelph PIXE software package, GUPIXWin, one of the most widely used PIXE spectrum analysis codes, the response of a semiconductor detector to monochromatic X-ray radiation is described by a linear combination of several analytical functions: a Gaussian profile for the X-ray line itself, and additional tail contributions (exponential tails and step functions) on the low-energy side of the X-ray line to describe incomplete charge collection effects. The literature on the spectral response of silicon X-ray detectors for PIXE applications is rather scarce; in particular, data for Silicon Drift Detectors (SDDs) and for a large range of X-ray energies are missing. Using a set of analytical functions, the SDD response functions were satisfactorily reproduced for the X-ray energy range 1-15 keV. The behaviour of the parameters involved in the SDD tailing functions with X-ray energy is described by simple polynomial functions, which permit an easy implementation in PIXE spectra fitting codes.
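
    The line-shape model described above (a GUPIX-style parameterization; the symbols here are generic) can be written for a monochromatic line of energy E0 as:

```latex
% Generic semiconductor-detector line shape: Gaussian plus low-energy
% exponential tail(s) and a step, as used in GUPIXWin-type spectrum fitting:
R(E) = A \exp\!\left[-\frac{(E-E_0)^2}{2\sigma^2}\right]
     + \sum_i T_i \exp\!\left(\frac{E-E_0}{\beta_i}\right)
       \operatorname{erfc}\!\left(\frac{E-E_0}{\sqrt{2}\,\sigma}
                                  + \frac{\sigma}{\sqrt{2}\,\beta_i}\right)
     + S \operatorname{erfc}\!\left(\frac{E-E_0}{\sqrt{2}\,\sigma}\right).
```

    The tail amplitudes T_i, tail slopes beta_i and step height S are the quantities whose smooth variation with X-ray energy can be captured by the simple polynomial functions mentioned in the abstract.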

  11. Multimission image processing and science data visualization

    NASA Technical Reports Server (NTRS)

    Green, William B.

    1993-01-01

    The Operational Science Analysis (OSA) functional area supports science instrument data display, analysis, visualization and photo processing in support of flight operations of planetary spacecraft managed by the Jet Propulsion Laboratory (JPL). This paper describes the data products generated by the OSA functional area and the current computer system used to generate these data products. The objectives of a system upgrade now in process are described. The design approach to development of the new system is reviewed, including the use of the Unix operating system and X-Window display standards to provide platform independence, portability, and modularity within the new system. The new system should provide a modular and scalable capability supporting a variety of future missions at JPL.

  12. Effect of aqueous environment in chemical reactivity of monolignols. A New Fukui Function Study.

    PubMed

    Martínez, Carmen; Sedano, Miriam; Mendoza, Jorge; Herrera, Rafael; Rutiaga, Jose G; Lopez, Pablo

    2009-09-01

    The free radical reactivity of monolignols can be explained in terms of the Fukui function and the local hard and soft acids and bases (HSAB) principle to determine the potential linkages among them for reactions involving free radicals. Our results in the gas phase and in an aqueous environment elucidate the most probable free radical resonance structures in monolignols. Their reactivity toward nucleophilic or electrophilic species was described by applying the Fukui function after a second analysis of the selected resonance structures. The methodology described herein can differentiate the inherent nature of one radical from another.
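
    For context, the Fukui functions referred to are the standard conceptual-DFT quantities; in the usual finite-difference form (generic notation, not the paper's):

```latex
% Finite-difference Fukui functions for an N-electron system:
f^{+}(\mathbf r) = \rho_{N+1}(\mathbf r) - \rho_{N}(\mathbf r)   % susceptibility to nucleophilic attack
f^{-}(\mathbf r) = \rho_{N}(\mathbf r) - \rho_{N-1}(\mathbf r)   % susceptibility to electrophilic attack
f^{0}(\mathbf r) = \tfrac{1}{2}\left[ f^{+}(\mathbf r) + f^{-}(\mathbf r) \right]  % radical attack
```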

  13. Influence in Canonical Correlation Analysis.

    ERIC Educational Resources Information Center

    Romanazzi, Mario

    1992-01-01

    The perturbation theory of the generalized eigenproblem is used to derive influence functions of each squared canonical correlation coefficient and the corresponding canonical vector pair. Three sample versions of these functions are described, and some properties are noted. Two obvious applications, multiple correlation and correspondence…

  14. Perioperative Assessment of Myocardial Deformation

    PubMed Central

    Duncan, Andra E.; Alfirevic, Andrej; Sessler, Daniel I.; Popovic, Zoran B.; Thomas, James D.

    2014-01-01

    Evaluation of left ventricular performance improves risk assessment and guides anesthetic decisions. However, the most common echocardiographic measure of myocardial function, the left ventricular ejection fraction (LVEF), has important limitations. LVEF is limited by subjective interpretation which reduces accuracy and reproducibility, and LVEF assesses global function without characterizing regional myocardial abnormalities. An alternative objective echocardiographic measure of myocardial function is thus needed. Myocardial deformation analysis, which performs quantitative assessment of global and regional myocardial function, may be useful for perioperative care of surgical patients. Myocardial deformation analysis evaluates left ventricular mechanics by quantifying strain and strain rate. Strain describes percent change in myocardial length in the longitudinal (from base to apex) and circumferential (encircling the short-axis of the ventricle) direction and change in thickness in the radial direction. Segmental strain describes regional myocardial function. Strain is a negative number when the ventricle shortens longitudinally or circumferentially and is positive with radial thickening. Reference values for normal longitudinal strain from a recent meta-analysis using transthoracic echocardiography are (mean ± SD) −19.7 ± 0.4%, while radial and circumferential strain are 47.3 ± 1.9 and −23.3 ± 0.7%, respectively. The speed of myocardial deformation is also important and is characterized by strain rate. Longitudinal systolic strain rate in healthy subjects averages −1.10 ± 0.16 sec−1. Assessment of myocardial deformation requires consideration of both strain (change in deformation), which correlates with LVEF, and strain rate (speed of deformation), which correlates with rate of rise of left ventricular pressure (dP/dt). Myocardial deformation analysis also evaluates ventricular relaxation, twist, and untwist, providing new and noninvasive methods to assess components of myocardial systolic and diastolic function. Myocardial deformation analysis is based on either Doppler or a non-Doppler technique, called speckle-tracking echocardiography. Myocardial deformation analysis provides quantitative measures of global and regional myocardial function for use in the perioperative care of the surgical patient. For example, coronary graft occlusion after coronary artery bypass grafting is detected by an acute reduction in strain in the affected coronary artery territory. In addition, assessment of left ventricular mechanics detects underlying myocardial pathology before abnormalities become apparent on conventional echocardiography. Certainly, patients with aortic regurgitation demonstrate reduced longitudinal strain before reduction in LVEF occurs, which allows detection of subclinical left ventricular dysfunction and predicts increased risk for heart failure and impaired myocardial function after surgical repair. In this review we describe the principles, techniques, and clinical application of myocardial deformation analysis. PMID:24557101
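
    The deformation quantities referred to above have conventional definitions (generic forms, not tied to any particular vendor's software):

```latex
% Lagrangian strain (percent change in myocardial segment length relative to
% the end-diastolic length L_0) and strain rate (its temporal derivative):
\varepsilon(t) = \frac{L(t) - L_{0}}{L_{0}} \times 100\%,
\qquad
\dot{\varepsilon}(t) = \frac{d\varepsilon(t)}{dt}.
```

    Longitudinal and circumferential shortening therefore yield negative strain and radial thickening positive strain, consistent with the reference values quoted in the abstract.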

  15. "This Is a Message for …": Third Graders' Use of Written Text Functions to Facilitate Interpersonal Relationships

    ERIC Educational Resources Information Center

    Jaeger, Elizabeth L.

    2016-01-01

    This article describes the ways in which a class of 7- and 8-year-old children used writing to communicate. Using Halliday's Systemic Functional Linguistics as a theoretical frame, I examine what functions these messages served, how functions varied from child to child and how the practice of message-sending evolved over time. Analysis of data…

  16. A step-by-step guide to non-linear regression analysis of experimental data using a Microsoft Excel spreadsheet.

    PubMed

    Brown, A M

    2001-06-01

    The objective of this present study was to introduce a simple, easily understood method for carrying out non-linear regression analysis based on user input functions. While it is relatively straightforward to fit data with simple functions such as linear or logarithmic functions, fitting data with more complicated non-linear functions is more difficult. Commercial specialist programmes are available that will carry out this analysis, but these programmes are expensive and are not intuitive to learn. An alternative method described here is to use the SOLVER function of the ubiquitous spreadsheet programme Microsoft Excel, which employs an iterative least squares fitting routine to produce the optimal goodness of fit between data and function. The intent of this paper is to lead the reader through an easily understood step-by-step guide to implementing this method, which can be applied to any function in the form y=f(x), and is well suited to fast, reliable analysis of data in all fields of biology.
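
    The same iterative least-squares idea can be reproduced outside a spreadsheet; the sketch below (illustrative only, not the paper's workbook) minimizes the sum of squared residuals between data and a user-defined function y = f(x) with SciPy.

```python
# Illustrative SOLVER-style curve fitting: minimize the sum of squared
# residuals between data and a user-defined function y = f(x, params).
# The example function and synthetic data are invented for demonstration.
import numpy as np
from scipy.optimize import minimize

def model(x, params):
    """User-input function; here a single exponential decay with offset."""
    a, k, c = params
    return a * np.exp(-k * x) + c

x = np.linspace(0, 10, 50)
rng = np.random.default_rng(0)
y = model(x, (2.0, 0.5, 0.2)) + rng.normal(0.0, 0.05, x.size)  # synthetic data

def sse(params):
    """Objective: sum of squares of (data - function), as SOLVER minimizes."""
    residuals = y - model(x, params)
    return np.sum(residuals**2)

fit = minimize(sse, x0=[1.0, 1.0, 0.0], method="Nelder-Mead")
print("fitted parameters (a, k, c):", fit.x)
```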

  17. Noncoding sequence classification based on wavelet transform analysis: part II

    NASA Astrophysics Data System (ADS)

    Paredes, O.; Strojnik, M.; Romo-Vázquez, R.; Vélez-Pérez, H.; Ranta, R.; Garcia-Torales, G.; Scholl, M. K.; Morales, J. A.

    2017-09-01

    DNA sequences in human genome can be divided into the coding and noncoding ones. We hypothesize that the characteristic periodicities of the noncoding sequences are related to their function. We describe the procedure to identify these characteristic periodicities using the wavelet analysis. Our results show that three groups of noncoding sequences, each one with different biological function, may be differentiated by their wavelet coefficients within specific frequency range.

  18. Describing-function analysis of a ripple regulator with slew-rate limits and time delays

    NASA Technical Reports Server (NTRS)

    Wester, Gene W.

    1990-01-01

    The effects of time delays and slew-rate limits on the steady-state operating points and performance of a free-running ripple regulator are evaluated using describing-function analysis. The describing function of an ideal comparator (no time delays or slew rate limits) has no phase shift and is independent of frequency. It is found that turn-on delay and turn-off delay have different effects on gain and phase and cannot be combined. Comparator hysteresis affects both gain and phase; likewise, time delays generally affect both gain and phase. It is found that the effective time delay around the feedback loop is one half the sum of turn-on and turn-off delays, regardless of whether the delays are caused by storage time or slew rate limits. Expressions are formulated for the switching frequency, switch duty ratio, dc output, and output ripple. For the case of no hysteresis, a simple, graphical solution for the switching frequency is possible, and the resulting switching frequency is independent of first-order variations of input or load.
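
    To make the quantities concrete, textbook forms (not the paper's exact expressions) for the comparator describing function and the effective-delay result quoted above are:

```latex
% Describing function of an ideal comparator (relay) with output levels +/- M
% driven by a sinusoid of amplitude A: pure gain, no phase, no frequency
% dependence.
N(A) = \frac{4M}{\pi A}.
% With hysteresis of half-width h the describing function acquires a phase lag:
N(A) = \frac{4M}{\pi A}\, e^{-j \arcsin(h/A)}, \qquad A \ge h.
% Effective loop delay from the turn-on and turn-off delays, as found above:
\tau_{\mathrm{eff}} = \tfrac{1}{2}\left(\tau_{\mathrm{on}} + \tau_{\mathrm{off}}\right).
```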

  19. MicroScope-an integrated resource for community expertise of gene functions and comparative analysis of microbial genomic and metabolic data.

    PubMed

    Médigue, Claudine; Calteau, Alexandra; Cruveiller, Stéphane; Gachet, Mathieu; Gautreau, Guillaume; Josso, Adrien; Lajus, Aurélie; Langlois, Jordan; Pereira, Hugo; Planel, Rémi; Roche, David; Rollin, Johan; Rouy, Zoe; Vallenet, David

    2017-09-12

    The overwhelming list of new bacterial genomes becoming available on a daily basis makes accurate genome annotation an essential step that ultimately determines the relevance of thousands of genomes stored in public databanks. The MicroScope platform (http://www.genoscope.cns.fr/agc/microscope) is an integrative resource that supports systematic and efficient revision of microbial genome annotation, data management and comparative analysis. Starting from the results of our syntactic, functional and relational annotation pipelines, MicroScope provides an integrated environment for the expert annotation and comparative analysis of prokaryotic genomes. It combines tools and graphical interfaces to analyze genomes and to perform the manual curation of gene function in a comparative genomics and metabolic context. In this article, we describe the free-of-charge MicroScope services for the annotation and analysis of microbial (meta)genomes, transcriptomic and re-sequencing data. Then, the functionalities of the platform are presented in a way that provides practical guidance and help to nonspecialists in bioinformatics. Newly integrated analysis tools (i.e. prediction of virulence and resistance genes in bacterial genomes) and a recently developed method (the pan-genome graph representation) are also described. Integrated environments such as MicroScope clearly contribute, through the user community, to help maintain accurate resources. © The Author 2017. Published by Oxford University Press.

  20. A mathematical model to describe the nonlinear elastic properties of the gastrocnemius tendon of chickens.

    PubMed

    Foutz, T L

    1991-03-01

    A phenomenological model was developed to describe the nonlinear elastic behavior of the avian gastrocnemius tendon. Quasistatic uniaxial tensile tests were used to apply a deformation and resulting load on the tendon at a deformation rate of 5 mm/min. Plots of deformation versus load indicated a nonlinear loading response. By calculating engineering stress and engineering strain, the experimental data were normalized for tendon shape. The elastic response was determined from stress-strain curves and was found to vary with engineering strain. The response to the applied engineering strain could best be described by a mathematical model that combined a linear function and a nonlinear function. Three parameters in the model were developed to represent the nonlinear elastic behavior of the tendon, thereby allowing analysis of elasticity without prior knowledge of engineering strain. This procedure reduced the amount of data needed for the statistical analysis of nonlinear elasticity.
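
    The abstract does not give the exact expression, but a three-parameter law combining a linear and a nonlinear term of the kind described might look like the following (purely illustrative, not the paper's fitted model):

```latex
% Hypothetical three-parameter stress-strain relation combining a linear and
% a nonlinear term (illustrative form only):
\sigma(\varepsilon) = a\,\varepsilon + b\left(e^{c\,\varepsilon} - 1\right),
```

    where sigma is engineering stress, epsilon engineering strain, and a, b, c are fitted constants; the exponential term captures the toe-region stiffening typical of tendon.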

  1. IQM: An Extensible and Portable Open Source Application for Image and Signal Analysis in Java

    PubMed Central

    Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut

    2015-01-01

    Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM’s image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and is aiming at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis. PMID:25612319

  2. IQM: an extensible and portable open source application for image and signal analysis in Java.

    PubMed

    Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut

    2015-01-01

    Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM's image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and is aiming at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis.

  3. An improved viscous characteristics analysis program

    NASA Technical Reports Server (NTRS)

    Jenkins, R. V.

    1978-01-01

    An improved two dimensional characteristics analysis program is presented. The program is built upon the foundation of a FORTRAN program entitled Analysis of Supersonic Combustion Flow Fields With Embedded Subsonic Regions. The major improvements are described and a listing of the new program is provided. The subroutines and their functions are given as well as the input required for the program. Several applications of the program to real problems are qualitatively described. Three runs obtained in the investigation of a real problem are presented to provide insight for the input and output of the program.

  4. User's manual for the Shuttle Electric Power System analysis computer program (SEPS), volume 2 of program documentation

    NASA Technical Reports Server (NTRS)

    Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.

    1974-01-01

    The Shuttle Electric Power System Analysis (SEPS) computer program is described; it performs detailed load analysis, including prediction of the energy demands and consumables requirements of the shuttle electric power system, along with parametric and special case studies on the shuttle electric power system. The functional flow diagram of the SEPS program is presented along with data base requirements and formats, procedure and activity definitions, and mission timeline input formats. Distribution circuit input and fixed data requirements are included. Run procedures and deck setups are described.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    MITCHELL,GERRY W.; LONGLEY,SUSAN W.; PHILBIN,JEFFREY S.

    This Safety Analysis Report (SAR) is prepared in compliance with the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports, and has been written to the format and content guide of DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Safety Analysis Reports. The Hot Cell Facility is a Hazard Category 2 nonreactor nuclear facility and is operated by Sandia National Laboratories for the Department of Energy. This SAR provides a description of the HCF and its operations and an assessment of the hazards and potential accidents which may occur in the facility. The potential consequences and likelihood of these accidents are analyzed and described. Using the process and criteria described in DOE-STD-3009-94, safety-related structures, systems and components are identified, and the important safety functions of each SSC are described. Additionally, information which describes the safety management programs at SNL is described in ancillary chapters of the SAR.

  6. Comments on Skinner's grammar

    PubMed Central

    Mabry, John H.

    1993-01-01

    The strong tradition of “school room” grammars may have had a negative influence on the reception given a functional analysis of verbal behavior, both within and without the field of behavior analysis. Some of the failings of those traditional grammars, and their largely prescriptive nature, were outlined through reference to other critics and conflicting views. Skinner's own treatment of grammatical issues was presented, emphasizing his view of a functional unit and his use of the autoclitic and intraverbal functions to describe alternatives to a formal or structural analysis. Finally, the relevance of stimulus control variables to some recurring questions about verbal behavior, and specifically grammar, was mentioned.

  7. Solid Waste Program technical baseline description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, A.B.

    1994-07-01

    The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

  8. Current Work on Telecommunications Policies and Structures.

    ERIC Educational Resources Information Center

    Ohlin, Thomas

    The studies described in this paper were undertaken to evaluate the usefulness of information-exchange techniques for promoting the quality of life within given regions of a society. The first analysis describes the investigation of projects to enhance the functions of small businesses, joint regional planning, health care, education,…

  9. 76 FR 72493 - ITS Joint Program Office Webinar on Alternative Organizational Structures for a Certificate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ... over time. (This study is an institutional analysis only, not a technical analysis, and it is not... Adam Hopps at (202) 680-0091. The ITS JPO will present results from an early analysis of organizational models. This analysis will describe the functions that need to be performed by a CME; identify key...

  10. Technical Advance: Live-imaging analysis of human dendritic cell migrating behavior under the influence of immune-stimulating reagents in an organotypic model of lung

    PubMed Central

    Nguyen Hoang, Anh Thu; Chen, Puran; Björnfot, Sofia; Högstrand, Kari; Lock, John G.; Grandien, Alf; Coles, Mark; Svensson, Mattias

    2014-01-01

    This manuscript describes technical advances allowing manipulation and quantitative analyses of human DC migratory behavior in lung epithelial tissue. DCs are hematopoietic cells essential for the maintenance of tissue homeostasis and the induction of tissue-specific immune responses. Important functions include cytokine production and migration in response to infection for the induction of proper immune responses. To design appropriate strategies to exploit human DC functional properties in lung tissue for the purpose of clinical evaluation, e.g., candidate vaccination and immunotherapy strategies, we have developed a live-imaging assay based on our previously described organotypic model of the human lung. This assay allows provocations and subsequent quantitative investigations of DC functional properties under conditions mimicking morphological and functional features of the in vivo parental tissue. We present protocols to set up and prepare tissue models for 4D (x, y, z, time) fluorescence-imaging analysis that allow spatial and temporal studies of human DCs in live epithelial tissue, followed by flow cytometry analysis of DCs retrieved from digested tissue models. This model system can be useful for elucidating incompletely defined pathways controlling DC functional responses to infection and inflammation in lung epithelial tissue, as well as the efficacy of locally administered candidate interventions. PMID:24899587

  11. Simulation analysis of a microcomputer-based, low-cost Omega navigation system

    NASA Technical Reports Server (NTRS)

    Lilley, R. W.; Salter, R. J., Jr.

    1976-01-01

    The current status of research on a proposed micro-computer-based, low-cost Omega Navigation System (ONS) is described. The design approach emphasizes minimum hardware, maximum software, and the use of a low-cost, commercially-available microcomputer. Currently under investigation is the implementation of a low-cost navigation processor and its interface with an omega sensor to complete the hardware-based ONS. Sensor processor functions are simulated to determine how many of the sensor processor functions can be handled by innovative software. An input data base of live Omega ground and flight test data was created. The Omega sensor and microcomputer interface modules used to collect the data are functionally described. Automatic synchronization to the Omega transmission pattern is described as an example of the algorithms developed using this data base.

  12. Quasielastic charged-current neutrino scattering in the scaling model with relativistic effective mass

    NASA Astrophysics Data System (ADS)

    Ruiz Simo, I.; Martinez-Consentino, V. L.; Amaro, J. E.; Ruiz Arriola, E.

    2018-06-01

    We use a recent scaling analysis of the quasielastic electron scattering data from 12C to predict the quasielastic charge-changing neutrino scattering cross sections within an uncertainty band. We use a scaling function extracted from a selection of the (e,e') cross section data, and an effective nucleon mass inspired by the relativistic mean-field model of nuclear matter. The corresponding superscaling analysis with relativistic effective mass (SuSAM*) describes a large amount of the electron data lying inside a phenomenological quasielastic band. The effective mass incorporates the enhancement of the transverse current produced by the relativistic mean field. The scaling function incorporates nuclear effects beyond the impulse approximation, in particular meson-exchange currents and short-range correlations producing tails in the scaling function. Besides its simplicity, this model describes the neutrino data about as well as other, more sophisticated nuclear models.

  13. Functional Data Analysis in NTCP Modeling: A New Method to Explore the Radiation Dose-Volume Effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benadjaoud, Mohamed Amine, E-mail: mohamedamine.benadjaoud@gustaveroussy.fr; Université Paris sud, Le Kremlin-Bicêtre; Institut Gustave Roussy, Villejuif

    2014-11-01

    Purpose/Objective(s): To describe a novel method to explore radiation dose-volume effects. Functional data analysis is used to investigate the information contained in differential dose-volume histograms. The method is applied to the normal tissue complication probability modeling of rectal bleeding (RB) for patients irradiated in the prostatic bed by 3-dimensional conformal radiation therapy. Methods and Materials: Kernel density estimation was used to estimate the individual probability density functions from each of the 141 rectum differential dose-volume histograms. Functional principal component analysis was performed on the estimated probability density functions to explore the variation modes in the dose distribution. The functional principal components were then tested for association with RB using logistic regression adapted to functional covariates (FLR). For comparison, 3 other normal tissue complication probability models were considered: the Lyman-Kutcher-Burman model, logistic model based on standard dosimetric parameters (LM), and logistic model based on multivariate principal component analysis (PCA). Results: The incidence rate of grade ≥2 RB was 14%. V65Gy was the most predictive factor for the LM (P=.058). The best fit for the Lyman-Kutcher-Burman model was obtained with n=0.12, m=0.17, and TD50=72.6 Gy. In PCA and FLR, the components that describe the interdependence between the relative volumes exposed at intermediate and high doses were the most correlated to the complication. The FLR parameter function leads to a better understanding of the volume effect by including the treatment specificity in the delivered mechanistic information. For RB grade ≥2, patients with advanced age are significantly at risk (odds ratio, 1.123; 95% confidence interval, 1.03-1.22), and the fits of the LM, PCA, and functional principal component analysis models are significantly improved by including this clinical factor. Conclusion: Functional data analysis provides an attractive method for flexibly estimating the dose-volume effect for normal tissues in external radiation therapy.
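
    A compact way to reproduce the first two steps (kernel density estimation of each differential DVH followed by a principal component decomposition of the discretized densities) is sketched below; the data, dose grid and variable names are invented and this is not the authors' code.

```python
# Illustrative sketch: functional PCA of differential dose-volume histograms.
# Each patient's dose distribution is smoothed into a density with a Gaussian
# kernel, the densities are discretized on a common dose grid, and an SVD of
# the centered curves gives the functional principal components. Synthetic data.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
dose_grid = np.linspace(0, 80, 200)   # dose axis in Gy

# Synthetic per-patient dose samples standing in for the differential DVHs
patients = [rng.normal(loc=rng.uniform(40, 70), scale=rng.uniform(5, 15), size=500)
            for _ in range(30)]

# Step 1: kernel density estimate of each patient's dose distribution
densities = np.array([gaussian_kde(p)(dose_grid) for p in patients])

# Step 2: functional PCA via SVD of the discretized, centered density curves
mean_curve = densities.mean(axis=0)
centered = densities - mean_curve
u, s, vt = np.linalg.svd(centered, full_matrices=False)
components = vt                     # rows: principal modes of variation over dose
scores = centered @ vt.T            # per-patient component scores

explained = s**2 / np.sum(s**2)
print("variance explained by first 3 components:", np.round(explained[:3], 3))
# scores[:, :k] would then enter a logistic regression against the toxicity endpoint.
```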

  14. Functional analysis and treatment of the diurnal bruxism of a 16-year-old girl with autism.

    PubMed

    Armstrong, Amy; Knapp, Vicki Madaus; McAdam, David B

    2014-01-01

    Bruxism is defined as the clenching and grinding of teeth. This study used a functional analysis to examine whether the bruxism of a 16-year-old girl with autism was maintained by automatic reinforcement or social consequences. A subsequent component analysis of the intervention package described by Barnoy, Najdowski, Tarbox, Wilke, and Nollet (2009) showed that a vocal reprimand (e.g., "stop grinding") effectively reduced the participant's bruxism. Results were maintained across time, and effects extended to novel staff members. © Society for the Experimental Analysis of Behavior.

  15. An analysis of a nonlinear instability in the implementation of a VTOL control system

    NASA Technical Reports Server (NTRS)

    Weber, J. M.

    1982-01-01

    The contributions to nonlinear behavior and unstable response of the model following yaw control system of a VTOL aircraft during hover were determined. The system was designed as a state rate feedback implicit model follower that provided yaw rate command/heading hold capability and used combined full authority parallel and limited authority series servo actuators to generate an input to the yaw reaction control system of the aircraft. Both linear and nonlinear system models, as well as describing function linearization techniques were used to determine the influence on the control system instability of input magnitude and bandwidth, series servo authority, and system bandwidth. Results of the analysis describe stability boundaries as a function of these system design characteristics.

  16. Acquisition of Picture Exchange-Based vs. Signed Mands and Implications to Teach Functional Communication Skills to Children with Autism

    ERIC Educational Resources Information Center

    Nam, Sang S.; Hwang, Young S.

    2016-01-01

    A literature review was conducted to describe important concepts involved in functional analysis of verbal behavior as well as to evaluate empirical research findings on acquisition of picture exchange-based vs. signed mands to suggest instructional implications for teachers and therapists to teach functional communication skills to children with…

  17. Impact of Missing Data on the Detection of Differential Item Functioning: The Case of Mantel-Haenszel and Logistic Regression Analysis

    ERIC Educational Resources Information Center

    Robitzsch, Alexander; Rupp, Andre A.

    2009-01-01

    This article describes the results of a simulation study to investigate the impact of missing data on the detection of differential item functioning (DIF). Specifically, it investigates how four methods for dealing with missing data (listwise deletion, zero imputation, two-way imputation, response function imputation) interact with two methods of…

  18. Functional Profiling Using the Saccharomyces Genome Deletion Project Collections.

    PubMed

    Nislow, Corey; Wong, Lai Hong; Lee, Amy Huei-Yi; Giaever, Guri

    2016-09-01

    The ability to measure and quantify the fitness of an entire organism requires considerably more complex approaches than simply using traditional "omic" methods that examine, for example, the abundance of RNA transcripts, proteins, or metabolites. The yeast deletion collections represent the only systematic, comprehensive set of null alleles for any organism in which such fitness measurements can be assayed. Generated by the Saccharomyces Genome Deletion Project, these collections allow the systematic and parallel analysis of gene functions using any measurable phenotype. The unique 20-bp molecular barcodes engineered into the genome of each deletion strain facilitate the massively parallel analysis of individual fitness. Here, we present functional genomic protocols for use with the yeast deletion collections. We describe how to maintain, propagate, and store the deletion collections and how to perform growth fitness assays on single and parallel screening platforms. Phenotypic fitness analyses of the yeast mutants, described in brief here, provide important insights into biological functions, mechanisms of drug action, and response to environmental stresses. It is important to bear in mind that the specific assays described in this protocol represent some of the many ways in which these collections can be assayed, and in this description particular attention is paid to maximizing throughput using growth as the phenotypic measure. © 2016 Cold Spring Harbor Laboratory Press.

  19. FBI fingerprint identification automation study: AIDS 3 evaluation report. Volume 7: Top down functional analysis

    NASA Technical Reports Server (NTRS)

    Mulhall, B. D. L.

    1980-01-01

    The functions are identified and described in chart form as a tree in which the basic functions, to 'Provide National Identification Service,' are shown at the top. The lower levels of the tree branch out to indicate functions and sub-functions. Symbols are used to indicate whether or not a function was automated in the AIDS 1 or 2 system or is planned to be automated in the AIDS 3 system. The tree chart is shown in detail.

  20. Functional Analysis of Metabolomics Data.

    PubMed

    Chagoyen, Mónica; López-Ibáñez, Javier; Pazos, Florencio

    2016-01-01

    Metabolomics aims at characterizing the repertory of small chemical compounds in a biological sample. As it becomes more massive and larger sets of compounds are detected, a functional analysis is required to convert these raw lists of compounds into biological knowledge. The most common way of performing such analysis is "annotation enrichment analysis," also used in transcriptomics and proteomics. This approach extracts the annotations overrepresented in the set of chemical compounds arisen in a given experiment. Here, we describe the protocols for performing such analysis as well as for visualizing a set of compounds in different representations of the metabolic networks, in both cases using free accessible web tools.
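
    The core statistic behind annotation enrichment analysis is a simple overrepresentation test; a minimal sketch using the hypergeometric distribution (all names and counts invented) is:

```python
# Illustrative annotation enrichment test: is a given annotation (e.g. a
# metabolic pathway) overrepresented among the detected compounds?
# One-sided hypergeometric test; all numbers are invented.
from scipy.stats import hypergeom

M = 2000   # compounds in the background/reference set
n = 60     # background compounds carrying the annotation
N = 150    # compounds detected in the experiment
k = 12     # detected compounds carrying the annotation

# P(X >= k): probability of observing at least k annotated compounds by chance
p_value = hypergeom.sf(k - 1, M, n, N)
print(f"enrichment p-value: {p_value:.3g}")
```

    In practice this test is repeated for every annotation term and the resulting p-values are corrected for multiple testing.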

  1. Patients suffering from rheumatic disease describing own experiences from participating in Basic Body Awareness Group Therapy: A qualitative pilot study.

    PubMed

    Olsen, Aarid Liland; Skjaerven, Liv Helvik

    2016-01-01

    Rheumatic diseases have physical and psychological impact on patients' movement and function. Basic Body Awareness Therapy focuses on promoting more functional movement quality in daily life. The purpose of this study was to describe patient experiences from participating in Basic Body Awareness Group Therapy for inpatients with rheumatic disease. A phenomenological design included data collection in two focus group interviews with seven patients. Giorgi's four-step phenomenological method was used for data analysis. Four main themes emerged: (1) "Movement exploration-being guided in movement" described informants' exploration of bodily signals and movement habits; (2) "Movement awareness training in a relational perspective" informants described experiences from being in a group setting; (3) "Movement awareness-integration and insight" described informants' reflections on movement experiences; and (4) "Integrating and practicing new movement habits" informants described how they used their movement experiences in daily life. The study described perspectives in movement learning experienced by patients. The results support the view that contact with the body can help patients exploring and cultivating their own resources for a more functional movement quality. Descriptions of relational movement learning aspects can contribute to our understanding of physiotherapy group design.

  2. Using PATIMDB to Create Bacterial Transposon Insertion Mutant Libraries

    PubMed Central

    Urbach, Jonathan M.; Wei, Tao; Liberati, Nicole; Grenfell-Lee, Daniel; Villanueva, Jacinto; Wu, Gang; Ausubel, Frederick M.

    2015-01-01

    PATIMDB is a software package for facilitating the generation of transposon mutant insertion libraries. The software has two main functions: process tracking and automated sequence analysis. The process tracking function specifically includes recording the status and fates of multiwell plates and samples in various stages of library construction. Automated sequence analysis refers specifically to the pipeline of sequence analysis starting with ABI files from a sequencing facility and ending with insertion location identifications. The protocols in this unit describe installation and use of PATIMDB software. PMID:19343706

  3. HSI top-down requirements analysis for ship manpower reduction

    NASA Astrophysics Data System (ADS)

    Malone, Thomas B.; Bost, J. R.

    2000-11-01

    U.S. Navy ship acquisition programs such as DD 21 and CVNX are increasingly relying on top down requirements analysis (TDRA) to define and assess design approaches for workload and manpower reduction, and for ensuring required levels of human performance, reliability, safety, and quality of life at sea. The human systems integration (HSI) approach to TDRA begins with a function analysis which identifies the functions derived from the requirements in the Operational Requirements Document (ORD). The function analysis serves as the function baseline for the ship, and also supports the definition of RDT&E and Total Ownership Cost requirements. A mission analysis is then conducted to identify mission scenarios, again based on requirements in the ORD, and the Design Reference Mission (DRM). This is followed by a mission/function analysis which establishes the function requirements to successfully perform the ship's missions. Function requirements of major importance for HSI are information, performance, decision, and support requirements associated with each function. An allocation of functions defines the roles of humans and automation in performing the functions associated with a mission. Alternate design concepts, based on function allocation strategies, are then described, and task networks associated with the concepts are developed. Task network simulations are conducted to assess workloads and human performance capabilities associated with alternate concepts. An assessment of the affordability and risk associated with alternate concepts is performed, and manning estimates are developed for feasible design concepts.

  4. Interoperability-oriented Integration of Failure Knowledge into Functional Knowledge and Knowledge Transformation based on Concepts Mapping

    NASA Astrophysics Data System (ADS)

    Koji, Yusuke; Kitamura, Yoshinobu; Kato, Yoshikiyo; Tsutsui, Yoshio; Mizoguchi, Riichiro

    In conceptual design, it is important to develop functional structures that reflect the rich experience embodied in knowledge from previous design failures. In particular, if a designer learns possible abnormal behaviors from a previous design failure, he or she can add a function that prevents such abnormal behaviors and faults. To do this, it is crucial to share knowledge about possible faulty phenomena and how to cope with them. In fact, part of such knowledge is described in FMEA (Failure Mode and Effect Analysis) sheets, in function structure models for systematic design, and in fault trees for FTA (Fault Tree Analysis).

  5. Game Building with Complex-Valued Functions

    ERIC Educational Resources Information Center

    Dittman, Marki; Soto-Johnson, Hortensia; Dickinson, Scott; Harr, Tim

    2017-01-01

    In this paper, we describe how we integrated complex analysis into the second semester of a geometry course designed for preservice secondary mathematics teachers. As part of this inquiry-based course, the preservice teachers incorporated their geometric understanding of the arithmetic of complex numbers and complex-valued functions to create a…

  6. Researching Learner Self-Efficacy and Online Participation through Speech Functions: An Exploratory Study

    ERIC Educational Resources Information Center

    Sánchez-Castro, Olga; Strambi, Antonella

    2017-01-01

    This study explores the potential contribution of Eggins and Slade's (2004) Speech Functions as tools for describing learners' participation patterns in Synchronous Computer-Mediated Communication (SCMC). Our analysis focuses on the relationship between learners' self-efficacy (i.e. personal judgments of second language performance capabilities)…

  7. A Functional Analysis of Teachers' Instructions

    ERIC Educational Resources Information Center

    Todd, Richard Watson; Chaiyasuk, Intisarn; Tantisawetrat, Nuantip

    2008-01-01

    Instructions are an under-researched aspect of classroom discourse. In this paper, we attempt to describe the functional structure of teacher instructions using the framework proposed by Sinclair and Coulthard (1975). We examine nine directing transactions or sets of instructions from four lessons taught on an English for Academic Purposes course…

  8. Simulated Cardiopulmonary Arrests in a Hospital Setting.

    ERIC Educational Resources Information Center

    Mishkin, Barbara H.; And Others

    1982-01-01

    Describes a simulated interdisciplinary role rehearsal for cardiopulmonary arrest to prepare nurses to function effectively. Includes needs analysis, program components, and responses of program participants. (Author)

  9. Methods utilized in evaluating the profitability of commercial space processing

    NASA Technical Reports Server (NTRS)

    Bloom, H. L.; Schmitt, P. T.

    1976-01-01

    Profitability analysis is applied to commercial space processing on the basis of business concept definition and assessment and the relationship between ground and space functions. Throughput analysis is demonstrated by analysis of the space manufacturing of surface acoustic wave devices. The paper describes a financial analysis model for space processing and provides key profitability measures for space processed isoenzymes.

  10. SWMPr: An R Package for Retrieving, Organizing, and ...

    EPA Pesticide Factsheets

    The System-Wide Monitoring Program (SWMP) was implemented in 1995 by the US National Estuarine Research Reserve System. This program has provided two decades of continuous monitoring data at over 140 fixed stations in 28 estuaries. However, the increasing quantity of data provided by the monitoring network has complicated broad-scale comparisons between systems and, in some cases, prevented simple trend analysis of water quality parameters at individual sites. This article describes the SWMPr package, which provides several functions that facilitate data retrieval, organization, and analysis of time series data in the reserve estuaries. Previously unavailable functions for estuaries are also provided to estimate rates of ecosystem metabolism using the open-water method. The SWMPr package has facilitated a cross-reserve comparison of water quality trends and links quantitative information with analysis tools that have use for more generic applications to environmental time series. The manuscript describes a software package that was recently developed to retrieve, organize, and analyze monitoring data from the National Estuarine Research Reserve System. Functions are explained in detail, including recent applications for trend analysis of ecosystem metabolism.

  11. Cochlear Processes: A Research Report.

    ERIC Educational Resources Information Center

    Zwislocki, Jozef J.

    This paper summarizes recent research on functions of the cochlea of the inner ear. The cochlea is described as the seat of the first step in the auditory sound analysis and transduction of mechanical vibration into electrochemical processes leading to the generation of neural action potentials. The cochlea is also described as a frequent seat of…

  12. Use of stable isotope analysis in determining aquatic food webs

    EPA Science Inventory

    Stable isotope analysis is a useful tool for describing resource-consumer dynamics in ecosystems. In general, organisms of a given trophic level or functional feeding group will have a stable isotope ratio identifiable different than their prey because of preferential use of one ...

  13. Design oriented structural analysis

    NASA Technical Reports Server (NTRS)

    Giles, Gary L.

    1994-01-01

    Desirable characteristics and benefits of design oriented analysis methods are described and illustrated by presenting a synoptic description of the development and uses of the Equivalent Laminated Plate Solution (ELAPS) computer code. ELAPS is a design oriented structural analysis method which is intended for use in the early design of aircraft wing structures. Model preparation is minimized by using a few large plate segments to model the wing box structure. Computational efficiency is achieved by using a limited number of global displacement functions that encompass all segments over the wing planform. Coupling with other codes is facilitated since the output quantities such as deflections and stresses are calculated as continuous functions over the plate segments. Various aspects of the ELAPS development are discussed including the analytical formulation, verification of results by comparison with finite element analysis results, coupling with other codes, and calculation of sensitivity derivatives. The effectiveness of ELAPS for multidisciplinary design application is illustrated by describing its use in design studies of high speed civil transport wing structures.

  14. Analyzing coastal environments by means of functional data analysis

    NASA Astrophysics Data System (ADS)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful to describe variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each of the density functions by its mean, sorting, skewness, and kurtosis. The second applied a centered log-ratio (clr) transformation to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, as a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
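
    The vector approach used for comparison (a centered log-ratio transform of the particle-size distributions followed by PCA and cluster analysis) can be sketched as follows; the data are synthetic and the code is only a schematic of that workflow, not the authors' implementation.

```python
# Schematic of the clr + PCA + cluster analysis workflow applied to
# particle-size distributions (PSDs). Synthetic compositional data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(7)
psd = rng.dirichlet(alpha=np.linspace(1, 5, 20), size=40)   # 40 samples x 20 size bins

# Centered log-ratio (clr) transform of each composition
geo_mean = np.exp(np.mean(np.log(psd), axis=1, keepdims=True))
clr = np.log(psd / geo_mean)

# PCA of the clr-transformed data via SVD
centered = clr - clr.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt.T[:, :3]                             # retain 3 components

# Cluster analysis on the retained principal component scores
z = linkage(scores, method="ward")
labels = fcluster(z, t=3, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```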

  15. Integrating computer programs for engineering analysis and design

    NASA Technical Reports Server (NTRS)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  16. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
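
    For orientation, a generic deconvolution of a tissue concentration curve with a known arterial input function by truncated singular value decomposition, a standard approach in perfusion analysis (and not the specific implementation in ANTONIA), looks like this:

```python
# Generic sketch of AIF deconvolution for perfusion imaging by truncated SVD.
# Not ANTONIA-specific; all curves are synthetic and noise-free.
import numpy as np

dt = 1.0                                  # sampling interval (s)
t = np.arange(0, 60, dt)

# Synthetic arterial input function (gamma-variate-like) and tissue curve,
# built as the convolution of the AIF with a flow-scaled residue function R(t)
aif = t**3 * np.exp(-t / 1.5)
flow_residue_true = 0.6 * np.exp(-t / 4.0)            # F * R(t)
tissue = dt * np.convolve(aif, flow_residue_true)[:t.size]

# Lower-triangular convolution matrix built from the AIF
A = dt * np.array([[aif[i - j] if i >= j else 0.0
                    for j in range(t.size)] for i in range(t.size)])

# Truncated SVD pseudo-inverse regularizes the ill-posed deconvolution;
# the truncation threshold trades off oscillations against bias.
u, s, vt = np.linalg.svd(A)
s_inv = np.where(s > 0.01 * s.max(), 1.0 / s, 0.0)
flow_residue_est = vt.T @ np.diag(s_inv) @ u.T @ tissue

# Perfusion (CBF) is proportional to the maximum of the recovered F * R(t)
print(f"recovered peak: {flow_residue_est.max():.2f} (ground truth 0.6)")
```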

  17. Computer program for supersonic Kernel-function flutter analysis of thin lifting surfaces

    NASA Technical Reports Server (NTRS)

    Cunningham, H. J.

    1974-01-01

    This report describes a computer program (program D2180) that has been prepared to implement the analysis described in (N71-10866) for calculating the aerodynamic forces on a class of harmonically oscillating planar lifting surfaces in supersonic potential flow. The planforms treated are the delta and modified-delta (arrowhead) planforms with subsonic leading and supersonic trailing edges, and (essentially) pointed tips. The resulting aerodynamic forces are applied in a Galerkin modal flutter analysis. The required input data are the flow and planform parameters including deflection-mode data, modal frequencies, and generalized masses.

  18. Fungal proteomics: from identification to function.

    PubMed

    Doyle, Sean

    2011-08-01

    Some fungi cause disease in humans and plants, while others have demonstrable potential for the control of insect pests. In addition, fungi are also a rich reservoir of therapeutic metabolites and industrially useful enzymes. Detailed analysis of fungal biochemistry is now enabled by multiple technologies including protein mass spectrometry, genome and transcriptome sequencing and advances in bioinformatics. Yet, the assignment of function to fungal proteins, encoded either by in silico annotated, or unannotated genes, remains problematic. The purpose of this review is to describe the strategies used by many researchers to reveal protein function in fungi, and more importantly, to consolidate the nomenclature of 'unknown function protein' as opposed to 'hypothetical protein' - once any protein has been identified by protein mass spectrometry. A combination of approaches including comparative proteomics, pathogen-induced protein expression and immunoproteomics are outlined, which, when used in combination with a variety of other techniques (e.g. functional genomics, microarray analysis, immunochemical and infection model systems), appear to yield comprehensive and definitive information on protein function in fungi. The relative advantages of proteomic, as opposed to transcriptomic-only, analyses are also described. In the future, combined high-throughput, quantitative proteomics, allied to transcriptomic sequencing, are set to reveal much about protein function in fungi. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  19. Protein arginine methylation: Cellular functions and methods of analysis.

    PubMed

    Pahlich, Steffen; Zakaryan, Rouzanna P; Gehring, Heinz

    2006-12-01

    During the last few years, new members of the growing family of protein arginine methyltransferases (PRMTs) have been identified and the role of arginine methylation in manifold cellular processes like signaling, RNA processing, transcription, and subcellular transport has been extensively investigated. In this review, we describe recent methods and findings that have yielded new insights into the cellular functions of arginine-methylated proteins, and we evaluate the currently used procedures for the detection and analysis of arginine methylation.

  20. Scalable analysis of nonlinear systems using convex optimization

    NASA Astrophysics Data System (ADS)

    Papachristodoulou, Antonis

    In this thesis, we investigate how convex optimization can be used to analyze different classes of nonlinear systems at various scales algorithmically. The methodology is based on the construction of appropriate Lyapunov-type certificates using sum of squares techniques. After a brief introduction on the mathematical tools that we will be using, we turn our attention to robust stability and performance analysis of systems described by Ordinary Differential Equations. A general framework for constrained systems analysis is developed, under which stability of systems with polynomial or non-polynomial vector fields and switching systems, as well as estimating the region of attraction and the L2 gain, can be treated in a unified manner. We apply our results to examples from biology and aerospace. We then consider systems described by Functional Differential Equations (FDEs), i.e., time-delay systems. Their main characteristic is that they are infinite dimensional, which complicates their analysis. We first show how the complete Lyapunov-Krasovskii functional can be constructed algorithmically for linear time-delay systems. Then, we concentrate on delay-independent and delay-dependent stability analysis of nonlinear FDEs using sum of squares techniques. An example from ecology is given. The scalable stability analysis of congestion control algorithms for the Internet is investigated next. The models we use result in an arbitrary interconnection of FDE subsystems, for which we require that stability holds for arbitrary delays, network topologies and link capacities. Through a constructive proof, we develop a Lyapunov functional for FAST---a recently developed network congestion control scheme---so that the Lyapunov stability properties scale with the system size. We also show how other network congestion control schemes can be analyzed in the same way. Finally, we concentrate on systems described by Partial Differential Equations. We show that axially constant perturbations of the Navier-Stokes equations for Hagen-Poiseuille flow are globally stable, even though the background noise is amplified as R³, where R is the Reynolds number, giving a 'robust yet fragile' interpretation. We also propose a sum of squares methodology for the analysis of systems described by parabolic PDEs. We conclude this work with an account of future research.

  1. A first approach to the distortion analysis of nonlinear analog circuits utilizing X-parameters

    NASA Astrophysics Data System (ADS)

    Weber, H.; Widemann, C.; Mathis, W.

    2013-07-01

    In this contribution a first approach to the distortion analysis of nonlinear 2-port networks with X-parameters1 is presented. The X-parameters introduced by Verspecht and Root (2006) offer the possibility to describe nonlinear microwave 2-port networks under large-signal conditions. On the basis of X-parameter measurements with a nonlinear network analyzer (NVNA), behavioral models can be extracted for the networks. These models can be used to account for the nonlinear behavior during the design process of microwave circuits. The idea of the present work is to extract behavioral models in order to describe the influence of interfering signals on the output behavior of the nonlinear circuits. Here, a simulator is used instead of an NVNA to extract the X-parameters. Assuming that the interfering signals are relatively small compared to the nominal input signal, the output signal can be described as a superposition of the effects of each input signal. In order to determine the functional correlation between the scattering variables, a polynomial dependency is assumed. The required datasets for the approximation of the describing functions are simulated by a directional coupler model in Cadence Design Framework. The polynomial coefficients are obtained by a least-squares method. The resulting describing functions can be used to predict the system's behavior under certain conditions as well as the effects of the interfering signal on the output signal. 1 X-parameter is a registered trademark of Agilent Technologies, Inc.
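
    The final approximation step, polynomial coefficients obtained by least squares over simulated scattering-variable data, can be sketched as follows. This is a hedged illustration with synthetic data, not the authors' Cadence/X-parameter workflow; `a_nom`, `a_int` and `b_out` are placeholder names for the nominal input, interferer, and output wave amplitudes.

```python
import numpy as np

# Placeholder "simulated" data: output wave amplitude b_out as a function of the
# nominal input amplitude a_nom and a small interfering input amplitude a_int.
rng = np.random.default_rng(1)
a_nom = rng.uniform(0.5, 1.0, 300)
a_int = rng.uniform(0.0, 0.1, 300)
b_out = (0.8 * a_nom - 0.1 * a_nom**3 + 0.3 * a_int + 0.02 * a_nom * a_int
         + 0.005 * rng.standard_normal(300))          # toy nonlinear 2-port response

# Assume a low-order polynomial dependency and solve for its coefficients
# by ordinary least squares (the describing-function approximation step).
design = np.column_stack([a_nom, a_nom**3, a_int, a_nom * a_int])
coeffs, residuals, rank, _ = np.linalg.lstsq(design, b_out, rcond=None)
print("estimated polynomial coefficients:", coeffs.round(3))
# The fitted polynomial can then predict the output for other small interferers.
```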

  2. Pedotransfer functions: bridging the gap between available basic soil data and missing soil hydraulic characteristics

    NASA Astrophysics Data System (ADS)

    Wösten, J. H. M.; Pachepsky, Ya. A.; Rawls, W. J.

    2001-10-01

    Water retention and hydraulic conductivity are crucial input parameters in any modelling study on water flow and solute transport in soils. Due to inherent temporal and spatial variability in these hydraulic characteristics, large numbers of samples are required to properly characterise areas of land. Hydraulic characteristics can be obtained from direct laboratory and field measurements. However, these measurements are time consuming, which makes it costly to characterise an area of land. As an alternative, analysis of existing databases of measured soil hydraulic data may result in pedotransfer functions. In practice, these functions often prove to be good predictors for missing soil hydraulic characteristics. Examples are presented of different equations describing hydraulic characteristics and of pedotransfer functions used to predict parameters in these equations. Grouping of data prior to pedotransfer function development is discussed, as well as the use of different soil properties as predictors. In addition to regression analysis, new techniques such as artificial neural networks, group methods of data handling, and classification and regression trees are increasingly being used for pedotransfer function development. Actual development of pedotransfer functions is demonstrated by describing a practical case study. Examples are also presented of pedotransfer functions for predicting characteristics other than hydraulic ones. Accuracy and reliability of pedotransfer functions are demonstrated and discussed. In this respect, functional evaluation of pedotransfer functions proves to be a good tool to assess the desired accuracy of a pedotransfer function for a specific application.
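
    A pedotransfer function of the regression type can be sketched in a few lines. This is a generic illustration with made-up soil data, not one of the published functions: it regresses a water-retention value on basic soil properties (clay, silt, organic matter, bulk density) by ordinary least squares.

```python
import numpy as np

# Placeholder basic soil data (percent clay, percent silt, percent organic matter,
# bulk density) and a "measured" hydraulic target, e.g. water content at -33 kPa.
rng = np.random.default_rng(2)
n = 120
clay = rng.uniform(5, 50, n)
silt = rng.uniform(10, 60, n)
om = rng.uniform(0.5, 5.0, n)
bd = rng.uniform(1.1, 1.7, n)
theta_33 = (0.05 + 0.004 * clay + 0.002 * silt + 0.01 * om - 0.05 * bd
            + 0.01 * rng.standard_normal(n))          # synthetic measurements

# Fit a linear pedotransfer function by least squares.
X = np.column_stack([np.ones(n), clay, silt, om, bd])
beta, *_ = np.linalg.lstsq(X, theta_33, rcond=None)
predicted = X @ beta
rmse = np.sqrt(np.mean((predicted - theta_33) ** 2))
print("coefficients:", beta.round(4), " RMSE:", round(rmse, 4))
```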

  3. Study of space shuttle orbiter system management computer function. Volume 1: Analysis, baseline design

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system analysis of the shuttle orbiter baseline system management (SM) computer function is performed. This analysis results in an alternative SM design which is also described. The alternative design exhibits several improvements over the baseline, some of which are increased crew usability, improved flexibility, and improved growth potential. The analysis consists of two parts: an application assessment and an implementation assessment. The former is concerned with the SM user needs and design functional aspects. The latter is concerned with design flexibility, reliability, growth potential, and technical risk. The system analysis is supported by several topical investigations. These include: treatment of false alarms, treatment of off-line items, significant interface parameters, and a design evaluation checklist. An in-depth formulation of techniques, concepts, and guidelines for design of automated performance verification is discussed.

  4. Log Normal Distribution of Cellular Uptake of Radioactivity: Statistical Analysis of Alpha Particle Track Autoradiography

    PubMed Central

    Neti, Prasad V.S.V.; Howell, Roger W.

    2008-01-01

    Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log normal distribution function (J Nucl Med 47, 6 (2006) 1049-1058) with the aid of an autoradiographic approach. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports a detailed statistical analysis of these data. Methods: The measured distributions of alpha particle tracks per cell were subjected to statistical tests with Poisson (P), log normal (LN), and Poisson – log normal (P – LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL 210Po-citrate. When cells were exposed to 67 kBq/mL, the P – LN distribution function gave a better fit; however, the underlying activity distribution remained log normal. Conclusions: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:16741316
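
    The kind of model comparison described here can be mimicked with standard distribution fits. The sketch below uses synthetic track-count data and scipy.stats; it is not the authors' autoradiography analysis, only an illustration of fitting log-normal and Poisson models to counts per cell, with all parameter values invented.

```python
import numpy as np
from scipy import stats

# Synthetic alpha-particle track counts per cell: log-normally distributed
# per-cell activity observed through a Poisson counting process (placeholder values).
rng = np.random.default_rng(3)
activity = rng.lognormal(mean=1.5, sigma=0.6, size=2000)
tracks = rng.poisson(activity)

# Fit a log-normal and a Poisson model to the nonzero counts.
counts = tracks[tracks > 0].astype(float)
shape, loc, scale = stats.lognorm.fit(counts, floc=0)
lam = counts.mean()

# Rough model comparison via log-likelihoods (illustration only; a formal
# analysis would use goodness-of-fit tests as in the paper).
ll_lognorm = stats.lognorm.logpdf(counts, shape, loc, scale).sum()
ll_poisson = stats.poisson.logpmf(counts.astype(int), lam).sum()
print("log-normal logL:", round(ll_lognorm, 1), " Poisson logL:", round(ll_poisson, 1))
```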

  5. Dimensional analysis yields the general second-order differential equation underlying many natural phenomena: the mathematical properties of a phenomenon's data plot then specify a unique differential equation for it.

    PubMed

    Kepner, Gordon R

    2014-08-27

    This study uses dimensional analysis to derive the general second-order differential equation that underlies numerous physical and natural phenomena described by common mathematical functions. It eschews assumptions about empirical constants and mechanisms. It relies only on the data plot's mathematical properties to provide the conditions and constraints needed to specify a second-order differential equation that is free of empirical constants for each phenomenon. A practical example of each function is analyzed using the general form of the underlying differential equation and the observable unique mathematical properties of each data plot, including boundary conditions. This yields a differential equation that describes the relationship among the physical variables governing the phenomenon's behavior. Complex phenomena such as the Standard Normal Distribution, the Logistic Growth Function, and Hill Ligand binding, which are characterized by data plots of distinctly different sigmoidal character, are readily analyzed by this approach. It provides an alternative, simple, unifying basis for analyzing each of these varied phenomena from a common perspective that ties them together and offers new insights into the appropriate empirical constants for describing each phenomenon.

  6. Technical advance: live-imaging analysis of human dendritic cell migrating behavior under the influence of immune-stimulating reagents in an organotypic model of lung.

    PubMed

    Nguyen Hoang, Anh Thu; Chen, Puran; Björnfot, Sofia; Högstrand, Kari; Lock, John G; Grandien, Alf; Coles, Mark; Svensson, Mattias

    2014-09-01

    This manuscript describes technical advances allowing manipulation and quantitative analyses of human DC migratory behavior in lung epithelial tissue. DCs are hematopoietic cells essential for the maintenance of tissue homeostasis and the induction of tissue-specific immune responses. Important functions include cytokine production and migration in response to infection for the induction of proper immune responses. To design appropriate strategies to exploit human DC functional properties in lung tissue for the purpose of clinical evaluation, e.g., candidate vaccination and immunotherapy strategies, we have developed a live-imaging assay based on our previously described organotypic model of the human lung. This assay allows provocations and subsequent quantitative investigations of DC functional properties under conditions mimicking morphological and functional features of the in vivo parental tissue. We present protocols to set up and prepare tissue models for 4D (x, y, z, time) fluorescence-imaging analysis that allow spatial and temporal studies of human DCs in live epithelial tissue, followed by flow cytometry analysis of DCs retrieved from digested tissue models. This model system can be useful for elucidating incompletely defined pathways controlling DC functional responses to infection and inflammation in lung epithelial tissue, as well as the efficacy of locally administered candidate interventions. © 2014 Society for Leukocyte Biology.

  7. Requirements management and control

    NASA Technical Reports Server (NTRS)

    Robbins, Red

    1993-01-01

    The systems engineering process for thermal nuclear propulsion requirements and configuration definition is described in outline and graphic form. Functional analysis and mission attributes for a Mars exploration mission are also addressed.

  8. Physical and Psychological Correlates of Disability among a Cohort of Individuals with Knee Osteoarthritis

    ERIC Educational Resources Information Center

    Marks, Ray

    2007-01-01

    While the physical correlates of knee osteoarthritis are well documented, less well documented are aspects of psychological functioning that may affect overall health and functional status. This paper describes the findings of a cross-sectional analysis that examined the strength of the relationship between selected psychological factors and the…

  9. Functional Skills of Individuals with Fragile X Syndrome: A Lifespan Cross-Sectional Analysis

    ERIC Educational Resources Information Center

    Bailey, Donald B., Jr.; Raspa, Melissa; Holiday, David; Bishop, Ellen; Olmsted, Murrey

    2009-01-01

    Parents of 1,105 male and 283 female children with fragile X syndrome described functional skill attainment in eating, dressing, toileting, bathing/hygiene, communication, articulation, and reading. The majority of adult children had mastered many skills independently. Most adults were verbal, used the toilet, dressed, ate independently, bathed,…

  10. Dysfunctions in Reading Disability: There's More than Meets the Eye.

    ERIC Educational Resources Information Center

    Fisher, Dennis F.

    Some basic pattern-analyzing functions that occur during the reading process are described in this paper. The functions deal mainly with the analysis of typographical factors such as word shape, spacing, and orientation, but they also interact with contextual variables. The research interpreted in the paper proposes an attentional model of reading…

  11. Demographic Accounting and Model-Building. Education and Development Technical Reports.

    ERIC Educational Resources Information Center

    Stone, Richard

    This report describes and develops a model for coordinating a variety of demographic and social statistics within a single framework. The framework proposed, together with its associated methods of analysis, serves both general and specific functions. The general aim of these functions is to give numerical definition to the pattern of society and…

  12. The Impact of Missing Data on the Detection of Nonuniform Differential Item Functioning

    ERIC Educational Resources Information Center

    Finch, W. Holmes

    2011-01-01

    Missing information is a ubiquitous aspect of data analysis, including responses to items on cognitive and affective instruments. Although the broader statistical literature describes missing data methods, relatively little work has focused on this issue in the context of differential item functioning (DIF) detection. Such prior research has…

  13. A discriminant function approach to ecological site classification in northern New England

    Treesearch

    James M. Fincher; Marie-Louise Smith

    1994-01-01

    Describes one approach to ecologically based classification of upland forest community types of the White and Green Mountain physiographic regions. The classification approach is based on an intensive statistical analysis of the relationship between the communities and soil-site factors. Discriminant functions useful in distinguishing between types based on soil-site...

  14. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze interactions between secondary functions through the use of channelization.

  15. Structural and functional analysis of the ASM p.Ala359Asp mutant that causes acid sphingomyelinase deficiency.

    PubMed

    Acuña, Mariana; Castro-Fernández, Víctor; Latorre, Mauricio; Castro, Juan; Schuchman, Edward H; Guixé, Victoria; González, Mauricio; Zanlungo, Silvana

    2016-10-21

    Niemann-Pick disease (NPD) types A and B are recessive hereditary disorders caused by deficiency in acid sphingomyelinase (ASM). The p.Ala359Asp mutation has been described in several patients, but its functional and structural effects on the protein are unknown. In order to characterize this mutation, we modeled the three-dimensional ASM structure using the recently available crystal structure of mammalian ASM as a template. We found that the p.Ala359Asp mutation is localized in the hydrophobic core and far from the sphingomyelin binding site. However, energy function calculations using statistical potentials indicate that the mutation causes a decrease in ASM stability. Therefore, we investigated the functional effect of the p.Ala359Asp mutation on ASM expression, secretion, localization and activity in human fibroblasts. We found a 3.8% residual ASM activity compared to the wild-type enzyme, without changes in the other parameters evaluated. These results support the hypothesis that the p.Ala359Asp mutation causes structural alterations in the hydrophobic environment where ASM is located, decreasing its enzymatic activity. A similar effect was observed in other previously described NPDB mutations located outside the active site of the enzyme. This work presents the first full-size ASM mutant model described to date, providing a complete analysis of the structural and functional effects of the p.Ala359Asp mutation on the stability and activity of the enzyme. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Guidelines for the functional annotation of microRNAs using the Gene Ontology

    PubMed Central

    D'Eustachio, Peter; Smith, Jennifer R.; Zampetaki, Anna

    2016-01-01

    MicroRNA regulation of developmental and cellular processes is a relatively new field of study, and the available research data have not been organized to enable its inclusion in pathway and network analysis tools. The association of gene products with terms from the Gene Ontology is an effective method to analyze functional data, but until recently there has been no substantial effort dedicated to applying Gene Ontology terms to microRNAs. Consequently, when performing functional analysis of microRNA data sets, researchers have had to rely instead on the functional annotations associated with the genes encoding microRNA targets. In consultation with experts in the field of microRNA research, we have created comprehensive recommendations for the Gene Ontology curation of microRNAs. This curation manual will enable provision of a high-quality, reliable set of functional annotations for the advancement of microRNA research. Here we describe the key aspects of the work, including development of the Gene Ontology to represent this data, standards for describing the data, and guidelines to support curators making these annotations. The full microRNA curation guidelines are available on the GO Consortium wiki (http://wiki.geneontology.org/index.php/MicroRNA_GO_annotation_manual). PMID:26917558

  17. Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.

    1990-01-01

    This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.

  18. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    ERIC Educational Resources Information Center

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  19. Position Description Analysis: A Method for Describing Academic Roles and Functions.

    ERIC Educational Resources Information Center

    Renner, K. Edward; Skibbens, Ronald J.

    1990-01-01

    The Position Description Analysis method for assessing the discrepancy between status quo and specializations needed by institutions to meet new demands and expectations is presented using Dalhousie University (Nova Scotia) as a case study. Dramatic realignment of fields of specialization and change strategies accommodating the aging professoriate…

  20. A Behavior Analysis Approach toward Chronic Food Refusal in Children with Gastrostomy-Tube Dependency.

    ERIC Educational Resources Information Center

    Luiselli, James K.; Luiselli, Tracy Evans

    1995-01-01

    This report describes a behavior analysis treatment approach to establishing oral feeding in children with multiple developmental disabilities and gastrostomy-tube dependency. Pretreatment screening, functional assessment, and treatment are reported as implemented within a behavioral consultation model. A case study illustrates the sequence and…

  1. Creation and genomic analysis of irradiation hybrids in Populus

    Treesearch

    Matthew S. Zinkgraf; K. Haiby; M.C. Lieberman; L. Comai; I.M. Henry; Andrew Groover

    2016-01-01

    Establishing efficient functional genomic systems for creating and characterizing genetic variation in forest trees is challenging. Here we describe protocols for creating novel gene-dosage variation in Populus through gamma-irradiation of pollen, followed by genomic analysis to identify chromosomal regions that have been deleted or inserted in...

  2. Functional analysis of pathogenicity proteins of the potato cyst nematode Globodera rostochiensis using RNAi.

    PubMed

    Chen, Qing; Rehman, S; Smant, G; Jones, John T

    2005-07-01

    RNA interference (RNAi) has been used widely as a tool for examining gene function, and a method that allows its use with plant-parasitic nematodes has recently been described. Here, we use a modified method to analyze the function of secreted beta-1,4-endoglucanases of the potato cyst nematode Globodera rostochiensis, the first in vivo functional analysis of a pathogenicity protein of a plant-parasitic nematode. Knockout of the beta-1,4-endoglucanases reduced the ability of the nematodes to invade roots. We also use RNAi to show that gr-ams-1, a secreted protein of the main sense organs (the amphids), is essential for host location.

  3. Metropolitan Forensic Anthropology Team (MFAT) studies in identification: 1. Race and sex assessment by discriminant function analysis of the postcranial skeleton.

    PubMed

    Taylor, J V; DiBennardo, R; Linares, G H; Goldman, A D; DeForest, P R

    1984-07-01

    A case study is presented to demonstrate the utility of the team approach to the identification of human remains, and to illustrate a methodological innovation developed by MFAT. Case 1 represents the first of several planned case studies, each designed to present new methodological solutions to standard problems in identification. The present case describes a test, by application, of race and sex assessment of the postcranial skeleton by discriminant function analysis.

  4. A free boundary approach to the Rosensweig instability of ferrofluids

    NASA Astrophysics Data System (ADS)

    Parini, Enea; Stylianou, Athanasios

    2018-04-01

    We establish the existence of saddle points for a free boundary problem describing the two-dimensional free surface of a ferrofluid undergoing normal field instability. The starting point is the ferrohydrostatic equations for the magnetic potentials in the ferrofluid and air, and the function describing their interface. These constitute the strong form for the Euler-Lagrange equations of a convex-concave functional, which we extend to include interfaces that are not necessarily graphs of functions. Saddle points are then found by iterating the direct method of the calculus of variations and applying classical results of convex analysis. For the existence part, we assume a general nonlinear magnetization law; for a linear law, we also show, via convex duality, that the saddle point is a constrained minimizer of the relevant energy functional.

  5. Functional Interaction Network Construction and Analysis for Disease Discovery.

    PubMed

    Wu, Guanming; Haw, Robin

    2017-01-01

    Network-based approaches project seemingly unrelated genes or proteins onto a large-scale network context, thereby providing a holistic visualization and analysis platform for genomic data generated from high-throughput experiments, reducing the dimensionality of the data by using network modules and increasing the statistical analysis power. Based on the Reactome database, the most popular and comprehensive open-source biological pathway knowledgebase, we have developed a highly reliable protein functional interaction network covering around 60% of all human genes and an app called ReactomeFIViz for Cytoscape, the most popular biological network visualization and analysis platform. In this chapter, we describe the detailed procedures by which this functional interaction network is constructed: integrating multiple external data sources, extracting functional interactions from human curated pathway databases, building a machine learning classifier called a Naïve Bayesian Classifier, predicting interactions based on the trained Naïve Bayesian Classifier, and finally constructing the functional interaction database. We also provide an example of how to use ReactomeFIViz for performing network-based data analysis for a list of genes.

  6. Design for cyclic loading endurance of composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.

    1993-01-01

    The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing type structures is described. The code performs a complete probabilistic analysis for composites taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of cumulative distribution functions (CDF) and probability density function (PDF) of the fatigue life of a wing type composite structure under different hygrothermal environments subjected to the random pressure. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.

  7. Frequency-phase analysis of resting-state functional MRI

    PubMed Central

    Goelman, Gadi; Dan, Rotem; Růžička, Filip; Bezdicek, Ondrej; Růžička, Evžen; Roth, Jan; Vymazal, Josef; Jech, Robert

    2017-01-01

    We describe an analysis method that characterizes the correlation between coupled time-series functions by their frequencies and phases. It provides a unified framework for simultaneous assessment of frequency and latency of a coupled time-series. The analysis is demonstrated on resting-state functional MRI data of 34 healthy subjects. Interactions between fMRI time-series are represented by cross-correlation (with time-lag) functions. A general linear model is used on the cross-correlation functions to obtain the frequencies and phase-differences of the original time-series. We define symmetric, antisymmetric and asymmetric cross-correlation functions that correspond respectively to in-phase, 90° out-of-phase and any phase difference between a pair of time-series, where the last two were never introduced before. Seed maps of the motor system were calculated to demonstrate the strength and capabilities of the analysis. Unique types of functional connections, their dominant frequencies and phase-differences have been identified. The relation between phase-differences and time-delays is shown. The phase-differences are speculated to inform transfer-time and/or to reflect a difference in the hemodynamic response between regions that are modulated by neurotransmitters concentration. The analysis can be used with any coupled functions in many disciplines including electrophysiology, EEG or MEG in neuroscience. PMID:28272522
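
    The basic building block, a lagged cross-correlation between two time series from which a time delay (and hence a phase at a given frequency) is read off, can be sketched as follows. This is a generic illustration with synthetic signals and placeholder parameters, not the authors' GLM-on-cross-correlation pipeline.

```python
import numpy as np

# Two synthetic "regional" signals: a 0.05 Hz oscillation and a delayed copy.
fs = 2.0                        # sampling rate in Hz (placeholder value)
t = np.arange(0, 300, 1 / fs)
delay_s = 1.5
x = np.sin(2 * np.pi * 0.05 * t) + 0.2 * np.random.default_rng(4).standard_normal(t.size)
y = np.sin(2 * np.pi * 0.05 * (t - delay_s)) + 0.2 * np.random.default_rng(5).standard_normal(t.size)

# Normalized cross-correlation as a function of lag.
xz = (x - x.mean()) / x.std()
yz = (y - y.mean()) / y.std()
cc = np.correlate(yz, xz, mode="full") / xz.size
lags = np.arange(-xz.size + 1, xz.size) / fs

best_lag = lags[np.argmax(cc)]
phase_deg = 360.0 * 0.05 * best_lag            # phase difference at 0.05 Hz
print(f"estimated lag: {best_lag:.2f} s, phase at 0.05 Hz: {phase_deg:.1f} deg")
```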

  8. Nonlinear analysis of a rotor-bearing system using describing functions

    NASA Astrophysics Data System (ADS)

    Maraini, Daniel; Nataraj, C.

    2018-04-01

    This paper presents a technique for modelling the nonlinear behavior of a rotor-bearing system with Hertzian contact, clearance, and rotating unbalance. The rotor-bearing system is separated into linear and nonlinear components, and the nonlinear bearing force is replaced with an equivalent describing function gain. The describing function captures the relationship between the amplitude of the fundamental input to the nonlinearity and the fundamental output. The frequency response is constructed for various values of the clearance parameter, and the results show the presence of a jump resonance in bearings with both clearance and preload. Nonlinear hardening type behavior is observed in the case with clearance and softening behavior is observed for the case with preload. Numerical integration is also carried out on the nonlinear equations of motion showing strong agreement with the approximate solution. This work could easily be extended to include additional nonlinearities that arise from defects, providing a powerful diagnostic tool.
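
    The describing-function gain used here can be reproduced numerically for a simple clearance (deadzone) restoring force: drive the nonlinearity with a sinusoid of amplitude A, extract the fundamental Fourier component of the output, and take the ratio. A minimal sketch, assuming a deadzone stiffness with clearance `c` and contact stiffness `k` (placeholder values, not the paper's bearing model):

```python
import numpy as np

def deadzone_force(x, k=1.0e6, c=1.0e-4):
    """Piecewise-linear bearing force: no force inside the clearance +/- c."""
    return np.where(np.abs(x) > c, k * (x - np.sign(x) * c), 0.0)

def describing_function(amplitude, k=1.0e6, c=1.0e-4, n=4096):
    """Equivalent gain N(A): fundamental of the output over the input amplitude."""
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    x = amplitude * np.sin(theta)
    f = deadzone_force(x, k, c)
    # First in-phase Fourier coefficient of the output waveform.
    b1 = (2.0 / n) * np.sum(f * np.sin(theta))
    return b1 / amplitude

for A in (1.1e-4, 2e-4, 5e-4, 1e-3):
    print(f"A = {A:.1e} m  ->  N(A) = {describing_function(A):.3e} N/m")
# N(A) grows toward k as the amplitude exceeds the clearance, the hardening
# trend that the frequency-response analysis in the paper relies on.
```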

  9. LOGISTIC FUNCTION PROFILE FIT: A least-squares program for fitting interface profiles to an extended logistic function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirchhoff, William H.

    2012-09-15

    The extended logistic function provides a physically reasonable description of interfaces such as depth profiles or line scans of surface topological or compositional features. It describes these interfaces with the minimum number of parameters, namely, position, width, and asymmetry. Logistic Function Profile Fit (LFPF) is a robust, least-squares fitting program in which the nonlinear extended logistic function is linearized by a Taylor series expansion (equivalent to a Newton-Raphson approach) with no apparent introduction of bias in the analysis. The program provides reliable confidence limits for the parameters when systematic errors are minimal and provides a display of the residuals from the fit for the detection of systematic errors. The program will aid researchers in applying ASTM E1636-10, 'Standard practice for analytically describing sputter-depth-profile and linescan-profile data by an extended logistic function,' and may also prove useful in applying ISO 18516: 2006, 'Surface chemical analysis-Auger electron spectroscopy and x-ray photoelectron spectroscopy-determination of lateral resolution.' Examples are given of LFPF fits to a secondary ion mass spectrometry depth profile, an Auger surface line scan, and synthetic data generated to exhibit known systematic errors for examining the significance of such errors to the extrapolation of partial profiles.
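
    The kind of fit LFPF performs can be approximated with a standard nonlinear least-squares routine. The sketch below fits a basic symmetric logistic profile (position, width, plateau levels) to a synthetic depth profile; the ASTM extended form adds an asymmetry parameter, which is omitted here, and all names and data are placeholders rather than LFPF itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_profile(z, y_low, y_high, z0, w):
    """Sigmoidal interface profile: plateau y_low -> y_high around depth z0."""
    return y_low + (y_high - y_low) / (1.0 + np.exp(-(z - z0) / w))

# Synthetic sputter-depth-profile data (placeholder, not instrument output).
rng = np.random.default_rng(6)
depth = np.linspace(0.0, 100.0, 200)                      # nm
signal = (logistic_profile(depth, 0.05, 1.0, 42.0, 3.5)
          + 0.02 * rng.standard_normal(depth.size))

p0 = [signal.min(), signal.max(), depth[np.argmax(np.gradient(signal))], 5.0]
popt, pcov = curve_fit(logistic_profile, depth, signal, p0=p0)
perr = np.sqrt(np.diag(pcov))                              # 1-sigma parameter errors
print("position, width:", popt[2].round(2), popt[3].round(2))
print("approximate 1-sigma errors:", perr.round(3))
```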

  10. Sensor planning for moving targets

    NASA Astrophysics Data System (ADS)

    Musman, Scott A.; Lehner, Paul; Elsaesser, Chris

    1994-10-01

    Planning a search for moving ground targets is difficult for humans and computationally intractable. This paper describes a technique to solve such problems. The main idea is to combine probability of detection assessments with computational search heuristics to generate sensor plans which approximately maximize either the probability of detection or a user- specified knowledge function (e.g., determining the target's probable destination; locating the enemy tanks). In contrast to super computer-based moving target search planning, our technique has been implemented using workstation technology. The data structures generated by sensor planning can be used to evaluate sensor reports during plan execution. Our system revises its objective function with each sensor report, allowing the user to assess both the current situation as well as the expected value of future information. This capability is particularly useful in situations involving a high rate of sensor reporting, helping the user focus his attention on sensors reports most pertinent to current needs. Our planning approach is implemented in a three layer architecture. The layers are: mobility analysis, followed by sensor coverage analysis, and concluding with sensor plan analysis. It is possible using these layers to describe the physical, spatial, and temporal characteristics of a scenario in the first two layers, and customize the final analysis to specific intelligence objectives. The architecture also allows a user to customize operational parameters in each of the three major components of the system. As examples of these performance options, we briefly describe the mobility analysis and discuss issues affecting sensor plan analysis.

  11. MALDI MS-based Composition Analysis of the Polymerization Reaction of Toluene Diisocyanate (TDI) and Ethylene Glycol (EG).

    PubMed

    Ahn, Yeong Hee; Lee, Yeon Jung; Kim, Sung Ho

    2015-01-01

    This study describes an MS-based analysis method for monitoring changes in polymer composition during the polyaddition polymerization reaction of toluene diisocyanate (TDI) and ethylene glycol (EG). The polymerization was monitored as a function of reaction time using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS). The resulting series of polymer adducts terminated with various end-functional groups were precisely identified and the relative compositions of those series were estimated. A new MALDI MS data interpretation method was developed, consisting of a peak-resolving algorithm for overlapping peaks in MALDI MS spectra, a retrosynthetic analysis for the generation of reduced unit mass peaks, and a Gaussian fit-based selection of the most prominent polymer series among the reconstructed unit mass peaks. This method of data interpretation avoids errors originating from side reactions due to the presence of trace water in the reaction mixture or MALDI analysis. Quantitative changes in the relative compositions of the resulting polymer products were monitored as a function of reaction time. These results demonstrate that the mass data interpretation method described herein can be a powerful tool for estimating quantitative changes in the compositions of polymer products arising during a polymerization reaction.

  12. Flux frequency analysis of seasonally dry ecosystem fluxes in two unique biomes of Sonora Mexico

    NASA Astrophysics Data System (ADS)

    Verduzco, V. S.; Yepez, E. A.; Robles-Morua, A.; Garatuza, J.; Rodriguez, J. C.; Watts, C.

    2013-05-01

    Complex dynamics arising from the interactions of ecosystem processes make it difficult to model the behavior of ecosystem fluxes of carbon and water in response to variations in environmental and biological drivers. Although process-oriented ecosystem models are critical tools for studying land-atmosphere fluxes, their validity depends on the appropriate parameterization of equations describing temporal and spatial changes of model state variables and their interactions. This constraint often leads to discrepancies between model simulations and observed data that reduce model reliability, especially in arid and semiarid ecosystems. In semiarid northwestern Mexico, ecosystem processes are fundamentally controlled by the seasonality of water and the intermittence of rain pulses, conditions that require calibration of specific fitting functions to describe the response of ecosystem variables (i.e. NEE, GPP, ET, respiration) to these wetting and drying periods. The goal is to find functions that describe the magnitude of ecosystem fluxes during individual rain pulses and the seasonality of the ecosystem. Relying on five years of eddy covariance flux data from a tropical dry forest and a subtropical shrubland, we present a flux frequency analysis that describes the variation of net ecosystem exchange (NEE) of CO2 to highlight the relevance of pulse-driven dynamics controlling this flux. Preliminary results of the flux frequency analysis of NEE indicate that these ecosystems are strongly controlled by the frequency distribution of rain. Also, semi-empirical fitting functions for NEE, GPP, ET and respiration applied at specific rain pulses do not agree with season-long statistically generated simulations. Seasonality and the intrinsic nature of individual pulses have different effects on ecosystem flux responses. This suggests that relationships between the nature of seasonality and individual pulses can help improve the parameterization of process-oriented ecosystem models.

  13. Graph analysis of functional brain networks: practical issues in translational neuroscience

    PubMed Central

    De Vico Fallani, Fabrizio; Richiardi, Jonas; Chavez, Mario; Achard, Sophie

    2014-01-01

    The brain can be regarded as a network: a connected system where nodes, or units, represent different specialized regions and links, or connections, represent communication pathways. From a functional perspective, communication is coded by temporal dependence between the activities of different brain areas. In the last decade, the abstract representation of the brain as a graph has made it possible to visualize functional brain networks and describe their non-trivial topological properties in a compact and objective way. Nowadays, the use of graph analysis in translational neuroscience has become essential to quantify brain dysfunctions in terms of aberrant reconfiguration of functional brain networks. Despite its evident impact, graph analysis of functional brain networks is not a simple toolbox that can be blindly applied to brain signals. On the one hand, it requires know-how of all the methodological steps of the pipeline that manipulate the input brain signals and extract the functional network properties. On the other hand, knowledge of the neural phenomenon under study is required to perform physiologically relevant analysis. The aim of this review is to provide practical indications to make sense of brain network analysis and to counter counterproductive attitudes. PMID:25180301
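
    A minimal version of the pipeline described, going from a functional connectivity matrix to a few graph metrics, might look like the sketch below. It uses a random correlation-like matrix and the networkx package purely for illustration; thresholding choices, node definitions, and metric selection are exactly the "practical issues" the review discusses and are assumptions here.

```python
import numpy as np
import networkx as nx

# Placeholder "functional connectivity" matrix for 20 regions.
rng = np.random.default_rng(7)
signals = rng.standard_normal((200, 20))          # time points x regions
fc = np.corrcoef(signals, rowvar=False)
np.fill_diagonal(fc, 0.0)

# Keep the strongest 10% of connections (one of many possible thresholding rules).
threshold = np.quantile(np.abs(fc), 0.90)
adjacency = (np.abs(fc) >= threshold).astype(int)

graph = nx.from_numpy_array(adjacency)
print("mean degree:", np.mean([d for _, d in graph.degree()]))
print("average clustering:", round(nx.average_clustering(graph), 3))
print("global efficiency:", round(nx.global_efficiency(graph), 3))
```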

  14. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Functional modeling techniques or object-oriented graphical representations, which are more useful to someone trying to understand the general design or high level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function oriented software development, while taking advantage of the descriptive power available in object oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  15. Spectrum orbit utilization program technical manual SOUP5 Version 3.8

    NASA Technical Reports Server (NTRS)

    Davidson, J.; Ottey, H. R.; Sawitz, P.; Zusman, F. S.

    1984-01-01

    The underlying engineering and mathematical models as well as the computational methods used by the SOUP5 analysis programs, which are part of the R2BCSAT-83 Broadcast Satellite Computational System, are described. Included are the algorithms used to calculate the technical parameters and references to the relevant technical literature. The system provides the following capabilities: requirements file maintenance, data base maintenance, elliptical satellite beam fitting to service areas, plan synthesis from specified requirements, plan analysis, and report generation/query. Each of these functions are briefly described.

  16. Angular velocities, angular accelerations, and coriolis accelerations

    NASA Technical Reports Server (NTRS)

    Graybiel, A.

    1975-01-01

    Weightlessness, rotating environments, and the mathematical analysis of Coriolis acceleration are described for man's biologically effective force environments. Effects on the vestibular system are summarized, including the end organs, functional neurology, and input-output relations. Ground-based studies in preparation for space missions are examined, including functional tests, provocative tests, adaptive capacity tests, simulation studies, and antimotion sickness.

  17. Russian Function Catalog and Rolebooks. Methods for Determining Language Objectives and Criteria, Volume XIII.

    ERIC Educational Resources Information Center

    Setzler, Hubert H., Jr.; And Others

    A Russian Function Catalog and Instructor and Advisor Rolebooks for Russian are presented. The catalog and rolebooks are part of the communication/language objectives-based system (C/LOBS), which supports the front-end analysis efforts of the Defense Language Institute Foreign Language Center. The C/LOBS projects, which is described in 13 volumes…

  18. Prereaders' Understanding of the Function of Print: Characteristic Trends in the Process.

    ERIC Educational Resources Information Center

    Isom, Bess A.; Casteel, Carolyn P.

    1987-01-01

    Assessed the knowledge of print of 324 children aged 3, 4, and 5 years by means of a set of 20 cards containing commercial logos and signs. Analysis of individual responses resulted in seven common categories describing the behavior of children as they progressed to recognition of the function of print. (NH)

  19. Sequential Modification and the Identification of Instructional Components Occasioning Self-Injurious Behavior

    ERIC Educational Resources Information Center

    Reed, Derek D.; Luiselli, James K.; Morizio, Lindsey C.; Child, Stephanie N.

    2010-01-01

    The present study describes a case of a 9-year-old girl diagnosed on the autism spectrum who averaged nearly 1200 hand-to-head self-injuries (+attempts) per school day. Given the resources of the school and the significance of the self-injurious behavior (SIB), analog functional analysis is not possible. Moreover, functional assessment results…

  20. Methodology for the systems engineering process. Volume 1: System functional activities

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    Systems engineering is examined in terms of functional activities that are performed in the conduct of a system definition/design, and system development is described in a parametric analysis that combines functions, performance, and design variables. Emphasis is placed on identification of activities performed by design organizations, design specialty groups, as well as a central systems engineering organizational element. Identification of specific roles and responsibilities for doing functions, and monitoring and controlling activities within the system development operation are also emphasized.

  1. The use of rational functions in numerical quadrature

    NASA Astrophysics Data System (ADS)

    Gautschi, Walter

    2001-08-01

    Quadrature problems involving functions that have poles outside the interval of integration can profitably be solved by methods that are exact not only for polynomials of appropriate degree, but also for rational functions having the same (or the most important) poles as the function to be integrated. Constructive and computational tools for accomplishing this are described and illustrated in a number of quadrature contexts. The superiority of such rational/polynomial methods is shown by an analysis of the remainder term and documented by numerical examples.
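
    The motivation can be seen numerically: a standard Gauss-Legendre rule converges slowly for an integrand with a pole just outside the interval, which is precisely the situation where rational rules pay off. A small sketch of the problem only (it does not implement the rational quadrature itself; the integrand and pole location are arbitrary choices):

```python
import numpy as np

# Integrand with a pole at x = -1.01, just outside the interval [-1, 1].
f = lambda x: 1.0 / (x + 1.01)
exact = np.log(2.01 / 0.01)          # analytic value of the integral over [-1, 1]

for n in (4, 8, 16, 32, 64):
    nodes, weights = np.polynomial.legendre.leggauss(n)
    approx = np.dot(weights, f(nodes))
    print(f"n = {n:3d}  Gauss-Legendre error = {abs(approx - exact):.2e}")
# The slow decay of the error with n is what rational (pole-matched) rules address.
```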

  2. CTEQ-TEA parton distribution functions and HERA Run I and II combined data

    NASA Astrophysics Data System (ADS)

    Hou, Tie-Jiun; Dulat, Sayipjamal; Gao, Jun; Guzzi, Marco; Huston, Joey; Nadolsky, Pavel; Pumplin, Jon; Schmidt, Carl; Stump, Daniel; Yuan, C.-P.

    2017-02-01

    We analyze the impact of the recent HERA Run I+II combination of inclusive deep inelastic scattering cross-section data on the CT14 global analysis of parton distribution functions (PDFs). New PDFs at next-to-leading order and next-to-next-to-leading order, called CT14HERA2, are obtained by a refit of the CT14 data ensembles, in which the HERA Run I combined measurements are replaced by the new HERA Run I+II combination. The CT14 functional parametrization of PDFs is flexible enough to allow good descriptions of different flavor combinations, so we use the same parametrization for CT14HERA2 but with an additional shape parameter for describing the strange quark PDF. We find that the HERA I+II data can be fit reasonably well, and both CT14 and CT14HERA2 PDFs can describe equally well the non-HERA data included in our global analysis. Because the CT14 and CT14HERA2 PDFs agree well within the PDF errors, we continue to recommend CT14 PDFs for the analysis of LHC Run 2 experiments.

  3. A statistical approach to deriving subsystem specifications. [for spacecraft shock and vibrational environment tests

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1974-01-01

    In order to produce cost effective environmental test programs, the test specifications must be realistic and to be useful, they must be available early in the life of a program. This paper describes a method for achieving such specifications for subsystems by utilizing the results of a statistical analysis of data acquired at subsystem mounting locations during system level environmental tests. The paper describes the details of this statistical analysis. The resultant recommended levels are a function of the subsystems' mounting location in the spacecraft. Methods of determining this mounting 'zone' are described. Recommendations are then made as to which of the various problem areas encountered should be pursued further.

  4. Linearised and non-linearised isotherm models optimization analysis by error functions and statistical means

    PubMed Central

    2014-01-01

    In adsorption studies, describing the sorption process and evaluating the best-fitting isotherm model is a key analysis for investigating the theoretical hypothesis. Hence, numerous statistical analyses have been used extensively to compare the experimental equilibrium adsorption values with the predicted equilibrium values. Several statistical error analyses were carried out. In the present study, the following statistical measures were used to evaluate adsorption isotherm model fitness: the Pearson correlation, the coefficient of determination and the Chi-square test. An ANOVA test was carried out to evaluate the significance of the various error functions, and the coefficient of dispersion was evaluated for linearised and non-linearised models. The adsorption of phenol onto natural soil (local name Kalathur soil) was carried out in batch mode at 30 ± 2 °C. To estimate the isotherm parameters and obtain a holistic view of the analysis, the models were compared in both linear and non-linear isotherm forms. The results revealed that the above-mentioned error functions and statistical measures could be used to determine the best-fitting isotherm. PMID:25018878
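
    The kind of comparison described can be illustrated by fitting a single isotherm model to equilibrium data and computing a few of the error functions mentioned. The sketch below uses synthetic data and a Langmuir isotherm purely as an example; the study itself weighs several models and additional statistics, and all numbers here are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    """Langmuir isotherm: qe = qmax * kl * ce / (1 + kl * ce)."""
    return qmax * kl * ce / (1.0 + kl * ce)

# Synthetic equilibrium data (placeholder, not the phenol/Kalathur-soil data).
rng = np.random.default_rng(8)
ce = np.linspace(0.5, 50.0, 15)                        # mg/L
qe = langmuir(ce, 12.0, 0.15) + 0.3 * rng.standard_normal(ce.size)

popt, _ = curve_fit(langmuir, ce, qe, p0=[10.0, 0.1])
qe_pred = langmuir(ce, *popt)

# A few of the error/statistical functions used to judge isotherm fitness.
sse = np.sum((qe - qe_pred) ** 2)
chi_sq = np.sum((qe - qe_pred) ** 2 / qe_pred)
r2 = 1.0 - sse / np.sum((qe - qe.mean()) ** 2)
print(f"qmax = {popt[0]:.2f}, KL = {popt[1]:.3f}, SSE = {sse:.3f}, "
      f"chi-square = {chi_sq:.3f}, R^2 = {r2:.3f}")
```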

  5. Formula 1--A Mathematical Microworld with CAS: Analysis of Learning Opportunities and Experiences with Students

    ERIC Educational Resources Information Center

    Gerny, Marianne; Alpers, Burkhard

    2004-01-01

    In this article we describe a mathematical microworld for investigating car motion on a racing course and its use with a group of grade 12 students. The microworld is concerned with the mathematical construction of courses and functions which describe car motion. It is implemented in the computer algebra system, Maple[R], which provides the means…

  6. Modelling the human operator of slowly responding systems using linear models

    NASA Technical Reports Server (NTRS)

    Veldhuyzen, W.

    1977-01-01

    Control of slowly responding systems, such as a helmsman steering a large ship, is examined. It is shown that describing function techniques are useful in analyzing the control behavior of the helmsman. Models are developed to describe the helmsman's control behavior, and it is shown that the crossover model is applicable to the analysis of control of slowly responding systems.
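
    The crossover model referred to here says that, near the crossover frequency, the combined operator-vehicle open-loop transfer function behaves approximately like Y(jω) = ω_c e^(-jωτ)/(jω). A minimal sketch of its frequency response, with placeholder values for the crossover frequency and effective delay (not parameters from this study):

```python
import numpy as np

omega_c = 2.0      # crossover frequency, rad/s (placeholder)
tau = 0.3          # effective time delay, s (placeholder)

omega = np.logspace(-1, 1, 200)                       # rad/s
Y_ol = omega_c * np.exp(-1j * omega * tau) / (1j * omega)

gain_db = 20.0 * np.log10(np.abs(Y_ol))
phase_deg = np.degrees(np.angle(Y_ol))

# Phase margin read off at the crossover frequency (|Y_ol| = 1 at omega = omega_c).
idx = np.argmin(np.abs(omega - omega_c))
print(f"phase at crossover: {phase_deg[idx]:.1f} deg "
      f"(phase margin ~ {180.0 + phase_deg[idx]:.1f} deg)")
```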

  7. Digital controller design: Continuous and discrete describing function analysis of the IPS system

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The dynamic equations and the mathematical model of the continuous-data IPS control system are developed. The IPS model considered included one flexible body mode and was hardmounted to the Orbiter/Pallet. The model contains equations describing a torque feed-forward loop (using accelerometers as inputs) which will aid in reducing the pointing errors caused by Orbiter disturbances.

  8. HomER: a review of time-series analysis methods for near-infrared spectroscopy of the brain

    PubMed Central

    Huppert, Theodore J.; Diamond, Solomon G.; Franceschini, Maria A.; Boas, David A.

    2009-01-01

    Near-infrared spectroscopy (NIRS) is a noninvasive neuroimaging tool for studying evoked hemodynamic changes within the brain. By this technique, changes in the optical absorption of light are recorded over time and are used to estimate the functionally evoked changes in cerebral oxyhemoglobin and deoxyhemoglobin concentrations that result from local cerebral vascular and oxygen metabolic effects during brain activity. Over the past three decades this technology has continued to grow, and today NIRS studies have found many niche applications in the fields of psychology, physiology, and cerebral pathology. The growing popularity of this technique is in part associated with a lower cost and increased portability of NIRS equipment when compared with other imaging modalities, such as functional magnetic resonance imaging and positron emission tomography. With this increasing number of applications, new techniques for the processing, analysis, and interpretation of NIRS data are continually being developed. We review some of the time-series and functional analysis techniques that are currently used in NIRS studies, we describe the practical implementation of various signal processing techniques for removing physiological, instrumental, and motion-artifact noise from optical data, and we discuss the unique aspects of NIRS analysis in comparison with other brain imaging modalities. These methods are described within the context of the MATLAB-based graphical user interface program, HomER, which we have developed and distributed to facilitate the processing of optical functional brain data. PMID:19340120

  9. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  10. Risk Perception as the Quantitative Parameter of Ethics and Responsibility in Disaster Study

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro

    2014-05-01

    The intensity of natural disaster impacts is increasing as climate and ecological changes spread. The frequency of disasters is increasing, and the recurrence of catastrophes is characterized by essential spatial heterogeneity. The distribution of losses is fundamentally non-linear and reflects the complex interrelation of natural, social and environmental factors in the changing world over a multi-scale range. We are faced with new types of risks, which require a comprehensive security concept. A modern understanding of complex security and complex risk management requires analysis of all natural and social phenomena, involvement of all available data, construction of advanced analytical tools, and transformation of our perception of risk and security issues. Traditional deterministic models used for risk analysis are difficult to apply to the analysis of social issues, as well as to the quantification of multi-scale, multi-physics phenomena. Parametric methods are also not fully effective because the system analyzed is essentially non-ergodic. Stochastic models of risk analysis are applicable to the quantitative analysis of human behavior and risk perception. Risk perception issues were described in the framework of risk analysis models. Risk is presented as the superposition of a distribution function f(x,y) and a damage function p(x,y): P → δ Σ_{x,y} f(x,y) p(x,y). As shown, risk perception essentially influences the damage function. Based on prospect theory and decision making under uncertainty, cognitive bias, and the handling of risk, a modification of the damage function is proposed: p(x,y | α(t)). The modified damage function includes an awareness function α(t), which combines a risk perception function (rp) and a function of education and long-term experience (c) as α(t) → (c - rp). The education function c(t) describes the trend of education and experience. The risk perception function rp reflects the security concept of human behavior and is the basis for prediction of socio-economic and socio-ecological processes. There is also an important positive feedback from the risk perception function to the distribution function. Risk perception depends essentially on the impact of recent short-term events in a multi-agent medium; it is a managed function. A generalized form of the awareness function is proposed: α(t) = δ Σ_i (c - rp_i). Using this form, separate parameters have been calculated. For example, the risk perception function accounts for about 15-55% of the awareness function, depending on the education, age and social status of the people involved. It was also estimated that the fraction of the awareness function in the damage function, and hence in the risk function, is about 15-20%. This means that no less than 8-12% of direct losses depend on the short-term responsible behavior of 'information agents': the social activity of experts and scientists, and correct discussion of ethical issues in geosciences and the media. A further 6-9% of losses are connected with the level of public and professional education. This area should also be a field of responsibility for geoscientists.
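    The following Python sketch illustrates, with entirely invented grids and coefficients, the superposition described above: total risk as the sum over space of a hazard distribution f(x,y) times a damage function p(x,y), with the damage term modulated by an awareness function α(t) = c - rp. It is a toy of the formula, not the authors' model.

```python
# Illustrative sketch of the risk superposition: total risk as the sum of a hazard
# distribution f(x,y) times a damage function p(x,y), with the damage function scaled
# by an awareness term alpha = c - rp. All grids and coefficients below are invented.
import numpy as np

nx, ny = 50, 50
x, y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))

f = np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.05)   # hazard distribution f(x,y)
f /= f.sum()
p = 100.0 * (0.2 + 0.8 * x * y)                          # baseline damage function p(x,y)

def awareness(c, rp):
    """alpha = c - rp: education/experience minus risk-perception bias (both in [0, 1])."""
    return np.clip(c - rp, 0.0, 1.0)

def total_risk(f, p, alpha, reduction=0.2):
    """Risk P ~ sum_xy f*p, with damage reduced by up to `reduction` as awareness grows."""
    p_mod = p * (1.0 - reduction * alpha)                # p(x,y | alpha(t))
    return np.sum(f * p_mod)

for c, rp in [(0.3, 0.2), (0.7, 0.2), (0.9, 0.1)]:
    print(f"c={c}, rp={rp}: risk = {total_risk(f, p, awareness(c, rp)):.2f}")
```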

  11. Rasch Analysis for Instrument Development: Why, When, and How?

    ERIC Educational Resources Information Center

    Boone, William J.

    2016-01-01

    This essay describes Rasch analysis psychometric techniques and how such techniques can be used by life sciences education researchers to guide the development and use of surveys and tests. Specifically, Rasch techniques can be used to document and evaluate the measurement functioning of such instruments. Rasch techniques also allow researchers to…

  12. 77 FR 45282 - NRC Position on the Relationship Between General Design Criteria and Technical Specification...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ..., are described in the final safety analysis report (FSAR). The staff safety evaluation documents the acceptability of these analyses, and it is the combination of the FSAR analyses and the staff safety evaluation... analysis, maintain their capability to perform their safety functions. Technical Specification Operability...

  13. Publication Bias in Research Synthesis: Sensitivity Analysis Using A Priori Weight Functions

    ERIC Educational Resources Information Center

    Vevea, Jack L.; Woods, Carol M.

    2005-01-01

    Publication bias, sometimes known as the "file-drawer problem" or "funnel-plot asymmetry," is common in empirical research. The authors review the implications of publication bias for quantitative research synthesis (meta-analysis) and describe existing techniques for detecting and correcting it. A new approach is proposed that is suitable for…

  14. Mathematical modelling and linear stability analysis of laser fusion cutting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hermanns, Torsten; Schulz, Wolfgang; Vossen, Georg

    A model for laser fusion cutting is presented and investigated by linear stability analysis in order to study the tendency for dynamic behavior and subsequent ripple formation. The result is a so-called stability function that describes the correlation between the process setting values and the amount of dynamic behavior the process exhibits.

  15. Standardized residual as response function for order identification of multi input intervention analysis

    NASA Astrophysics Data System (ADS)

    Suhartono, Lee, Muhammad Hisyam; Rezeki, Sri

    2017-05-01

    Intervention analysis is a statistical model in the family of time series analysis that is widely used to describe the effect of an intervention caused by external or internal factors. An example of an external factor that often occurs in Indonesia is a disaster, whether natural or man-made. The main purpose of this paper is to provide the results of theoretical studies on the identification step for determining the order of a multi-input intervention analysis, used for evaluating the magnitude and duration of the impact of interventions on time series data. The theoretical results showed that standardized residuals can properly be used as the response function for determining the order of a multi-input intervention model. These results are then applied to evaluate the impact of a disaster in a real case in Indonesia, i.e., the magnitude and duration of the impact of the Lapindo mud on the volume of vehicles on the highway. Moreover, the empirical results showed that the multi-input intervention model can accurately describe and explain the magnitude and duration of the impact of disasters on time series data.

  16. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor is determined, and the functional relationship among all the factors is established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
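    A minimal sketch of the Monte Carlo step described above is given below in Python: each uncertain factor is sampled from an assumed distribution, the samples are combined through their functional relationship, and the result is summarized as a complementary cumulative distribution function. The factor distributions are illustrative placeholders, not the source terms from the safety analysis.

```python
# Minimal sketch: sample each uncertain factor, combine them into a consequence value,
# and report the result as a complementary cumulative distribution function (CCDF).
# Distributions and factor names are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

source_term = rng.lognormal(mean=0.0, sigma=1.0, size=n)   # released activity factor
dispersion  = rng.lognormal(mean=-1.0, sigma=0.5, size=n)  # atmospheric dispersion factor
dose_factor = rng.uniform(0.5, 1.5, size=n)                # dose-to-health-effect factor

consequences = source_term * dispersion * dose_factor      # functional combination of factors

# Full CCDF curve (e.g., for plotting): P(consequence >= value)
values = np.sort(consequences)
ccdf = 1.0 - np.arange(1, n + 1) / n

for level in (1.0, 5.0, 20.0):
    print(f"P(consequence >= {level:5.1f}) = {np.mean(consequences >= level):.5f}")
```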

  17. Structural Tailoring of Advanced Turboprops (STAT). Theoretical manual

    NASA Technical Reports Server (NTRS)

    Brown, K. W.

    1992-01-01

    This manual describes the theories in the Structural Tailoring of Advanced Turboprops (STAT) computer program, which was developed to perform numerical optimizations on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either direct operating cost or aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. The STAT analyses include an aerodynamic efficiency evaluation, a finite element stress and vibration analysis, an acoustic analysis, a flutter analysis, and a once-per-revolution (1-p) forced response life prediction capability. The STAT constraints include blade stresses, blade resonances, flutter, tip displacements, and a 1-P forced response life fraction. The STAT variables include all blade internal and external geometry parameters needed to define a composite material blade. The STAT objective function is dependent upon a blade baseline definition which the user supplies to describe a current blade design for cost optimization or for the tailoring of an aeroelastic scale model.

  18. Structural Tailoring of Advanced Turboprops (STAT). Theoretical manual

    NASA Astrophysics Data System (ADS)

    Brown, K. W.

    1992-10-01

    This manual describes the theories in the Structural Tailoring of Advanced Turboprops (STAT) computer program, which was developed to perform numerical optimizations on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either direct operating cost or aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. The STAT analyses include an aerodynamic efficiency evaluation, a finite element stress and vibration analysis, an acoustic analysis, a flutter analysis, and a once-per-revolution (1-p) forced response life prediction capability. The STAT constraints include blade stresses, blade resonances, flutter, tip displacements, and a 1-P forced response life fraction. The STAT variables include all blade internal and external geometry parameters needed to define a composite material blade. The STAT objective function is dependent upon a blade baseline definition which the user supplies to describe a current blade design for cost optimization or for the tailoring of an aeroelastic scale model.

  19. Collision Avoidance Functional Requirements for Step 1. Revision 6

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This Functional Requirements Document (FRD) describes the flow of requirements from the high level operational objectives down to the functional requirements specific to cooperative collision avoidance for high altitude, long endurance unmanned aircraft systems. These are further decomposed into performance and safety guidelines that are backed up by analysis or references to various documents or research findings. The FRD should be considered when establishing future policies, procedures, and standards pertaining to cooperative collision avoidance.

  20. Social network types and functional dependency in older adults in Mexico.

    PubMed

    Doubova Dubova, Svetlana Vladislavovna; Pérez-Cuevas, Ricardo; Espinosa-Alarcón, Patricia; Flores-Hernández, Sergio

    2010-02-27

    Social networks play a key role in caring for older adults. A better understanding of the characteristics of different social networks types (TSNs) in a given community provides useful information for designing policies to care for this age group. Therefore this study has three objectives: 1) To derive the TSNs among older adults affiliated with the Mexican Institute of Social Security; 2) To describe the main characteristics of the older adults in each TSN, including the instrumental and economic support they receive and their satisfaction with the network; 3) To determine the association between functional dependency and the type of social network. Secondary data analysis of the 2006 Survey of Autonomy and Dependency (N = 3,348). The TSNs were identified using the structural approach and cluster analysis. The association between functional dependency and the TSNs was evaluated with Poisson regression with robust variance analysis in which socio-demographic characteristics, lifestyle and medical history covariates were included. We identified five TSNs: diverse with community participation (12.1%), diverse without community participation (44.3%); widowed (32.0%); nonfriends-restricted (7.6%); nonfamily-restricted (4.0%). Older adults belonging to widowed and restricted networks showed a higher proportion of dependency, negative self-rated health and depression. Older adults with functional dependency more likely belonged to a widowed network (adjusted prevalence ratio 1.5; 95%CI: 1.1-2.1). The derived TSNs were similar to those described in developed countries. However, we identified the existence of a diverse network without community participation and a widowed network that have not been previously described. These TSNs and restricted networks represent a potential unmet need of social security affiliates.

  1. Inference of Ancestry in Forensic Analysis II: Analysis of Genetic Data.

    PubMed

    Santos, Carla; Phillips, Chris; Gomez-Tato, A; Alvarez-Dios, J; Carracedo, Ángel; Lareu, Maria Victoria

    2016-01-01

    Three approaches applicable to the analysis of forensic ancestry-informative marker data-STRUCTURE, principal component analysis, and the Snipper Bayesian classification system-are reviewed. Detailed step-by-step guidance is provided for adjusting parameter settings in STRUCTURE with particular regard to their effect when differentiating populations. Several enhancements to the Snipper online forensic classification portal are described, highlighting the added functionality they bring to particular aspects of ancestry-informative SNP analysis in a forensic context.

  2. Dynamical analysis for a scalar-tensor model with kinetic and nonminimal couplings

    NASA Astrophysics Data System (ADS)

    Granda, L. N.; Jimenez, D. F.

    We study the autonomous system for a scalar-tensor model of dark energy with nonminimal coupling to curvature and nonminimal kinetic coupling to the Einstein tensor. The critical points describe important stable asymptotic scenarios including quintessence, phantom and de Sitter attractor solutions. Two functional forms for the coupling functions and the scalar potential were considered: power-law and exponential functions of the scalar field. For power-law couplings, the restrictions on stable quintessence and phantom solutions lead to asymptotic freedom regime for the gravitational interaction. For the exponential functions, the stable quintessence, phantom or de Sitter solutions allow asymptotic behaviors where the effective Newtonian coupling can reach either the asymptotic freedom regime or constant value. The phantom solutions could be realized without appealing to ghost degrees of freedom. Transient inflationary and radiation dominated phases can also be described.

  3. Enunciative categories in the description of language functioning of mothers and infants aged 1-4 months.

    PubMed

    Kruel, Cristina Saling; Rechia, Inaê Costa; Oliveira, Luciéle Dias; Souza, Ana Paula Ramos de

    2016-01-01

    To present categories which explain the language functioning between infants and their mothers from Benveniste's concept of semiotic system, and verify whether such categories can be described numerically. Four mother-infant dyads were monitored in three stages. The first study consisted of a qualitative analysis of the transcribed video recordings conducted in each stage. We intended to identify the enunciative principles associated with the relationship between the semiotic system of the infant's body and their mother's language, namely, the principles of interpretancy and homology. The other study was conducted by means of a descriptive numerical analysis of the enunciative categories and the infant caregiver scale of behavior, using the ELAN software (EUDICO Linguistic Anotador). Mutuality in mother-infant interactions was observed in most of the scenes analyzed. Productive enunciative categories demonstrated in the infant's demand/mother's interpretation relation was identified in homology and interpretancy. It was also possible to use these categories to describe the mother-infant interactions numerically. In addition, other categories emerged because there are other subtypes of maternal productions not directly related to infant demand. This shows that infants are exposed to language of heterogeneous characteristics. The concept of semiotic system allowed the proposition of language functioning categories identifiable in the mother-infant relationship. Such categories were described numerically.

  4. Description of a user-oriented geographic information system - The resource analysis program

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Mokma, D. L.

    1980-01-01

    This paper describes the Resource Analysis Program, an applied geographic information system. Several applications are presented which utilized soil, and other natural resource data, to develop integrated maps and data analyses. These applications demonstrate the methods of analysis and the philosophy of approach used in the mapping system. The applications are evaluated in reference to four major needs of a functional mapping system: data capture, data libraries, data analysis, and mapping and data display. These four criteria are then used to describe an effort to develop the next generation of applied mapping systems. This approach uses inexpensive microcomputers for field applications and should prove to be a viable entry point for users heretofore unable or unwilling to venture into applied computer mapping.

  5. Mutational Analysis of Escherichia coli MoeA: Two Functional Activities Map to the Active Site Cleft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols,J.; Xiang, S.; Schindelin, H.

    2007-01-01

    The molybdenum cofactor is ubiquitous in nature, and the pathway for Moco biosynthesis is conserved in all three domains of life. Recent work has helped to illuminate one of the most enigmatic steps in Moco biosynthesis, ligation of metal to molybdopterin (the organic component of the cofactor) to form the active cofactor. In Escherichia coli, the MoeA protein mediates ligation of Mo to molybdopterin while the MogA protein enhances this process in an ATP-dependent manner. The X-ray crystal structures for both proteins have been previously described as well as two essential MogA residues, Asp49 and Asp82. Here we describe a more detailed mutational analysis of the MoeA protein. Variants of conserved residues at the putative active site of MoeA were analyzed for a loss of function in two different, previously described assays, one employing moeA⁻ crude extracts and the other utilizing a defined system. Oddly, no correlation was observed between the activity in the two assays. In fact, our results showed a general trend toward an inverse relationship between the activity in each assay. Moco binding studies indicated a strong correlation between a variant's ability to bind Moco and its activity in the purified component assay. Crystal structures of the functionally characterized MoeA variants revealed no major structural changes, indicating that the functional differences observed are not due to disruption of the protein structure. On the basis of these results, two different functional areas were assigned to regions at or near the MoeA active site cleft.

  6. Mathematical modelling of the growth of human fetus anatomical structures.

    PubMed

    Dudek, Krzysztof; Kędzia, Wojciech; Kędzia, Emilia; Kędzia, Alicja; Derkowski, Wojciech

    2017-09-01

    The goal of this study was to present a procedure that enables mathematical analysis of the increase in linear sizes of human anatomical structures, estimation of mathematical model parameters, and evaluation of their adequacy. The section material consisted of 67 foetuses for the rectus abdominis muscle and 75 foetuses for the biceps femoris muscle. The following methods were incorporated into the study: preparation and anthropologic methods, digital image acquisition, measurements in the Image J computer system, and statistical analysis. We used an anthropologic method based on age determination using the crown-rump length, CRL (V-TUB), by Scammon and Calkins. The choice of mathematical function should be based on the real course of the curve presenting the growth of the anatomical structure's linear size y in subsequent weeks t of pregnancy. Size changes can be described with a segmental-linear model or a one-function model with accuracy adequate for clinical purposes. The size-age interdependence is described by many functions; however, the following are most often considered: linear, polynomial, spline, logarithmic, power, exponential, power-exponential, log-logistic I and II, Gompertz's I and II, and von Bertalanffy's function. With the use of the procedures described above, mathematical model parameters were assessed for the V-PL (total body length) and CRL body length increases, the rectus abdominis total length h and its segments hI, hII, hIII, hIV, as well as the biceps femoris length and width of the long head (LHL and LHW) and of the short head (SHL and SHW). The best fits to the measurement results were observed for the exponential and Gompertz models.
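    As an illustration of the curve-fitting step, the Python sketch below fits exponential and Gompertz growth functions to synthetic size-versus-age data and compares their R². The data points and parameter values are invented; the study's own measurements and full set of candidate functions are not reproduced.

```python
# Hedged sketch: fit exponential and Gompertz growth curves to synthetic size-vs-age data
# and compare R^2. The data are simulated, not foetal measurements.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
t = np.arange(12, 29)                                         # gestational age in weeks
y = 25.0 * np.exp(-4.0 * np.exp(-0.15 * t)) + rng.normal(0, 0.3, t.size)

def exponential(t, a, b):
    return a * np.exp(b * t)

def gompertz(t, A, b, k):
    return A * np.exp(-b * np.exp(-k * t))

def r_squared(obs, pred):
    return 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

for name, model, p0 in [("exponential", exponential, (5.0, 0.05)),
                        ("Gompertz", gompertz, (20.0, 5.0, 0.1))]:
    params, _ = curve_fit(model, t, y, p0=p0, maxfev=10000)
    print(f"{name}: params={np.round(params, 3)}, R2={r_squared(y, model(t, *params)):.4f}")
```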

  7. Stress analysis of advanced attack helicopter composite main rotor blade root end lug

    NASA Technical Reports Server (NTRS)

    Baker, D. J.

    1982-01-01

    Stress analysis of the Advanced Attack Helicopter (AAH) composite main rotor blade root end lug is described. The stress concentration factor determined from a finite element analysis is compared to an empirical value used in the lug design. The analysis and test data indicate that the stress concentration is primarily a function of configuration and independent of the range of material properties typical of Kevlar-49/epoxy and glass epoxy.

  8. Functional Connectivity Parcellation of the Human Thalamus by Independent Component Analysis.

    PubMed

    Zhang, Sheng; Li, Chiang-Shan R

    2017-11-01

    As a key structure to relay and integrate information, the thalamus supports multiple cognitive and affective functions through the connectivity between its subnuclei and cortical and subcortical regions. Although extant studies have largely described thalamic regional functions in anatomical terms, evidence accumulates to suggest a more complex picture of subareal activities and connectivities of the thalamus. In this study, we aimed to parcellate the thalamus and examine whole-brain connectivity of its functional clusters. With resting state functional magnetic resonance imaging data from 96 adults, we used independent component analysis (ICA) to parcellate the thalamus into 10 components. On the basis of the independence assumption, ICA helps to identify how subclusters overlap spatially. Whole-brain functional connectivity of each subdivision was computed from the independent component's time course (ICtc), a unique time series representing each IC. For comparison, we computed seed-region-based functional connectivity using the averaged time course across all voxels within a thalamic subdivision. The results showed that, at p < 10⁻⁶, corrected, 49% of voxels on average overlapped among subdivisions. Compared with seed-region analysis, ICtc analysis revealed patterns of connectivity that were more distinguished between thalamic clusters. ICtc analysis demonstrated thalamic connectivity to the primary motor cortex, which has eluded the analysis as well as previous studies based on averaged time series, and clarified thalamic connectivity to the hippocampus, caudate nucleus, and precuneus. The new findings elucidate functional organization of the thalamus and suggest that ICA clustering in combination with ICtc rather than seed-region analysis better distinguishes whole-brain connectivities among functional clusters of a brain region.
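    The sketch below (Python, scikit-learn) illustrates the spatial-ICA idea with random stand-in data: the region's time-by-voxel matrix is decomposed into 10 spatially independent components, the mixing matrix supplies each component's time course (ICtc), and one ICtc is correlated with whole-brain voxel time series. It is a schematic of the approach, not the study's processing pipeline.

```python
# Hedged sketch of spatial-ICA parcellation and ICtc-based connectivity; data are random.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(7)
n_time, n_voxels = 200, 500                     # e.g., voxels within a region mask
region_ts = rng.normal(size=(n_time, n_voxels)) # BOLD time series within the region

ica = FastICA(n_components=10, random_state=0, max_iter=1000)
spatial_maps = ica.fit_transform(region_ts.T)   # (n_voxels, 10): overlapping subdivisions
ictc = ica.mixing_                              # (n_time, 10): one time course per component

# Whole-brain connectivity of one component: correlate its ICtc with every brain voxel
brain_ts = rng.normal(size=(n_time, 2000))      # stand-in for whole-brain voxel time series
tc = (ictc[:, 0] - ictc[:, 0].mean()) / ictc[:, 0].std()
bz = (brain_ts - brain_ts.mean(axis=0)) / brain_ts.std(axis=0)
connectivity = (tc @ bz) / n_time               # Pearson r with each brain voxel

print("spatial maps:", spatial_maps.shape, "ICtc:", ictc.shape,
      "r range:", np.round(connectivity.min(), 3), np.round(connectivity.max(), 3))
```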

  9. Teleconsultation in school settings: linking classroom teachers and behavior analysts through web-based technology.

    PubMed

    Frieder, Jessica E; Peterson, Stephanie M; Woodward, Judy; Crane, Jaelee; Garner, Marlane

    2009-01-01

    This paper describes a technically driven, collaborative approach to assessing the function of problem behavior using web-based technology. A case example is provided to illustrate the process used in this pilot project. A school team conducted a functional analysis with a child who demonstrated challenging behaviors in a preschool setting. Behavior analysts at a university setting provided the school team with initial workshop trainings, on-site visits, e-mail and phone communication, as well as live web-based feedback on functional analysis sessions. The school personnel implemented the functional analysis with high fidelity and scored the data reliably. Outcomes of the project suggest that there is great potential for collaboration via the use of web-based technologies for ongoing assessment and development of effective interventions. However, an empirical evaluation of this model should be conducted before wide-scale adoption is recommended.

  10. Genetic interaction analysis of point mutations enables interrogation of gene function at a residue-level resolution

    PubMed Central

    Braberg, Hannes; Moehle, Erica A.; Shales, Michael; Guthrie, Christine; Krogan, Nevan J.

    2014-01-01

    We have achieved a residue-level resolution of genetic interaction mapping – a technique that measures how the function of one gene is affected by the alteration of a second gene – by analyzing point mutations. Here, we describe how to interpret point mutant genetic interactions, and outline key applications for the approach, including interrogation of protein interaction interfaces and active sites, and examination of post-translational modifications. Genetic interaction analysis has proven effective for characterizing cellular processes; however, to date, systematic high-throughput genetic interaction screens have relied on gene deletions or knockdowns, which limits the resolution of gene function analysis and poses problems for multifunctional genes. Our point mutant approach addresses these issues, and further provides a tool for in vivo structure-function analysis that complements traditional biophysical methods. We also discuss the potential for genetic interaction mapping of point mutations in human cells and its application to personalized medicine. PMID:24842270

  11. Analysis/forecast experiments with a flow-dependent correlation function using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Bloom, S. C.; Carus, H.; Nestler, M. S.

    1986-01-01

    The use of a flow-dependent correlation function to improve the accuracy of an optimum interpolation (OI) scheme is examined. The development of the correlation function for the OI analysis scheme used for numerical weather prediction is described. The scheme uses a multivariate surface analysis over the oceans to model the pressure-wind error cross-correlation and it has the ability to use an error correlation function that is flow- and geographically-dependent. A series of four-day data assimilation experiments, conducted from January 5-9, 1979, were used to investigate the effect of the different features of the OI scheme (error correlation) on forecast skill for the barotropic lows and highs. The skill of the OI was compared with that of a successive correlation method (SCM) of analysis. It is observed that the largest difference in the correlation statistics occurred in barotropic and baroclinic lows and highs. The comparison reveals that the OI forecasts were more accurate than the SCM forecasts.
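    A univariate toy version of the optimum interpolation update is sketched below in Python, using a Gaussian background-error correlation function on a 1-D grid; the flow and geography dependence described above would enter through this correlation model. Grid, observations, error variances, and length scale are all invented.

```python
# Minimal univariate optimum interpolation (OI) update on a 1-D grid with a Gaussian
# background-error correlation function. All numbers are illustrative.
import numpy as np

x_grid = np.linspace(0.0, 1000.0, 51)          # analysis grid (km)
xb = np.full(x_grid.size, 10.0)                # background field (first guess)

obs_loc = np.array([200.0, 550.0, 800.0])      # observation locations (km)
y_obs = np.array([12.0, 9.0, 11.5])            # observed values

L = 150.0                                      # correlation length scale (km)
sigma_b2, sigma_o2 = 1.0, 0.25                 # background and observation error variances

def corr(a, b):
    """Gaussian correlation between two sets of locations."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / L) ** 2)

B_go = sigma_b2 * corr(x_grid, obs_loc)        # grid-to-obs background covariances (B H^T)
B_oo = sigma_b2 * corr(obs_loc, obs_loc)       # obs-to-obs background covariances (H B H^T)
R = sigma_o2 * np.eye(obs_loc.size)            # observation error covariance

H_xb = np.full(obs_loc.size, 10.0)             # background interpolated to obs locations
innovation = y_obs - H_xb

# OI analysis: xa = xb + B H^T (H B H^T + R)^(-1) (y - H xb)
weights = np.linalg.solve(B_oo + R, innovation)
xa = xb + B_go @ weights

print("max analysis increment:", np.round(np.max(np.abs(xa - xb)), 3))
```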

  12. Parameter Estimation of Actuators for Benchmark Active Control Technology (BACT) Wind Tunnel Model with Analysis of Wear and Aerodynamic Loading Effects

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.; Fung, Jimmy

    1998-01-01

    This report describes the development of transfer function models for the trailing-edge and upper and lower spoiler actuators of the Benchmark Active Control Technology (BACT) wind tunnel model for application to control system analysis and design. A simple nonlinear least-squares parameter estimation approach is applied to determine transfer function parameters from frequency response data. Unconstrained quasi-Newton minimization of weighted frequency response error was employed to estimate the transfer function parameters. An analysis of the behavior of the actuators over time to assess the effects of wear and aerodynamic load by using the transfer function models is also presented. The frequency responses indicate consistent actuator behavior throughout the wind tunnel test and only slight degradation in effectiveness due to aerodynamic hinge loading. The resulting actuator models have been used in design, analysis, and simulation of controllers for the BACT to successfully suppress flutter over a wide range of conditions.
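    The estimation idea can be illustrated as below (Python): assume a second-order actuator transfer function, simulate a noisy frequency response, and recover the parameters by non-linear least squares on the complex response error. The model order, parameter values, and data are assumptions for the example, not BACT wind tunnel data.

```python
# Hedged sketch: fit the parameters of an assumed second-order transfer function to
# simulated frequency-response data by non-linear least squares.
import numpy as np
from scipy.optimize import least_squares

w = np.logspace(0, 2, 40)                                  # frequency grid (rad/s)

def tf_response(params, w):
    K, wn, zeta = params
    s = 1j * w
    return K * wn**2 / (s**2 + 2 * zeta * wn * s + wn**2)

true = (1.0, 30.0, 0.4)
measured = tf_response(true, w) * (1 + 0.02 * np.random.default_rng(2).normal(size=w.size))

def residuals(params):
    err = tf_response(params, w) - measured
    return np.concatenate([err.real, err.imag])            # weight real/imag errors equally

fit = least_squares(residuals, x0=[0.5, 20.0, 0.7])
print("estimated K, wn, zeta:", np.round(fit.x, 3))
```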

  13. Pyridylamination as a means of analyzing complex sugar chains

    PubMed Central

    Hase, Sumihiro

    2010-01-01

    Herein, I describe pyridylamination for versatile analysis of sugar chains. The reducing ends of the sugar chains are tagged with 2-aminopyridine and the resultant chemically stable fluorescent derivatives are used for structural/functional analysis. Pyridylamination is an effective “operating system” for increasing sensitivity and simplifying the analytical procedures including mass spectrometry and NMR. Excellent separation of isomers is achieved by reversed-phase HPLC. However, separation is further improved by two-dimensional HPLC, which involves a combination of reversed-phase HPLC and size-fractionation HPLC. Moreover, a two-dimensional HPLC map is also useful for structural analysis. I describe a simple procedure for preparing homogeneous pyridylamino sugar chains that is less laborious than existing techniques and can be used for functional analysis (e.g., sugar-protein interaction). This novel approach was applied and some of the results are described: i) a glucosyl-serine type sugar chain found in blood coagulation factors; ii) discovery of endo-β-mannosidase (EC 3.2.1.152) and a new type plant α1,2-l-fucosidase; and iii) novel substrate specificity of a cytosolic α-mannosidase. Moreover, using homogeneous sugar chains of a size similar to in vivo substrates we were able to analyze interactions between sugar chains and proteins such as enzymes and lectins in detail. Interestingly, our studies reveal that some enzymes recognize a wider region of the substrate than anticipated. PMID:20431262

  14. A FORTRAN program for the analysis of linear continuous and sample-data systems

    NASA Technical Reports Server (NTRS)

    Edwards, J. W.

    1976-01-01

    A FORTRAN digital computer program which performs the general analysis of linearized control systems is described. State variable techniques are used to analyze continuous, discrete, and sampled data systems. Analysis options include the calculation of system eigenvalues, transfer functions, root loci, root contours, frequency responses, power spectra, and transient responses for open- and closed-loop systems. A flexible data input format allows the user to define systems in a variety of representations. Data may be entered by inputting explicit data matrices or matrices constructed in user-written subroutines, by specifying transfer function block diagrams, or by using a combination of these methods.

  15. The Multivariate Temporal Response Function (mTRF) Toolbox: A MATLAB Toolbox for Relating Neural Signals to Continuous Stimuli.

    PubMed

    Crosse, Michael J; Di Liberto, Giovanni M; Bednar, Adam; Lalor, Edmund C

    2016-01-01

    Understanding how brains process sensory signals in natural environments is one of the key goals of twenty-first century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli, and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter, often referred to as a temporal response function, that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application.
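    A minimal Python sketch of the regularized-linear-regression (ridge) estimate of a temporal response function follows: a time-lagged design matrix is built from a stimulus feature and the TRF is obtained in closed form. The stimulus, response, lag range, and ridge parameter are invented; the toolbox itself is not reproduced here.

```python
# Minimal ridge-regression estimate of a temporal response function from simulated data.
import numpy as np

rng = np.random.default_rng(3)
n = 2000
stim = rng.normal(size=n)                  # stimulus feature (e.g., a speech envelope)

true_trf = np.exp(-np.arange(30) / 10.0) * np.sin(np.arange(30) / 3.0)
resp = np.convolve(stim, true_trf)[:n] + 0.5 * rng.normal(size=n)   # simulated neural response

def lagged_matrix(x, n_lags):
    """Design matrix whose columns are x delayed by 0..n_lags-1 samples."""
    X = np.zeros((x.size, n_lags))
    for k in range(n_lags):
        X[k:, k] = x[: x.size - k]
    return X

X = lagged_matrix(stim, 30)
lam = 10.0                                 # ridge parameter (would normally be cross-validated)
trf = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ resp)

print("correlation with true TRF:", np.round(np.corrcoef(trf, true_trf)[0, 1], 3))
```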

  16. Fast function-on-scalar regression with penalized basis expansions.

    PubMed

    Reiss, Philip T; Huang, Lei; Mennes, Maarten

    2010-01-01

    Regression models for functional responses and scalar predictors are often fitted by means of basis functions, with quadratic roughness penalties applied to avoid overfitting. The fitting approach described by Ramsay and Silverman in the 1990s amounts to a penalized ordinary least squares (P-OLS) estimator of the coefficient functions. We recast this estimator as a generalized ridge regression estimator, and present a penalized generalized least squares (P-GLS) alternative. We describe algorithms by which both estimators can be implemented, with automatic selection of optimal smoothing parameters, in a more computationally efficient manner than has heretofore been available. We discuss pointwise confidence intervals for the coefficient functions, simultaneous inference by permutation tests, and model selection, including a novel notion of pointwise model selection. P-OLS and P-GLS are compared in a simulation study. Our methods are illustrated with an analysis of age effects in a functional magnetic resonance imaging data set, as well as a reanalysis of a now-classic Canadian weather data set. An R package implementing the methods is publicly available.

  17. Application of abstract harmonic analysis to the high-speed recognition of images

    NASA Technical Reports Server (NTRS)

    Usikov, D. A.

    1979-01-01

    Methods are constructed for rapidly computing correlation functions using the theory of abstract harmonic analysis. The theory developed includes as a particular case the familiar Fourier transform method for a correlation function which makes it possible to find images which are independent of their translation in the plane. Two examples of the application of the general theory described are the search for images, independent of their rotation and scale, and the search for images which are independent of their translations and rotations in the plane.
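    The particular Fourier case mentioned above can be sketched as follows (Python): the cross-correlation of an image with a shifted copy is computed via the 2-D FFT, and the correlation peak recovers the translation. The images are synthetic, and rotation- and scale-invariant matching would require the further transforms discussed in the article.

```python
# Sketch of translation-invariant matching: cross-correlation via the 2-D Fourier transform,
# so the peak location gives the shift. Images are synthetic.
import numpy as np

rng = np.random.default_rng(4)
image = rng.normal(size=(64, 64))
template = np.roll(np.roll(image, 10, axis=0), 5, axis=1)   # same image shifted by (10, 5)

# Correlation theorem: corr = IFFT( FFT(image) * conj(FFT(template)) )
corr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(template))).real
shift = np.unravel_index(np.argmax(corr), corr.shape)

print("recovered shift:", shift)   # expected (64-10, 64-5) modulo the array size, i.e. (54, 59)
```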

  18. FunShift: a database of function shift analysis on protein subfamilies

    PubMed Central

    Abhiman, Saraswathi; Sonnhammer, Erik L. L.

    2005-01-01

    Members of a protein family normally have a general biochemical function in common, but frequently one or more subgroups have evolved a slightly different function, such as different substrate specificity. It is important to detect such function shifts for a more accurate functional annotation. The FunShift database described here is a compilation of function shift analysis performed between subfamilies in protein families. It consists of two main components: (i) subfamilies derived from protein domain families and (ii) pairwise subfamily comparisons analyzed for function shift. The present release, FunShift 12, was derived from Pfam 12 and consists of 151 934 subfamilies derived from 7300 families. We carried out function shift analysis by two complementary methods on families with up to 500 members. From a total of 179 210 subfamily pairs, 62 384 were predicted to be functionally shifted in 2881 families. Each subfamily pair is provided with a markup of probable functional specificity-determining sites. Tools for searching and exploring the data are provided to make this database a valuable resource for protein function annotation. Knowledge of these functionally important sites will be useful for experimental biologists performing functional mutation studies. FunShift is available at http://FunShift.cgb.ki.se. PMID:15608176

  19. Mandarin Chinese Function Catalog and Rolebook. Method for Determining Language Objectives and Criteria, Volume IX.

    ERIC Educational Resources Information Center

    Setzler, Hubert H., Jr.; And Others

    A Mandarin Chinese Function Catalog and Instructor Rolebook for Mandarin Chinese are presented. The catalog and rolebook are part of the communication/language objectives-based system (C/LOBS), which supports the front-end analysis efforts of the Defense Language Institute Foreign Language Center. The C/LOBS project, which is described in 13…

  20. The Association of Health and Functional Status with Private and Public Religious Practice among Rural, Ethnically Diverse, Older Adults with Diabetes

    ERIC Educational Resources Information Center

    Arcury, Thomas A.; Stafford, Jeanette M.; Bell, Ronny A.; Golden, Shannon L.; Snively, Beverly M.; Quandt, Sara A.

    2007-01-01

    Purpose: This analysis describes the association of health and functional status with private and public religious practice among ethnically diverse (African American, Native American, white) rural older adults with diabetes. Methods: Data were collected using a population-based, cross-sectional, stratified, random sample survey of 701…

  1. Building the ECON extension: Functionality and lessons learned

    Treesearch

    Fred C. Martin

    2008-01-01

    The functionality of the ECON extension to FVS is described with emphasis on the ability to dynamically interact with all elements of the FVS simulation process. Like other extensions, ECON is fully integrated within FVS. This integration allows: (1) analysis of multiple alternative tree-removal actions within a single simulation without altering “normal” stand...

  2. The Effects of Training and Performance Feedback during Behavioral Consultation on General Education Middle School Teachers' Integrity to Functional Analysis Procedures

    ERIC Educational Resources Information Center

    McKenney, Elizabeth L. W.; Waldron, Nancy; Conroy, Maureen

    2013-01-01

    This study describes the integrity with which 3 general education middle school teachers implemented functional analyses (FA) of appropriate behavior for students who typically engaged in disruption. A 4-step model consistent with behavioral consultation was used to support the assessment process. All analyses were conducted during ongoing…

  3. An exponential decay model for mediation.

    PubMed

    Fritz, Matthew S

    2014-10-01

    Mediation analysis is often used to investigate mechanisms of change in prevention research. Results finding mediation are strengthened when longitudinal data are used because of the need for temporal precedence. Current longitudinal mediation models have focused mainly on linear change, but many variables in prevention change nonlinearly across time. The most common solution to nonlinearity is to add a quadratic term to the linear model, but this can lead to the use of the quadratic function to explain all nonlinearity, regardless of theory and the characteristics of the variables in the model. The current study describes the problems that arise when quadratic functions are used to describe all nonlinearity and how the use of nonlinear functions, such as exponential decay, addresses many of these problems. In addition, nonlinear models provide several advantages over polynomial models including usefulness of parameters, parsimony, and generalizability. The effects of using nonlinear functions for mediation analysis are then discussed and a nonlinear growth curve model for mediation is presented. An empirical example using data from a randomized intervention study is then provided to illustrate the estimation and interpretation of the model. Implications, limitations, and future directions are also discussed.
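    For concreteness, the Python sketch below fits the kind of exponential-decay trajectory used in such models, y(t) = asymptote + (start - asymptote)·exp(-rate·t), to one synthetic series of repeated measures. The full mediation structure (predictor → decay parameters → outcome) is not reproduced, and all values are invented.

```python
# Minimal sketch of fitting an exponential-decay trajectory to synthetic repeated measures.
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(0, 8)                                        # measurement occasions
y = 2.0 + 6.0 * np.exp(-0.6 * t) + np.random.default_rng(5).normal(0, 0.2, t.size)

def exp_decay(t, asymptote, start, rate):
    return asymptote + (start - asymptote) * np.exp(-rate * t)

params, _ = curve_fit(exp_decay, t, y, p0=(1.0, 8.0, 0.5))
print("asymptote, start, rate:", np.round(params, 3))
```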

  4. An Exponential Decay Model for Mediation

    PubMed Central

    Fritz, Matthew S.

    2013-01-01

    Mediation analysis is often used to investigate mechanisms of change in prevention research. Results finding mediation are strengthened when longitudinal data are used because of the need for temporal precedence. Current longitudinal mediation models have focused mainly on linear change, but many variables in prevention change nonlinearly across time. The most common solution to nonlinearity is to add a quadratic term to the linear model, but this can lead to the use of the quadratic function to explain all nonlinearity, regardless of theory and the characteristics of the variables in the model. The current study describes the problems that arise when quadratic functions are used to describe all nonlinearity and how the use of nonlinear functions, such as exponential decay, addresses many of these problems. In addition, nonlinear models provide several advantages over polynomial models including usefulness of parameters, parsimony, and generalizability. The effects of using nonlinear functions for mediation analysis are then discussed and a nonlinear growth curve model for mediation is presented. An empirical example using data from a randomized intervention study is then provided to illustrate the estimation and interpretation of the model. Implications, limitations, and future directions are also discussed. PMID:23625557

  5. Physical and cognitive effort discounting across different reward magnitudes: Tests of discounting models

    PubMed Central

    Ostaszewski, Paweł

    2017-01-01

    The effort required to obtain a rewarding outcome is an important factor in decision-making. Describing how rewards are devalued by increasing effort intensity is essential to understanding human preferences, because every action and choice that we make is in itself effortful. To investigate how reward valuation is affected by physical and cognitive effort, we compared mathematical discounting functions derived from research on discounting. Seven discounting models were tested across three different reward magnitudes. To test the models, data were collected from a total of 114 participants recruited from the general population. For the one-parameter models (hyperbolic, exponential, and parabolic), the data were explained best by the exponential model, as measured by the percentage of explained variance. However, after introducing an additional parameter, data obtained in the cognitive and physical effort conditions were best described by the power function model. Further analysis, using the second-order Akaike and Bayesian Information Criteria, which account for model complexity, allowed us to identify the best model among all those tested. We found that the power function best described the data, which corresponds to conventional analyses based on the R2 measure. This supports the conclusion that the function best describing reward devaluation by physical and cognitive effort is concave and differs from those that describe delay or probability discounting. In addition, consistent magnitude effects were observed that correspond to those in delay discounting research. PMID:28759631
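    The model-comparison logic can be illustrated with the Python sketch below, which fits three candidate discounting forms (hyperbolic, exponential, and a two-parameter power-type function) to invented subjective-value data and compares them by AIC. The exact functional forms and data are assumptions for illustration and need not match the seven models tested in the study.

```python
# Hedged sketch: compare candidate effort-discounting functions by AIC. The functional
# forms are common choices from the discounting literature; the data are invented.
import numpy as np
from scipy.optimize import curve_fit

effort = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])        # effort intensity (arbitrary units)
value = np.array([100.0, 88.0, 79.0, 65.0, 48.0, 30.0])   # subjective value of a 100-unit reward

A = 100.0  # nominal reward magnitude

models = {
    "hyperbolic":  (lambda E, k:    A / (1.0 + k * E),          (0.1,)),
    "exponential": (lambda E, k:    A * np.exp(-k * E),         (0.1,)),
    "power (2p)":  (lambda E, k, s: A * (1.0 + k * E) ** (-s),  (0.1, 1.0)),
}

for name, (f, p0) in models.items():
    params, _ = curve_fit(f, effort, value, p0=p0, bounds=(0, np.inf))
    rss = np.sum((value - f(effort, *params)) ** 2)
    n, k_par = value.size, len(params)
    aic = n * np.log(rss / n) + 2 * k_par                   # AIC up to an additive constant
    print(f"{name:12s} params={np.round(params, 4)} AIC={aic:.2f}")
```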

  6. New techniques for positron emission tomography in the study of human neurological disorders: Progress report, December 15, 1987-June 14, 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhl, D.E.

    1988-02-01

    A brief progress report is presented describing the preparation and animal testing of ¹¹C scopolamine and ¹⁸F fluoride. Additional studies entitled "Automated Arterial Blood Sampling System for PET," "Rapid Data Analysis Schemes for Functional Imaging in PET," and "Tracer Kinetic Modeling in PET Measures of Cholinergic Receptors" are described.

  7. Radio-science performance analysis software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
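    The frequency-stability measure mentioned above can be sketched as follows (Python): a non-overlapping Allan deviation computed from simulated fractional-frequency residuals, σ_y(τ) = sqrt(0.5·mean((ȳ_{k+1} − ȳ_k)²)), with ȳ_k averaged over intervals of length τ. The noise model and sampling interval are illustrative; this is not the STBLTY implementation.

```python
# Minimal non-overlapping Allan deviation on simulated fractional-frequency residuals.
import numpy as np

rng = np.random.default_rng(6)
tau0 = 1.0                                   # basic sampling interval (s)
y = 1e-12 * rng.normal(size=100_000)         # fractional frequency residuals (white FM noise)

def allan_deviation(y, m):
    """Non-overlapping Allan deviation for averaging factor m (tau = m * tau0)."""
    n_bins = y.size // m
    ybar = y[: n_bins * m].reshape(n_bins, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

for m in (1, 10, 100, 1000):
    print(f"tau = {m * tau0:7.1f} s  sigma_y = {allan_deviation(y, m):.3e}")
```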

  8. Radio-Science Performance Analysis Software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1994-10-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussion on operating the program set on Galileo and Ulysses data will be presented.

  9. Radio-science performance analysis software

    NASA Technical Reports Server (NTRS)

    Morabito, D. D.; Asmar, S. W.

    1995-01-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.

  10. Static Analysis Using Abstract Interpretation

    NASA Technical Reports Server (NTRS)

    Arthaud, Maxime

    2017-01-01

    Short presentation about static analysis and most particularly abstract interpretation. It starts with a brief explanation on why static analysis is used at NASA. Then, it describes the IKOS (Inference Kernel for Open Static Analyzers) tool chain. Results on NASA projects are shown. Several well known algorithms from the static analysis literature are then explained (such as pointer analyses, memory analyses, weak relational abstract domains, function summarization, etc.). It ends with interesting problems we encountered (such as C++ analysis with exception handling, or the detection of integer overflow).

  11. Digital Avionics Information System (DAIS): Training Requirements Analysis Model Users Guide. Final Report.

    ERIC Educational Resources Information Center

    Czuchry, Andrew J.; And Others

    This user's guide describes the functions, logical operations and subroutines, input data requirements, and available outputs of the Training Requirements Analysis Model (TRAMOD), a computerized analytical life cycle cost modeling system for use in the early stages of system design. Operable in a stand-alone mode, TRAMOD can be used for the…

  12. A Review of PROC IRT in SAS

    ERIC Educational Resources Information Center

    Choi, Jinnie

    2017-01-01

    This article reviews PROC IRT, which was added to Statistical Analysis Software in 2014. We provide an introductory overview of a free version of SAS, describe what PROC IRT offers for item response theory (IRT) analysis and how one can use PROC IRT, and discuss how other SAS macros and procedures may compensate the IRT functionalities of PROC IRT.

  13. Analysis of Some Potential Manpower Policies for the All-Volunteer Navy. Final Report.

    ERIC Educational Resources Information Center

    Battelle, R. Bard; And Others

    This report describes an analysis of Navy personnel as a subsystem of the Navy, functioning with the overall objective of maintaining Fleet readiness within the constraints of budget and manpower supply limitations. Manpower utilization and management techniques and options were examined and evaluated for their usefulness to an all volunteer Navy…

  14. Sensitivity Analysis of Mixed Models for Incomplete Longitudinal Data

    ERIC Educational Resources Information Center

    Xu, Shu; Blozis, Shelley A.

    2011-01-01

    Mixed models are used for the analysis of data measured over time to study population-level change and individual differences in change characteristics. Linear and nonlinear functions may be used to describe a longitudinal response, individuals need not be observed at the same time points, and missing data, assumed to be missing at random (MAR),…

  15. PlanetServer: Innovative approaches for the online analysis of hyperspectral satellite data from Mars

    NASA Astrophysics Data System (ADS)

    Oosthoek, J. H. P.; Flahaut, J.; Rossi, A. P.; Baumann, P.; Misev, D.; Campalani, P.; Unnithan, V.

    2014-06-01

    PlanetServer is a WebGIS system, currently under development, enabling the online analysis of Compact Reconnaissance Imaging Spectrometer (CRISM) hyperspectral data from Mars. It is part of the EarthServer project which builds infrastructure for online access and analysis of huge Earth Science datasets. Core functionality consists of the rasdaman Array Database Management System (DBMS) for storage, and the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) for data querying. Various WCPS queries have been designed to access spatial and spectral subsets of the CRISM data. The client WebGIS, consisting mainly of the OpenLayers javascript library, uses these queries to enable online spatial and spectral analysis. Currently the PlanetServer demonstration consists of two CRISM Full Resolution Target (FRT) observations, surrounding the NASA Curiosity rover landing site. A detailed analysis of one of these observations is performed in the Case Study section. The current PlanetServer functionality is described step by step, and is tested by focusing on detecting mineralogical evidence described in earlier Gale crater studies. Both the PlanetServer methodology and its possible use for mineralogical studies will be further discussed. Future work includes batch ingestion of CRISM data and further development of the WebGIS and analysis tools.

  16. Neural correlates of the natural observation of an emotionally loaded video

    PubMed Central

    Gonzalez-Santos, Leopoldo

    2018-01-01

    Studies based on a paradigm of free or natural viewing have revealed characteristics that allow us to know how the brain processes stimuli within a natural environment. This method has been little used to study brain function. With a connectivity approach, we examine the processing of emotions using an exploratory method to analyze functional magnetic resonance imaging (fMRI) data. This research describes our approach to modeling stress paradigms suitable for neuroimaging environments. We showed a short film (4.54 minutes) with high negative emotional valence and high arousal content to 24 healthy male subjects (36.42 years old; SD = 12.14) during fMRI. Independent component analysis (ICA) was used to identify networks based on spatial statistical independence. Through this analysis we identified the sensorimotor system and its influence on the dorsal attention and default-mode networks, which in turn have reciprocal activity and modulate networks described as emotional. PMID:29883494

  17. Overview of computational control research at UT Austin

    NASA Technical Reports Server (NTRS)

    Bong, Wie

    1989-01-01

    An overview of current research activities at UT Austin is presented to discuss certain technical issues in the following areas: (1) Computer-Aided Nonlinear Control Design: In this project, the describing function method is employed for the nonlinear control analysis and design of a flexible spacecraft equipped with pulse-modulated reaction jets. The INCA program has been enhanced to allow the numerical calculation of describing functions as well as nonlinear limit cycle analysis in the frequency domain; (2) Robust Linear Quadratic Gaussian (LQG) Compensator Synthesis: Robust control design techniques and software tools are developed for flexible space structures with parameter uncertainty. In particular, an interactive, robust multivariable control design capability is being developed for the INCA program; and (3) LQR-Based Autonomous Control System for the Space Station: In this project, real-time implementation of an LQR-based autonomous control system is investigated for the space station with time-varying inertias and with significant multibody dynamic interactions.
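    Because the describing function of a hard nonlinearity such as an on-off jet is central here, the Python sketch below computes it numerically for an ideal relay by taking the first Fourier harmonic of the output for a sinusoidal input, and checks it against the analytic result N(A) = 4M/(πA). The relay level and amplitudes are illustrative; practical jet models add deadband and hysteresis, which are not included.

```python
# Numerical describing function of an ideal relay, compared with the analytic 4M/(pi*A).
import numpy as np

M = 1.0                                        # relay (thruster) output level
theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)

def describing_function(A):
    """First-harmonic gain of the relay for a sinusoidal input A*sin(theta)."""
    u = M * np.sign(A * np.sin(theta))                         # nonlinearity output
    b1 = (2.0 / theta.size) * np.sum(u * np.sin(theta))        # first Fourier sine coefficient
    a1 = (2.0 / theta.size) * np.sum(u * np.cos(theta))        # first Fourier cosine coefficient
    return complex(b1, a1) / A                                 # complex describing function

for A in (0.5, 1.0, 2.0, 5.0):
    N_num = describing_function(A)
    N_ana = 4.0 * M / (np.pi * A)
    print(f"A={A:4.1f}  numeric N={N_num.real:.4f}{N_num.imag:+.1e}j  analytic {N_ana:.4f}")
```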

  18. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  19. Respiratory protective device design using control system techniques

    NASA Technical Reports Server (NTRS)

    Burgess, W. A.; Yankovich, D.

    1972-01-01

    The feasibility of a control system analysis approach to provide a design base for respiratory protective devices (RPDs) is considered. A system design approach requires that all functions and components of the system be mathematically identified in a model of the RPD. The mathematical notations describe the operation of the components as closely as possible. The individual component mathematical descriptions are then combined to describe the complete RPD. Finally, analysis of the mathematical notation by control system theory is used to derive compensating component values that force the system to operate in a stable and predictable manner.

  20. Taking advantage of local structure descriptors to analyze interresidue contacts in protein structures and protein complexes.

    PubMed

    Martin, Juliette; Regad, Leslie; Etchebest, Catherine; Camproux, Anne-Claude

    2008-11-15

    Interresidue protein contacts in protein structures and at protein-protein interfaces are classically described by the amino acid types of the interacting residues, and the local structural context of the contact, if any, is described using secondary structures. In this study, we present an alternative analysis of interresidue contacts using local structures defined by the structural alphabet introduced by Camproux et al. This structural alphabet allows a 3D structure to be described as a sequence of prototype fragments called structural letters, of 27 different types. Each residue can then be assigned to a particular local structure, even in loop regions. The analysis of interresidue contacts within protein structures defined using Voronoï tessellations reveals that pairwise contact specificity is greater in terms of structural letters than amino acids. Using a simple heuristic based on specificity score comparison, we find that 74% of the long-range contacts within protein structures are better described using structural letters than amino acid types. The investigation is extended to a set of protein-protein complexes, showing that similar global rules apply as for intraprotein contacts, with 64% of the interprotein contacts best described by local structures. We then present an evaluation of pairing functions that integrate structural letters into decoy scoring and show that some complexes could benefit from the use of structural letter-based pairing functions.

  1. The conical scanner evaluation system design

    NASA Technical Reports Server (NTRS)

    Cumella, K. E.; Bilanow, S.; Kulikov, I. B.

    1982-01-01

    The software design for the conical scanner evaluation system is presented. The purpose of this system is to support the performance analysis of the LANDSAT-D conical scanners, which are infrared horizon detection attitude sensors designed for improved accuracy. The system consists of six functionally independent subsystems and five interface data bases. The system structure and interfaces of each of the subsystems is described and the content, format, and file structure of each of the data bases is specified. For each subsystem, the functional logic, the control parameters, the baseline structure, and each of the subroutines are described. The subroutine descriptions include a procedure definition and the input and output parameters.

  2. Development of a grid-independent approximate Riemann solver. Ph.D. Thesis - Michigan Univ.

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher Lockwood

    1991-01-01

    A grid-independent approximate Riemann solver for use with the Euler and Navier-Stokes equations was introduced and explored. The two-dimensional Euler and Navier-Stokes equations are described in Cartesian and generalized coordinates, as well as the traveling wave form of the Euler equations. The spatial and temporal discretization are described for both explicit and implicit time-marching schemes. The grid-aligned flux function of Roe is outlined, while the 5-wave grid-independent flux function is derived. The stability and monotonicity analysis of the 5-wave model are presented. Two-dimensional results are provided and extended to three dimensions. The corresponding results are presented.

  3. Left ventricular volume analysis as a basic tool to describe cardiac function.

    PubMed

    Kerkhof, Peter L M; Kuznetsova, Tatiana; Ali, Rania; Handly, Neal

    2018-03-01

    The heart is often regarded as a compression pump. Therefore, determination of pressure and volume is essential for cardiac function analysis. Traditionally, ventricular performance was described in terms of the Starling curve, i.e., output related to input. This view is based on two variables (namely, stroke volume and end-diastolic volume), often studied in the isolated (i.e., denervated) heart, and has dominated the interpretation of cardiac mechanics over the last century. The ratio of the prevailing coordinates within that paradigm is termed ejection fraction (EF), which is the popular metric routinely used in the clinic. Here we present an insightful alternative approach while describing volume regulation by relating end-systolic volume (ESV) to end-diastolic volume. This route obviates the undesired use of metrics derived from differences or ratios, as employed in previous models. We illustrate basic principles concerning ventricular volume regulation by data obtained from intact animal experiments and collected in healthy humans. Special attention is given to sex-specific differences. The method can be applied to the dynamics of a single heart and to an ensemble of individuals. Group analysis allows for stratification regarding sex, age, medication, and additional clinically relevant covariates. A straightforward procedure derives the relationship between EF and ESV and describes myocardial oxygen consumption in terms of ESV. This representation enhances insight and reduces the impact of the metric EF, in favor of the end-systolic elastance concept advanced 4 decades ago.
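
    For reference, the volumes involved are linked by a simple identity, EF = (EDV - ESV)/EDV = 1 - ESV/EDV, so any regression of ESV on EDV implies a corresponding EF-ESV relationship. A small sketch with purely illustrative numbers:

        # Ejection fraction from end-diastolic and end-systolic volumes (mL),
        # plus a simple linear fit of ESV on EDV for a group. Numbers are
        # illustrative only, not data from the study.
        import numpy as np

        def ejection_fraction(edv, esv):
            return (edv - esv) / edv          # equivalently 1 - esv/edv

        edv = np.array([110.0, 125.0, 140.0, 160.0, 180.0])
        esv = np.array([ 40.0,  48.0,  58.0,  72.0,  88.0])

        slope, intercept = np.polyfit(edv, esv, 1)   # ESV ~ slope*EDV + intercept
        print("EF:", np.round(ejection_fraction(edv, esv), 2))
        print(f"ESV ~ {slope:.2f}*EDV + {intercept:.1f}")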

  4. Piezoelectric Actuator Modeling Using MSC/NASTRAN and MATLAB

    NASA Technical Reports Server (NTRS)

    Reaves, Mercedes C.; Horta, Lucas G.

    2003-01-01

    This paper presents a procedure for modeling structures containing piezoelectric actuators using MSC/NASTRAN and MATLAB. The paper describes the utility and functionality of one set of validated modeling tools. The tools described herein use MSC/NASTRAN to model the structure with piezoelectric actuators and a thermally induced strain to model straining of the actuators due to an applied voltage field. MATLAB scripts are used to assemble the dynamic equations and to generate frequency response functions. The application of these tools is discussed using a cantilever aluminum beam with a surface mounted piezoelectric actuator as a sample problem. Software in the form of MSC/NASTRAN DMAP input commands, MATLAB scripts, and a step-by-step procedure to solve the example problem are provided. Analysis results are generated in terms of frequency response functions from deflection and strain data as a function of input voltage to the actuator.

  5. [From nosology to functionality: the current discussion of psychopathology in the psychotherapy of adolescents].

    PubMed

    Resch, Franz; Parzer, Peter

    2014-01-01

    Starting from the question of whether psychopathology is relevant to psychotherapy, different aspects of symptoms are described: the causal model views symptoms as a necessary consequence of specific preconditions, while the functional model views symptoms as behaviors aiming at the achievement of internal motives and desires. An application of the cybernetic control model to psychological processes is formulated as a theory of the control of perception. Often the goals of behavior are represented by verbal rules; such inner verbalizations are described in detail. An attempt at a functional symptom analysis is made, according to the theory of the control of perception, in a clinical context. The psychotherapeutic attempt to modify dysfunctional goals in patients aims to decouple emotional experiences from verbalized inner rules. Functional psychopathology may be useful in cases of treatment failure after conventional therapies, as well as in multimodal interventions (combinations of pharmacotherapy and psychotherapy).

  6. A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models.

    PubMed

    Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S; Wu, Xiaowei; Müller, Rolf

    2018-01-01

    Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design.

  7. A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models

    PubMed Central

    Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S.; Wu, Xiaowei; Müller, Rolf

    2017-01-01

    Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design. PMID:29749977

  8. DIFFERENTIAL CROSS SECTION ANALYSIS IN KAON PHOTOPRODUCTION USING ASSOCIATED LEGENDRE POLYNOMIALS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. T. P. HUTAURUK, D. G. IRELAND, G. ROSNER

    2009-04-01

    Angular distributions of differential cross sections from the latest CLAS data sets [6] for the reaction γ + p → K+ + Λ have been analyzed using associated Legendre polynomials. This analysis is based upon theoretical calculations in Ref. 1 where all sixteen observables in kaon photoproduction can be classified into four Legendre classes. Each observable can be described by an expansion of associated Legendre polynomial functions. One of the questions to be addressed is how many associated Legendre polynomials are required to describe the data. In this preliminary analysis, we used data models with different numbers of associated Legendre polynomials. We then compared these models by calculating posterior probabilities of the models. We found that the CLAS data set needs no more than four associated Legendre polynomials to describe the differential cross section data. In addition, we also show the extracted coefficients of the best model.
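
    A rough sketch of the fitting idea: expand the angular distribution in Legendre polynomials of increasing maximum order and compare models with an information criterion. This simplified version uses ordinary Legendre polynomials, synthetic data, and the BIC as a crude stand-in for the Bayesian posterior-probability comparison used in the actual analysis.

        # Fit synthetic angular-distribution data with Legendre expansions of
        # increasing order and compare models with the BIC. Data are synthetic.
        import numpy as np
        from numpy.polynomial import legendre as L

        rng = np.random.default_rng(1)
        cos_theta = np.linspace(-0.95, 0.95, 40)
        true_coeffs = [1.0, 0.4, -0.3]                 # arbitrary "true" expansion
        sigma = 0.02
        y = L.legval(cos_theta, true_coeffs) + rng.normal(0.0, sigma, cos_theta.size)

        for max_order in range(1, 7):
            coeffs = L.legfit(cos_theta, y, max_order)
            resid = y - L.legval(cos_theta, coeffs)
            chi2 = np.sum((resid / sigma) ** 2)
            k = max_order + 1                          # number of fitted coefficients
            bic = chi2 + k * np.log(cos_theta.size)
            print(f"order {max_order}: chi2={chi2:.1f}, BIC={bic:.1f}")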

  9. The most common technologies and tools for functional genome analysis.

    PubMed

    Gasperskaja, Evelina; Kučinskas, Vaidutis

    2017-01-01

    Since the sequence of the human genome is complete, the main issue is how to understand the information written in the DNA sequence. Despite numerous genome-wide studies that have already been performed, the challenge of determining the function of genes and gene products, and also their interactions, remains open. As changes in the human genome are highly likely to cause pathological conditions, functional analysis is vitally important for human health. For many years there have been a variety of technologies and tools used in functional genome analysis. However, only in the past decade has there been rapid, revolutionary progress and improvement in high-throughput methods, which range from traditional real-time polymerase chain reaction to more complex systems, such as next-generation sequencing or mass spectrometry. Furthermore, not only laboratory investigation, but also accurate bioinformatic analysis is required for reliable scientific results. These methods give an opportunity for accurate and comprehensive functional analysis that involves various fields of study: genomics, epigenomics, proteomics, and interactomics. This is essential for filling the gaps in the knowledge about dynamic biological processes at both the cellular and organismal levels. However, each method has both advantages and limitations that should be taken into account before choosing the right method for a particular research question in order to ensure a successful study. For this reason, the present review paper aims to describe the most frequently and widely used methods for comprehensive functional analysis.

  10. The heptanucleotide motif GAGACGC is a key component of a cis-acting promoter element that is critical for SnSAG1 expression in Sarcocystis neurona.

    PubMed

    Gaji, Rajshekhar Y; Howe, Daniel K

    2009-07-01

    The apicomplexan parasite Sarcocystis neurona undergoes a complex process of intracellular development, during which many genes are temporally regulated. The described study was undertaken to begin identifying the basic promoter elements that control gene expression in S. neurona. Sequence analysis of the 5'-flanking region of five S. neurona genes revealed a conserved heptanucleotide motif GAGACGC that is similar to the WGAGACG motif described upstream of multiple genes in Toxoplasma gondii. The promoter region for the major surface antigen gene SnSAG1, which contains three heptanucleotide motifs within 135 bases of the transcription start site, was dissected by functional analysis using a dual luciferase reporter assay. These analyses revealed that a minimal promoter fragment containing all three motifs was sufficient to drive reporter molecule expression, with the presence and orientation of the 5'-most heptanucleotide motif being absolutely critical for promoter function. Further studies should help to identify additional sequence elements important for promoter function and for controlling gene expression during intracellular development by this apicomplexan pathogen.
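
    As a trivial illustration of the motif-scanning step, the snippet below counts occurrences of the GAGACGC heptanucleotide on both strands of a 5'-flanking sequence; the example sequence is invented and is not the SnSAG1 promoter.

        # Count GAGACGC heptanucleotide motifs on both strands of a promoter
        # sequence. The example sequence is invented for illustration.
        MOTIF = "GAGACGC"
        COMPLEMENT = str.maketrans("ACGT", "TGCA")

        def find_motif(seq, motif=MOTIF):
            seq = seq.upper()
            rc = seq.translate(COMPLEMENT)[::-1]       # reverse complement
            width = len(motif)
            fwd = [i for i in range(len(seq) - width + 1) if seq[i:i + width] == motif]
            rev = [i for i in range(len(rc) - width + 1) if rc[i:i + width] == motif]
            return fwd, rev

        promoter = "TTGAGACGCAATTCCGAGACGCTTAAGCGTCTCAA"  # made-up 5'-flanking sequence
        forward_hits, reverse_hits = find_motif(promoter)
        print(forward_hits, reverse_hits)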

  11. Functional Analysis With a Barcoder Yeast Gene Overexpression System

    PubMed Central

    Douglas, Alison C.; Smith, Andrew M.; Sharifpoor, Sara; Yan, Zhun; Durbic, Tanja; Heisler, Lawrence E.; Lee, Anna Y.; Ryan, Owen; Göttert, Hendrikje; Surendra, Anu; van Dyk, Dewald; Giaever, Guri; Boone, Charles; Nislow, Corey; Andrews, Brenda J.

    2012-01-01

    Systematic analysis of gene overexpression phenotypes provides an insight into gene function, enzyme targets, and biological pathways. Here, we describe a novel functional genomics platform that enables a highly parallel and systematic assessment of overexpression phenotypes in pooled cultures. First, we constructed a genome-level collection of ~5100 yeast barcoder strains, each of which carries a unique barcode, enabling pooled fitness assays with a barcode microarray or sequencing readout. Second, we constructed a yeast open reading frame (ORF) galactose-induced overexpression array by generating a genome-wide set of yeast transformants, each of which carries an individual plasmid-borne and sequence-verified ORF derived from the Saccharomyces cerevisiae full-length EXpression-ready (FLEX) collection. We combined these collections genetically using synthetic genetic array methodology, generating ~5100 strains, each of which is barcoded and overexpresses a specific ORF, a set we termed “barFLEX.” Further application of synthetic genetic array methodology allows the barFLEX collection to be moved into different genetic backgrounds. As a proof-of-principle, we describe the properties of the barFLEX overexpression collection and its application in synthetic dosage lethality studies under different environmental conditions. PMID:23050238

  12. Influence of combined visual and vestibular cues on human perception and control of horizontal rotation

    NASA Technical Reports Server (NTRS)

    Zacharias, G. L.; Young, L. R.

    1981-01-01

    Measurements are made of manual control performance in the closed-loop task of nulling perceived self-rotation velocity about an earth-vertical axis. Self-velocity estimation is modeled as a function of the simultaneous presentation of vestibular and peripheral visual field motion cues. Based on measured low-frequency operator behavior in three visual field environments, a parallel channel linear model is proposed which has separate visual and vestibular pathways summing in a complementary manner. A dual-input describing function analysis supports the complementary model; vestibular cues dominate sensation at higher frequencies. The describing function model is extended by the proposal of a nonlinear cue conflict model, in which cue weighting depends on the level of agreement between visual and vestibular cues.
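
    A minimal sketch of the complementary-channel idea: the visual pathway is weighted by a low-pass filter and the vestibular pathway by the complementary high-pass filter, so the two weights sum to one at every frequency. The time constant, noise levels, and discrete-time realization below are illustrative assumptions, not the fitted model parameters from the study.

        # Complementary low-pass (visual) / high-pass (vestibular) fusion of two
        # noisy measurements of the same angular velocity. Parameters and data
        # are illustrative only.
        import numpy as np

        dt, tau = 0.01, 1.0                       # sample time (s), crossover time constant (s)
        alpha = dt / (tau + dt)                   # first-order low-pass coefficient

        t = np.arange(0.0, 20.0, dt)
        true_rate = np.sin(0.2 * 2 * np.pi * t)   # actual self-rotation velocity (rad/s)

        rng = np.random.default_rng(2)
        visual = true_rate + rng.normal(0, 0.05, t.size)       # reliable at low frequency
        vestibular = true_rate + rng.normal(0, 0.05, t.size)   # reliable at high frequency

        estimate = np.zeros_like(t)
        lp_visual = 0.0
        lp_vestibular = 0.0
        for k in range(t.size):
            lp_visual += alpha * (visual[k] - lp_visual)             # W_vis ~ 1/(tau*s + 1)
            lp_vestibular += alpha * (vestibular[k] - lp_vestibular)
            estimate[k] = lp_visual + (vestibular[k] - lp_vestibular)  # high-pass = 1 - low-pass

        rms_error = np.sqrt(np.mean((estimate - true_rate) ** 2))
        print(f"RMS estimation error: {rms_error:.3f} rad/s")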

  13. Progressing from initially ambiguous functional analyses: three case examples.

    PubMed

    Tiger, Jeffrey H; Fisher, Wayne W; Toussaint, Karen A; Kodak, Tiffany

    2009-01-01

    Most often functional analyses are initiated using a standard set of test conditions, similar to those described by Iwata, Dorsey, Slifer, Bauman, and Richman [Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1994). Toward a functional analysis of self-injury. Journal of Applied Behavior Analysis, 27, 197-209 (Reprinted from Analysis and Intervention in Developmental Disabilities, 2, 3-20, 1982)]. These test conditions involve the careful manipulation of motivating operations, discriminative stimuli, and reinforcement contingencies to determine the events related to the occurrence and maintenance of problem behavior. Some individuals display problem behavior that is occasioned and reinforced by idiosyncratic or otherwise unique combinations of environmental antecedents and consequences of behavior, which are unlikely to be detected using these standard assessment conditions. For these individuals, modifications to the standard test conditions or the inclusion of novel test conditions may result in clearer assessment outcomes. The current study provides three case examples of individuals whose functional analyses were initially undifferentiated; however, modifications to the standard conditions resulted in the identification of behavioral functions and the implementation of effective function-based treatments.

  14. Heart failure analysis dashboard for patient's remote monitoring combining multiple artificial intelligence technologies.

    PubMed

    Guidi, G; Pettenati, M C; Miniati, R; Iadanza, E

    2012-01-01

    In this paper we describe a Heart Failure analysis Dashboard that, combined with a handy device for the automatic acquisition of a set of the patient's clinical parameters, supports telemonitoring functions. The Dashboard's intelligent core is a Computer Decision Support System designed to assist the clinical decisions of non-specialist caring personnel, and it is based on three functional parts: Diagnosis, Prognosis, and Follow-up management. Four Artificial Intelligence-based techniques are compared for providing the diagnosis function: a Neural Network, a Support Vector Machine, a Classification Tree, and a Fuzzy Expert System whose rules are produced by a Genetic Algorithm. State-of-the-art algorithms are used to support a score-based prognosis function. The patient's Follow-up is used to refine the diagnosis.

  15. New real-time algorithms for arbitrary, high precision function generation with applications to acoustic transducer excitation

    NASA Astrophysics Data System (ADS)

    Gaydecki, P.

    2009-07-01

    A system is described for the design, downloading and execution of arbitrary functions, intended for use with acoustic and low-frequency ultrasonic transducers in condition monitoring and materials testing applications. The instrumentation comprises a software design tool and a powerful real-time digital signal processor unit, operating at 580 million multiplication-accumulations per second (MMACs). The embedded firmware employs both an established look-up table approach and a new function interpolation technique to generate the real-time signals with very high precision and flexibility. Using total harmonic distortion (THD) analysis, the purity of the waveforms has been compared with that of waveforms generated using traditional analogue function generators; this analysis has confirmed that the new instrument has a consistently superior signal-to-noise ratio.
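
    A simplified sketch of the look-up-table scheme with interpolation between table entries: a phase accumulator indexes one stored waveform cycle and linear interpolation refines each output sample. The table size, output frequency, and sample rate are arbitrary choices, and the sketch runs off-line rather than on the DSP described above.

        # Look-up-table waveform generation with linear interpolation between
        # table entries (phase-accumulator style). All parameters are arbitrary.
        import numpy as np

        TABLE_SIZE = 1024
        table = np.sin(2 * np.pi * np.arange(TABLE_SIZE) / TABLE_SIZE)   # one sine cycle

        def generate(freq_hz, sample_rate_hz, n_samples):
            phase = 0.0
            step = TABLE_SIZE * freq_hz / sample_rate_hz   # table entries per output sample
            out = np.empty(n_samples)
            for i in range(n_samples):
                idx = int(phase)
                frac = phase - idx
                nxt = (idx + 1) % TABLE_SIZE
                out[i] = table[idx] + frac * (table[nxt] - table[idx])   # linear interpolation
                phase = (phase + step) % TABLE_SIZE
            return out

        signal = generate(freq_hz=1000.0, sample_rate_hz=48000.0, n_samples=480)
        print(signal[:5])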

  16. Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Jong, Jen-Yi

    1986-01-01

    An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
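
    For orientation, ordinary magnitude-squared coherence between two measurement channels can be estimated as shown below; the hyper-coherence developed in the study is a distinct, nonlinear extension and is not implemented here. The signals and parameters are synthetic.

        # Ordinary magnitude-squared coherence between two synthetic vibration
        # channels (scipy). The study's "hyper-coherence" is NOT implemented here.
        import numpy as np
        from scipy.signal import coherence

        fs = 10_000.0                          # sample rate (Hz)
        t = np.arange(0.0, 2.0, 1.0 / fs)
        rng = np.random.default_rng(3)

        common = np.sin(2 * np.pi * 600.0 * t)            # shared 600 Hz component
        x = common + 0.5 * rng.standard_normal(t.size)
        y = 0.8 * common + 0.5 * rng.standard_normal(t.size)

        f, cxy = coherence(x, y, fs=fs, nperseg=2048)
        print(f"coherence near 600 Hz: {cxy[np.argmin(np.abs(f - 600.0))]:.2f}")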

  17. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used distribution in reliability analysis. This distribution is very suitable for representing the lengths of life in many cases and is available in a simple statistical form. The characteristic of this distribution is a constant hazard rate. The exponential distribution is a special case of the Weibull distribution (with shape parameter equal to one). In this paper we introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach, and present the associated analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. We describe the likelihood function, followed by the posterior distribution and the point, interval, hazard function, and reliability estimates. The net probability of failure when only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.

  18. Statistical inference of dynamic resting-state functional connectivity using hierarchical observation modeling.

    PubMed

    Sojoudi, Alireza; Goodyear, Bradley G

    2016-12-01

    Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here a novel analysis framework based on a hierarchical observation modeling approach was proposed, to permit statistical inference of the presence of dynamic connectivity. A two-level linear model composed of overlapping sliding windows of fMRI signals, incorporating the fact that overlapping windows are not independent, was described. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity, when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.
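
    As a point of comparison, the conventional sliding-window correlation against which the hierarchical model is evaluated can be sketched in a few lines; the window length and the synthetic signals below are arbitrary choices.

        # Conventional sliding-window correlation between two BOLD-like time
        # series (the baseline the hierarchical model is compared against).
        # Window length and synthetic data are arbitrary.
        import numpy as np

        rng = np.random.default_rng(4)
        n, window = 300, 40
        x = rng.standard_normal(n)
        y = 0.5 * x + rng.standard_normal(n)      # partially coupled signal

        dynamic_r = np.array([
            np.corrcoef(x[i:i + window], y[i:i + window])[0, 1]
            for i in range(n - window + 1)
        ])
        print(dynamic_r.min(), dynamic_r.max())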

  19. Zebra: a web server for bioinformatic analysis of diverse protein families.

    PubMed

    Suplatov, Dmitry; Kirilin, Evgeny; Takhaveev, Vakil; Svedas, Vytas

    2014-01-01

    During evolution of proteins from a common ancestor, one functional property can be preserved while others can vary leading to functional diversity. A systematic study of the corresponding adaptive mutations provides a key to one of the most challenging problems of modern structural biology - understanding the impact of amino acid substitutions on protein function. The subfamily-specific positions (SSPs) are conserved within functional subfamilies but are different between them and, therefore, seem to be responsible for functional diversity in protein superfamilies. Consequently, a corresponding method to perform the bioinformatic analysis of sequence and structural data has to be implemented in the common laboratory practice to study the structure-function relationship in proteins and develop novel protein engineering strategies. This paper describes Zebra web server - a powerful remote platform that implements a novel bioinformatic analysis algorithm to study diverse protein families. It is the first application that provides specificity determinants at different levels of functional classification, therefore addressing complex functional diversity of large superfamilies. Statistical analysis is implemented to automatically select a set of highly significant SSPs to be used as hotspots for directed evolution or rational design experiments and analyzed studying the structure-function relationship. Zebra results are provided in two ways - (1) as a single all-in-one parsable text file and (2) as PyMol sessions with structural representation of SSPs. Zebra web server is available at http://biokinet.belozersky.msu.ru/zebra .

  20. Functions on the Job in Relation to Data, People, and Things among Agricultural Students from Southern Land-Grant Universities

    ERIC Educational Resources Information Center

    Zekeri, Andrew A.; Warren, Rueben

    2013-01-01

    This paper uses data from a sample of agriculture graduates from selected land-grant universities in the south to examine workers' functions on the job in relation to data, people, and things as described in the Dictionary of Occupational Titles. Tabular analysis was conducted using gamma and Pearson's correlation as measures of association.…

  1. Analysis of electromagnetic forces and causality in electron microscopy.

    PubMed

    Reyes-Coronado, Alejandro; Ortíz-Solano, Carlos Gael; Zabala, Nerea; Rivacoba, Alberto; Esquivel-Sirvent, Raúl

    2018-09-01

    The non-physical effects on the transverse momentum transfer from fast electrons to gold nanoparticles associated with the use of non-causal dielectric functions are studied. A direct test of causality based on the surface Kramers-Kronig relations is presented. This test is applied to the different dielectric functions used to describe gold nanostructures in electron microscopy. Copyright © 2018. Published by Elsevier B.V.

  2. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for a random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  3. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  4. Interactive debug program for evaluation and modification of assembly-language software

    NASA Technical Reports Server (NTRS)

    Arpasi, D. J.

    1979-01-01

    An assembly-language debug program written for the Honeywell HDC-601 and DDP-516/316 computers is described. Names and relative addressing are used to improve operator-machine interaction. Features include versatile display, on-line assembly, and improved program execution and analysis. The program is discussed from both a programmer's and an operator's standpoint. Functional diagrams are included to describe the program, and each command is illustrated.

  5. Function Model for Community Health Service Information

    NASA Astrophysics Data System (ADS)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong

    In order to construct a function model of community health service (CHS) information for the development of a CHS information management system, Integration Definition for Function Modeling (IDEF0), an IEEE standard extended from the Structured Analysis and Design Technique (SADT) and now a widely used function modeling method, was used to classify the information from top to bottom. The contents of every level of the model were described and coded. A function model for CHS information, which includes 4 super-classes, 15 classes, and 28 sub-classes of business functions, 43 business processes, and 168 business activities, was then established. This model can facilitate information management system development and workflow refinement.

  6. A Decision Analysis Perspective on Multiple Response Robust Optimization

    DTIC Science & Technology

    2012-03-01

    the utility function in question is monotonically increasing and twice differentiable. If γ(y) = 0, the utility function describes risk-neutral behavior. ... For a twice-differentiable utility function, the risk aversion with respect to a single attribute y_i, i = 1, ..., n, is given in Equation 2.9 as γ_U(y_i) = -U''(y_i)/U'(y_i). ... For a utility of the form U_V(V(y_1, y_2)), following the chain rule of differentiation, Matheson and Abbas [31] show that the risk aversion with respect to a single attribute ...

  7. Linkage of Recognition and Replication Functions by Assembling Combinatorial Antibody Fab Libraries Along Phage Surfaces

    NASA Astrophysics Data System (ADS)

    Kang, Angray S.; Barbas, Carlos F.; Janda, Kim D.; Benkovic, Stephen J.; Lerner, Richard A.

    1991-05-01

    We describe a method based on a phagemid vector with helper phage rescue for the construction and rapid analysis of combinatorial antibody Fab libraries. This approach should allow the generation and selection of many monoclonal antibodies. Antibody genes are expressed in concert with phage morphogenesis, thereby allowing incorporation of functional Fab molecules along the surface of filamentous phage. The power of the method depends upon the linkage of recognition and replication functions and is not limited to antibody molecules.

  8. Analysis of Commuter Rail Costs and Cost Allocation Methods

    DOT National Transportation Integrated Search

    1983-07-01

    The report addresses the issues of commuter rail service costs and the compensation methods used to allocate railroad expenses to the commuter service function. The report consists of six sections. Section 1 describes the study purpose, scope, method...

  9. Sleep and Nutritional Deprivation and Performance of House Officers.

    ERIC Educational Resources Information Center

    Hawkins, Michael R.; And Others

    1985-01-01

    A study to compare cognitive functioning in acutely and chronically sleep-deprived house officers is described. A multivariate analysis of variance revealed significant deficits in primary mental tasks involving basic rote memory, language, and numeric skills. (Author/MLW)

  10. Consistency between the luminosity function of resolved millisecond pulsars and the galactic center excess

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ploeg, Harrison; Gordon, Chris; Crocker, Roland

    Fermi Large Area Telescope data reveal an excess of GeV gamma rays from the direction of the Galactic Center and bulge. Several explanations have been proposed for this excess including an unresolved population of millisecond pulsars (MSPs) and self-annihilating dark matter. It has been claimed that a key discriminant for or against the MSP explanation can be extracted from the properties of the luminosity function describing this source population. Specifically, is the luminosity function of the putative MSPs in the Galactic Center consistent with that characterizing the resolved MSPs in the Galactic disk? To investigate this we have used a Bayesian Markov Chain Monte Carlo approach to evaluate the posterior distribution of the parameters of the MSP luminosity function describing both resolved MSPs and the Galactic Center excess. At variance with some other claims, our analysis reveals that, within current uncertainties, both data sets can be well fit with the same luminosity function.

  11. WGCNA: an R package for weighted correlation network analysis.

    PubMed

    Langfelder, Peter; Horvath, Steve

    2008-12-29

    Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
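
    A conceptual sketch of the core idea (written in Python and deliberately not the WGCNA R API): raise the absolute correlation matrix to a soft-thresholding power to obtain a weighted adjacency, convert it to a dissimilarity, and cut a hierarchical tree into modules. The toy data, the power beta, and the module count are arbitrary assumptions.

        # Conceptual sketch of weighted correlation network module detection
        # (Python, NOT the WGCNA R package): soft-thresholded adjacency from
        # correlations, then hierarchical clustering into modules.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        rng = np.random.default_rng(5)
        n_samples, n_genes, beta = 30, 200, 6
        expr = rng.standard_normal((n_samples, n_genes))   # stand-in for expression data

        corr = np.corrcoef(expr, rowvar=False)             # gene x gene correlations
        adjacency = np.abs(corr) ** beta                   # soft-thresholded adjacency
        dissimilarity = 1.0 - adjacency
        np.fill_diagonal(dissimilarity, 0.0)

        tree = linkage(squareform(dissimilarity, checks=False), method="average")
        modules = fcluster(tree, t=8, criterion="maxclust")  # assign genes to 8 modules
        print(np.bincount(modules))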

  12. WGCNA: an R package for weighted correlation network analysis

    PubMed Central

    Langfelder, Peter; Horvath, Steve

    2008-01-01

    Background Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. Results The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. Conclusion The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at . PMID:19114008

  13. From data to function: functional modeling of poultry genomics data.

    PubMed

    McCarthy, F M; Lyons, E

    2013-09-01

    One of the challenges of functional genomics is to create a better understanding of the biological system being studied so that the data produced are leveraged to provide gains for agriculture, human health, and the environment. Functional modeling enables researchers to make sense of these data as it reframes a long list of genes or gene products (mRNA, ncRNA, and proteins) by grouping based upon function, be it individual molecular functions or interactions between these molecules or broader biological processes, including metabolic and signaling pathways. However, poultry researchers have been hampered by a lack of functional annotation data, tools, and training to use these data and tools. Moreover, this lack is becoming more critical as new sequencing technologies enable us to generate data not only for an increasingly diverse range of species but also individual genomes and populations of individuals. We discuss the impact of these new sequencing technologies on poultry research, with a specific focus on what functional modeling resources are available for poultry researchers. We also describe key strategies for researchers who wish to functionally model their own data, providing background information about functional modeling approaches, the data and tools to support these approaches, and the strengths and limitations of each. Specifically, we describe methods for functional analysis using Gene Ontology (GO) functional summaries, functional enrichment analysis, and pathways and network modeling. As annotation efforts begin to provide the fundamental data that underpin poultry functional modeling (such as improved gene identification, standardized gene nomenclature, temporal and spatial expression data and gene product function), tool developers are incorporating these data into new and existing tools that are used for functional modeling, and cyberinfrastructure is being developed to provide the necessary extendibility and scalability for storing and analyzing these data. This process will support the efforts of poultry researchers to make sense of their functional genomics data sets, and we provide here a starting point for researchers who wish to take advantage of these tools.
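
    As one concrete example of the functional enrichment analysis mentioned above, over-representation of a Gene Ontology term in a gene list is commonly assessed with a hypergeometric test; the counts below are invented for illustration.

        # Hypergeometric over-representation test for one GO term in a gene list
        # (a common form of functional enrichment analysis). Counts are invented.
        from scipy.stats import hypergeom

        population = 15000       # annotated genes in the genome
        term_genes = 300         # genes annotated with the GO term
        study_list = 250         # genes in the list of interest
        overlap = 18             # study genes carrying the GO term

        # P(X >= overlap) under sampling without replacement.
        p_value = hypergeom.sf(overlap - 1, population, term_genes, study_list)
        print(f"enrichment p-value: {p_value:.3g}")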

  14. Documentation of the data analysis system for the gamma ray monitor aboard OSO-H

    NASA Technical Reports Server (NTRS)

    Croteau, S.; Buck, A.; Higbie, P.; Kantauskis, J.; Foss, S.; Chupp, D.; Forrest, D. J.; Suri, A.; Gleske, I.

    1973-01-01

    The programming system developed to prepare the data from the gamma ray monitor on OSO-7 for scientific analysis is presented. The detector, data, and objectives are described in detail. Programs presented include FEEDER, PASS-1, CAL1, CAL2, PASS-3, the Van Allen Belt Predict Program, the Computation Center Plot Routine, and the Response Function Programs.

  15. Modeling games from the 20th century

    PubMed Central

    Killeen, P.R.

    2008-01-01

    A scientific framework is described in which scientists are cast as problem-solvers, and problems as solved when data are mapped to models. This endeavor is limited by finite attentional capacity, which keeps depth of understanding complementary to breadth of vision and which distinguishes the process of science from its products, scientists from scholars. All four aspects of explanation described by Aristotle (trigger, function, substrate, and model) are required for comprehension. Various modeling languages are described, ranging from set theory to the calculus of variations, along with exemplary applications in behavior analysis. PMID:11369459

  16. Evolution of human brain functions: the functional structure of human consciousness.

    PubMed

    Cloninger, C Robert

    2009-11-01

    The functional structure of self-aware consciousness in human beings is described based on the evolution of human brain functions. Prior work on heritable temperament and character traits is extended to account for the quantum-like and holographic properties (i.e. parts elicit wholes) of self-aware consciousness. Cladistic analysis is used to identify the succession of ancestors leading to human beings. The functional capacities that emerge along this lineage of ancestors are described. The ecological context in which each cladogenesis occurred is described to illustrate the shifting balance of evolution as a complex adaptive system. Comparative neuroanatomy is reviewed to identify the brain structures and networks that emerged coincident with the emergent brain functions. Individual differences in human temperament traits were well developed in the common ancestor shared by reptiles and humans. Neocortical development in mammals proceeded in five major transitions: from early reptiles to early mammals, early primates, simians, early Homo, and modern Homo sapiens. These transitions provide the foundation for human self-awareness related to sexuality, materiality, emotionality, intellectuality, and spirituality, respectively. The functional structure of human self-aware consciousness is concerned with the regulation of five planes of being: sexuality, materiality, emotionality, intellectuality, and spirituality. Each plane elaborates neocortical functions organized around one of the five special senses. The interactions among these five planes gives rise to a 5 x 5 matrix of subplanes, which are functions that coarsely describe the focus of neocortical regulation. Each of these 25 neocortical functions regulates each of five basic motives or drives that can be measured as temperaments or basic emotions related to fear, anger, disgust, surprise, and happiness/sadness. The resulting 5 x 5 x 5 matrix of human characteristics provides a general and testable model of the functional structure of human consciousness that includes personality, physicality, emotionality, cognition, and spirituality in a unified developmental framework.

  17. Rethinking plant functional types in Earth System Models: pan-tropical analysis of tree survival across environmental gradients

    NASA Astrophysics Data System (ADS)

    Johnson, D. J.; Needham, J.; Xu, C.; Davies, S. J.; Bunyavejchewin, S.; Giardina, C. P.; Condit, R.; Cordell, S.; Litton, C. M.; Hubbell, S.; Kassim, A. R. B.; Shawn, L. K. Y.; Nasardin, M. B.; Ong, P.; Ostertag, R.; Sack, L.; Tan, S. K. S.; Yap, S.; McDowell, N. G.; McMahon, S.

    2016-12-01

    Terrestrial carbon cycling is a function of the growth and survival of trees. Current model representations of tree growth and survival at a global scale rely on coarse plant functional traits that are parameterized very generally. In view of the large biodiversity in the tropical forests, it is important that we account for the functional diversity in order to better predict tropical forest responses to future climate changes. Several next generation Earth System Models are moving towards a size-structured, trait-based approach to modelling vegetation globally, but the challenge of which and how many traits are necessary to capture forest complexity remains. Additionally, the challenge of collecting sufficient trait data to describe the vast species richness of tropical forests is enormous. We propose a more fundamental approach to these problems by characterizing forests by their patterns of survival. We expect our approach to distill real-world tree survival into a reasonable number of functional types. Using 10 large-area tropical forest plots that span geographic, edaphic and climatic gradients, we model tree survival as a function of tree size for hundreds of species. We found surprisingly few categories of size-survival functions emerge. This indicates some fundamental strategies at play across diverse forests to constrain the range of possible size-survival functions. Initial cluster analysis indicates that four to eight functional forms are necessary to describe variation in size-survival relations. Temporal variation in size-survival functions can be related to local environmental variation, allowing us to parameterize how demographically similar groups of species respond to perturbations in the ecosystem. We believe this methodology will yield a synthetic approach to classifying forest systems that will greatly reduce uncertainty and complexity in global vegetation models.

  18. Modelling the occurrence and severity of enoxaparin-induced bleeding and bruising events

    PubMed Central

    Barras, Michael A; Duffull, Stephen B; Atherton, John J; Green, Bruce

    2009-01-01

    AIMS To develop a population pharmacokinetic–pharmacodynamic model to describe the occurrence and severity of bleeding or bruising as a function of enoxaparin exposure. METHODS Data were obtained from a randomized controlled trial (n = 118) that compared conventional dosing of enoxaparin (product label) with an individualized dosing regimen. Anti-Xa concentrations were sampled using a sparse design and the size, location and type of bruising and bleeding event, during enoxaparin therapy, were collected daily. A population pharmacokinetic–pharmacodynamic analysis was performed using nonlinear mixed effects techniques. The final model was used to explore how the probability of events in patients with obesity and/or renal impairment varied under differing dosing strategies. RESULTS Three hundred and forty-nine anti-Xa concentrations were available for analysis. A two-compartment first-order absorption and elimination model best fit the data, with lean body weight describing between-subject variability in clearance and central volume of distribution. A three-category proportional-odds model described the occurrence and severity of events as a function of both cumulative enoxaparin AUC (cAUC) and subject age. Simulations showed that individualized dosing decreased the probability of a bleeding or major bruising event when compared with conventional dosing, which was most noticeable in subjects with obesity and renal impairment. CONCLUSIONS The occurrence and severity of a bleeding or major bruising event to enoxaparin, administered for the treatment of a thromboembolic disease, can be described as a function of both cAUC and subject age. Individualized dosing of enoxaparin will reduce the probability of an event. PMID:19916994
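
    To make the final model concrete: a three-category proportional-odds model expresses the cumulative probabilities of the ordered outcome (none, minor, major) as logistic functions of a shared linear predictor in cAUC and age. The coefficients below are invented placeholders, not the published estimates.

        # Three-category proportional-odds model: P(severity >= category) as a
        # logistic function of cumulative AUC (cAUC) and age. Coefficients are
        # invented placeholders, not the published estimates.
        import math

        def category_probabilities(cauc, age,
                                   intercepts=(-1.0, -3.0),   # thresholds for >=minor, >=major
                                   b_cauc=0.004, b_age=0.02):
            eta = b_cauc * cauc + b_age * age
            p_ge_minor = 1.0 / (1.0 + math.exp(-(intercepts[0] + eta)))
            p_ge_major = 1.0 / (1.0 + math.exp(-(intercepts[1] + eta)))
            return {
                "none": 1.0 - p_ge_minor,
                "minor": p_ge_minor - p_ge_major,
                "major": p_ge_major,
            }

        print(category_probabilities(cauc=400.0, age=70.0))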

  19. The Necessity of Functional Analysis for Space Exploration Programs

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Breidenthal, Julian C.

    2011-01-01

    As NASA moves toward expanded commercial spaceflight within its human exploration capability, there is increased emphasis on how to allocate responsibilities between government and commercial organizations to achieve coordinated program objectives. The practice of program-level functional analysis offers an opportunity for improved understanding of collaborative functions among heterogeneous partners. Functional analysis is contrasted with the physical analysis more commonly done at the program level, and is shown to provide theoretical performance, risk, and safety advantages beneficial to a government-commercial partnership. Performance advantages include faster convergence to acceptable system solutions; discovery of superior solutions with higher commonality, greater simplicity and greater parallelism by substituting functional for physical redundancy to achieve robustness and safety goals; and greater organizational cohesion around program objectives. Risk advantages include avoidance of rework by revelation of some kinds of architectural and contractual mismatches before systems are specified, designed, constructed, or integrated; avoidance of cost and schedule growth by more complete and precise specifications of cost and schedule estimates; and higher likelihood of successful integration on the first try. Safety advantages include effective delineation of must-work and must-not-work functions for integrated hazard analysis, the ability to formally demonstrate completeness of safety analyses, and provably correct logic for certification of flight readiness. The key mechanism for realizing these benefits is the development of an inter-functional architecture at the program level, which reveals relationships between top-level system requirements that would otherwise be invisible using only a physical architecture. This paper describes the advantages and pitfalls of functional analysis as a means of coordinating the actions of large heterogeneous organizations for space exploration programs.

  20. Quantitative descriptions of generalized arousal, an elementary function of the vertebrate brain

    PubMed Central

    Quinkert, Amy Wells; Vimal, Vivek; Weil, Zachary M.; Reeke, George N.; Schiff, Nicholas D.; Banavar, Jayanth R.; Pfaff, Donald W.

    2011-01-01

    We review a concept of the most primitive, fundamental function of the vertebrate CNS, generalized arousal (GA). Three independent lines of evidence indicate the existence of GA: statistical, genetic, and mechanistic. Here we ask, is this concept amenable to quantitative analysis? Answering in the affirmative, four quantitative approaches have proven useful: (i) factor analysis, (ii) information theory, (iii) deterministic chaos, and (iv) application of a Gaussian equation. It strikes us that, to date, not just one but at least four different quantitative approaches seem necessary for describing different aspects of scientific work on GA. PMID:21555568

  1. Sensor Authentication: Embedded Processor Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Svoboda, John

    2012-09-25

    Described is the C code running on the embedded Microchip 32-bit PIC32MX575F256H located on the INL-developed noise analysis circuit board. The code performs the following functions: controls the noise analysis circuit board preamplifier voltage gains of 1, 10, 100, and 1000; initializes the analog-to-digital conversion hardware, input channel selection, Fast Fourier Transform (FFT) function, USB communications interface, and internal memory allocations; initiates high-resolution 4096-point, 200 kHz data acquisition; computes the complex 2048-point FFT and FFT magnitude; services the Host command set; transfers raw data to the Host; transfers the FFT result to the Host; and performs communication error checking.
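
    As a rough host-side illustration of the signal-processing chain listed above (not the embedded C firmware itself), the following Python sketch computes an FFT magnitude spectrum from a 4096-point record sampled at 200 kHz; the test signal is an assumption.

    ```python
    import numpy as np

    FS = 200_000          # sampling rate (Hz), per the record
    N = 4096              # acquisition length, per the record

    # Synthetic stand-in for the digitized preamplifier output (assumption:
    # a 10 kHz tone plus noise; the real data comes from the INL circuit board).
    t = np.arange(N) / FS
    x = np.sin(2 * np.pi * 10_000 * t) + 0.1 * np.random.randn(N)

    # One-sided spectrum of the real-valued record. The firmware packs the 4096
    # real samples into a 2048-point complex FFT; that packing/unpacking detail
    # is not reproduced here.
    spectrum = np.fft.rfft(x * np.hanning(N))
    magnitude = np.abs(spectrum)
    freqs = np.fft.rfftfreq(N, d=1.0 / FS)

    print(freqs[np.argmax(magnitude)])   # should be near 10 kHz
    ```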

  2. The Secrets of Scheherazade: Toward a Functional Analysis of Imaginative Literature

    PubMed Central

    Grant, Lyle K

    2005-01-01

    A functional analysis of selected aspects of imaginative literature is presented. Reading imaginative literature is described as a process in which the reader makes indirect contact with the contingencies operating on the behavior of story characters. A functional story grammar is proposed in which the reader's experience with a story is interpreted in terms of escape contingencies in which the author initially introduces an establishing operation consisting of a source of tension, which is resolved in some way by the outcome of the story. Although escape contingencies represent the functional basis for the structure of stories, they are to be understood in a context of many other reinforcers for reading fiction. Other contingencies that maintain reading are discussed. Functional analyses of imaginative literature have much to offer, both in improving literary education and in understanding the behavioral processes that occur on the part of the reader. PMID:22477324

  3. Development of a universal measure of quadrupedal forelimb-hindlimb coordination using digital motion capture and computerised analysis.

    PubMed

    Hamilton, Lindsay; Franklin, Robin J M; Jeffery, Nick D

    2007-09-18

    Clinical spinal cord injury in domestic dogs provides a model population in which to test the efficacy of putative therapeutic interventions for human spinal cord injury. To achieve this potential a robust method of functional analysis is required so that statistical comparison of numerical data derived from treated and control animals can be achieved. In this study we describe the use of digital motion capture equipment combined with mathematical analysis to derive a simple quantitative parameter - 'the mean diagonal coupling interval' - to describe coordination between forelimb and hindlimb movement. In normal dogs this parameter is independent of size, conformation, speed of walking or gait pattern. We show here that mean diagonal coupling interval is highly sensitive to alterations in forelimb-hindlimb coordination in dogs that have suffered spinal cord injury, and can be accurately quantified, but is unaffected by orthopaedic perturbations of gait. Mean diagonal coupling interval is an easily derived, highly robust measurement that provides an ideal method to compare the functional effect of therapeutic interventions after spinal cord injury in quadrupeds.

  4. Autonomous control systems - Architecture and fundamental issues

    NASA Technical Reports Server (NTRS)

    Antsaklis, P. J.; Passino, K. M.; Wang, S. J.

    1988-01-01

    A hierarchical functional autonomous controller architecture is introduced. In particular, the architecture for the control of future space vehicles is described in detail; it is designed to ensure the autonomous operation of the control system, and it allows interaction with the pilot, the crew/ground station, and the systems on board the autonomous vehicle. The fundamental issues in autonomous control system modeling and analysis are discussed. It is proposed to utilize a hybrid approach to modeling and analysis of autonomous systems. This will incorporate conventional control methods based on differential equations and techniques for the analysis of systems described with a symbolic formalism. In this way, the theory of conventional control can be fully utilized. It is stressed that autonomy is the design requirement, and intelligent control methods appear, at present, to offer some of the necessary tools to achieve autonomy. A conventional approach may evolve and replace some or all of the 'intelligent' functions. It is shown that in addition to conventional controllers, the autonomous control system incorporates planning, learning, and FDI (fault detection and identification).

  5. Growth Patterns of Neuropsychological Functions in Indian Children

    PubMed Central

    Kar, Bhoomika R.; Rao, Shobini L.; Chandramouli, B. A.; Thennarasu, K.

    2011-01-01

    We investigated age-related differences in neuropsychological performance in 400 Indian school children (5–15 years of age). Functions of motor speed, attention, executive functions, visuospatial functions, comprehension, learning, and memory were examined. Growth curve analysis was performed. Different growth models fitted different cognitive functions. Neuropsychological task performance improved slowly between 5 and 7 years, moderately between 8 and 12 years, and slowly between 13 and 15 years of age. The overall growth patterns of neuropsychological functions in Indian children are discussed in comparison with the findings reported for American children. The present work describes non-linear, heterogeneous, and protracted age trends of neuropsychological functions in Indian children and adolescents. PMID:22053158

  6. Assessing Many-Body Effects of Water Self-Ions. I: OH-(H2O) n Clusters.

    PubMed

    Egan, Colin K; Paesani, Francesco

    2018-04-10

    The importance of many-body effects in the hydration of the hydroxide ion (OH - ) is investigated through a systematic analysis of the many-body expansion of the interaction energy carried out at the CCSD(T) level of theory, extrapolated to the complete basis set limit, for the low-lying isomers of OH - (H 2 O) n clusters, with n = 1-5. This is accomplished by partitioning individual fragments extracted from the whole clusters into "groups" that are classified by both the number of OH - and water molecules and the hydrogen bonding connectivity within each fragment. With the aid of the absolutely localized molecular orbital energy decomposition analysis (ALMO-EDA) method, this structure-based partitioning is found to largely correlate with the character of different many-body interactions, such as cooperative and anticooperative hydrogen bonding, within each fragment. This analysis emphasizes the importance of a many-body representation of inductive electrostatics and charge transfer in modeling OH - hydration. Furthermore, the rapid convergence of the many-body expansion of the interaction energy also suggests a rigorous path for the development of analytical potential energy functions capable of describing individual OH - -water many-body terms, with chemical accuracy. Finally, a comparison between the reference CCSD(T) many-body interaction terms with the corresponding values obtained with various exchange-correlation functionals demonstrates that range-separated, dispersion-corrected, hybrid functionals exhibit the highest accuracy, while GGA functionals, with or without dispersion corrections, are inadequate to describe OH - -water interactions.
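
    A generic sketch of the many-body expansion of the interaction energy referred to above, with a placeholder energy function standing in for the electronic-structure (e.g. CCSD(T)) calculations; the toy fragments and pairwise potential are assumptions for illustration only.

    ```python
    from itertools import combinations

    def many_body_expansion(fragments, energy, max_order=3):
        """Generic n-body decomposition of the interaction energy.

        `fragments` is a list of fragment descriptors and `energy(subset)` returns
        the total energy of that subset of fragments (placeholder for an
        electronic-structure call). Returns {order: summed n-body contribution}.
        """
        cache = {}   # frozenset of fragment indices -> subset energy

        def subset_energy(idx):
            key = frozenset(idx)
            if key not in cache:
                cache[key] = energy([fragments[i] for i in idx])
            return cache[key]

        # n-body terms are defined recursively: eps(n-mer) = E(n-mer) minus the
        # sum of all lower-order terms contained in it.
        eps = {}
        contributions = {}
        n = len(fragments)
        for order in range(1, max_order + 1):
            total = 0.0
            for idx in combinations(range(n), order):
                lower = sum(eps[sub] for k in range(1, order)
                            for sub in combinations(idx, k))
                eps[idx] = subset_energy(idx) - lower
                total += eps[idx]
            contributions[order] = total
        return contributions

    # Toy usage with a purely pairwise energy on 1-D "fragments"; the 3-body
    # contribution should come out as (numerically) zero.
    frags = [0.0, 1.0, 2.1, 3.3]
    def toy_energy(coords):
        return sum(((1.0 / abs(a - b)) ** 12 - (1.0 / abs(a - b)) ** 6)
                   for a, b in combinations(coords, 2))
    print(many_body_expansion(frags, toy_energy))
    ```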

  7. Dissecting Transcriptional Heterogeneity in Pluripotency: Single Cell Analysis of Mouse Embryonic Stem Cells.

    PubMed

    Guedes, Ana M V; Henrique, Domingos; Abranches, Elsa

    2016-01-01

    Mouse Embryonic Stem cells (mESCs) show heterogeneous and dynamic expression of important pluripotency regulatory factors. Single-cell analysis has revealed the existence of cell-to-cell variability in the expression of individual genes in mESCs. Understanding how these heterogeneities are regulated and what their functional consequences are is crucial to obtain a more comprehensive view of the pluripotent state.In this chapter we describe how to analyze transcriptional heterogeneity by monitoring gene expression of Nanog, Oct4, and Sox2, using single-molecule RNA FISH in single mESCs grown in different cell culture medium. We describe in detail all the steps involved in the protocol, from RNA detection to image acquisition and processing, as well as exploratory data analysis.

  8. Real-time automated failure analysis for on-orbit operations

    NASA Technical Reports Server (NTRS)

    Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James

    1993-01-01

    A system which is to provide real-time failure analysis support to controllers at the NASA Johnson Space Center Control Center Complex (CCC) for both Space Station and Space Shuttle on-orbit operations is described. The system employs monitored systems' models of failure behavior and model evaluation algorithms which are domain-independent. These failure models are viewed as a stepping stone to more robust algorithms operating over models of intended function. The described system is designed to meet two sets of requirements. It must provide a useful failure analysis capability enhancement to the mission controller. It must satisfy CCC operational environment constraints such as cost, computer resource requirements, verification, and validation. The underlying technology and how it may be used to support operations is also discussed.

  9. Arc_Mat: a Matlab-based spatial data analysis toolbox

    NASA Astrophysics Data System (ADS)

    Liu, Xingjian; Lesage, James

    2010-03-01

    This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high quality mapping in the Matlab software environment. We discuss revisions to the toolbox that: utilize enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histogram, Moran scatterplot, three-dimensional scatterplot, density distribution plot, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. A brief review of the design aspects of the revised Arc_Mat is described, and we provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.

  10. TAM 2.0: tool for MicroRNA set analysis.

    PubMed

    Li, Jianwei; Han, Xiaofen; Wan, Yanping; Zhang, Shan; Zhao, Yingshu; Fan, Rui; Cui, Qinghua; Zhou, Yuan

    2018-06-06

    With the rapid accumulation of high-throughput microRNA (miRNA) expression profiles, up-to-date resources for analyzing the functional and disease associations of miRNAs are increasingly in demand. We here describe the updated server TAM 2.0 for miRNA set enrichment analysis. Through manual curation of over 9000 papers, a more than two-fold growth of reference miRNA sets has been achieved in comparison with the previous TAM, covering 9945 and 1584 newly collected miRNA-disease and miRNA-function associations, respectively. Moreover, TAM 2.0 allows users not only to test the functional and disease annotations of miRNAs by overrepresentation analysis, but also to compare the input de-regulated miRNAs with those de-regulated in other disease conditions via correlation analysis. Finally, miRNA set query and result visualization functions are also provided in the TAM 2.0 server to serve the community. The TAM 2.0 web server is freely accessible at http://www.scse.hebut.edu.cn/tam/ or http://www.lirmed.com/tam2/.
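
    The overrepresentation analysis mentioned above is commonly implemented as a one-sided hypergeometric test. The sketch below is a generic version of such a test, not the TAM 2.0 code, and the example sets are hypothetical.

    ```python
    from scipy.stats import hypergeom

    def enrichment_pvalue(query, annotated, background_size):
        """One-sided hypergeometric test for overrepresentation.

        query           -- set of de-regulated miRNAs submitted by the user
        annotated       -- set of miRNAs carrying a given function/disease label
        background_size -- number of miRNAs in the reference universe
        """
        overlap = len(query & annotated)
        # P(X >= overlap) when drawing len(query) miRNAs without replacement
        return hypergeom.sf(overlap - 1, background_size, len(annotated), len(query))

    # Hypothetical example: 5 of 30 submitted miRNAs carry an annotation that
    # covers 100 of the 2000 miRNAs in the background set.
    q = {f"miR-{i}" for i in range(30)}
    a = {f"miR-{i}" for i in range(25, 125)}
    print(enrichment_pvalue(q, a, background_size=2000))
    ```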

  11. Identification of general risk-management countermeasures for unsafe driving actions. Volume 1, Description and analysis of promising countermeasures

    DOT National Transportation Integrated Search

    1981-02-01

    A series of general risk-management countermeasures for speed Unsafe Driving Actions (UDAs) are described. First, countermeasure elements in three functional areas, detection, information, and action, are identified. Three comprehensive countermeasur...

  12. Computational methods to predict railcar response to track cross-level variations

    DOT National Transportation Integrated Search

    1976-09-01

    The rocking response of railroad freight cars to track cross-level variations is studied using: (1) a reduced complexity digital simulation model, and (2) a quasi-linear describing function analysis. The reduced complexity digital simulation model em...
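
    To illustrate the quasi-linear describing function technique named in this record, the sketch below computes a sinusoidal-input describing function numerically for a static nonlinearity; the saturation element and its parameters are illustrative assumptions, not the railcar model.

    ```python
    import numpy as np

    def describing_function(nonlinearity, amplitude, n_samples=4096):
        """Sinusoidal-input describing function N(A).

        Drives the static nonlinearity with A*sin(theta) and returns the complex
        ratio of the output fundamental to the input phasor.
        """
        theta = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
        y = nonlinearity(amplitude * np.sin(theta))
        # Fourier coefficients of the fundamental
        a1 = (2.0 / n_samples) * np.sum(y * np.cos(theta))
        b1 = (2.0 / n_samples) * np.sum(y * np.sin(theta))
        return (b1 + 1j * a1) / amplitude

    # Ideal saturation with unit slope and limits +/-1 (illustrative parameters)
    sat = lambda x: np.clip(x, -1.0, 1.0)
    for A in (0.5, 1.0, 2.0, 5.0):
        print(A, describing_function(sat, A))
    ```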

  13. Experimental Analysis of hFACT Action during Pol II Transcription in vitro

    PubMed Central

    Hsieh, Fu-Kai; Kulaeva, Olga I.; Studitsky, Vasily M.

    2016-01-01

    Summary FACT (facilitates chromatin transcription) is a histone chaperone that facilitates transcription through chromatin and promotes histone recovery during transcription. Here, we describe a highly purified experimental system that recapitulates many important properties of transcribed chromatin and the key aspects of hFACT action during this process in vitro. We present the protocols describing how to prepare different forms of nucleosomes, including intact nucleosome, covalently conjugated nucleosome, nucleosome missing one of the two H2A/2B dimers (hexasome) and tetrasome (a nucleosome missing both H2A/2B dimers). These complexes allow analysis of various aspects of FACT’s function. These approaches and other methods described below can also be applied to the study of other chromatin remodelers and chromatin-targeted factors. PMID:25665573

  14. Conceptual framework on the application of biomechanical measurement methods in driving behavior study

    NASA Astrophysics Data System (ADS)

    Sanjaya, Kadek Heri; Sya'bana, Yukhi Mustaqim Kusuma

    2017-01-01

    Research on eco-friendly vehicle development in Indonesia has largely neglected ergonomic study, despite the fact that traffic accidents result in greater economic cost than the fuel subsidy. We previously performed a biomechanical experiment on human locomotion. In this article, we describe the importance of applying biomechanical measurement methods in transportation ergonomics. Instruments such as the electromyogram (EMG), load cells, and pressure sensors, together with motion analysis and cross-correlation function analysis, are explained, and the possibility of their application in driving behavior studies is described. We discuss the potential and challenges of these biomechanical methods for future vehicle development. The methods offer objective and accurate measurement not only of human task performance but also of its correlation with vehicle performance.

  15. Assessment of protein set coherence using functional annotations

    PubMed Central

    Chagoyen, Monica; Carazo, Jose M; Pascual-Montano, Alberto

    2008-01-01

    Background Analysis of large-scale experimental datasets frequently produces one or more sets of proteins that are subsequently mined for functional interpretation and validation. To this end, a number of computational methods have been devised that rely on the analysis of functional annotations. Although current methods provide valuable information (e.g. significantly enriched annotations, pairwise functional similarities), they do not specifically measure the degree of homogeneity of a protein set. Results In this work we present a method that scores the degree of functional homogeneity, or coherence, of a set of proteins on the basis of the global similarity of their functional annotations. The method uses statistical hypothesis testing to assess the significance of the set in the context of the functional space of a reference set. As such, it can be used as a first step in the validation of sets expected to be homogeneous prior to further functional interpretation. Conclusion We evaluate our method by analysing known biologically relevant sets as well as random ones. The known relevant sets comprise macromolecular complexes, cellular components and pathways described for Saccharomyces cerevisiae, which are mostly significantly coherent. Finally, we illustrate the usefulness of our approach for validating 'functional modules' obtained from computational analysis of protein-protein interaction networks. Matlab code and supplementary data are available at PMID:18937846

  16. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    PubMed

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.

  17. Intelligent interface design and evaluation

    NASA Technical Reports Server (NTRS)

    Greitzer, Frank L.

    1988-01-01

    Intelligent interface concepts and systematic approaches to assessing their functionality are discussed. Four general features of intelligent interfaces are described: interaction efficiency, subtask automation, context sensitivity, and use of an appropriate design metaphor. Three evaluation methods are discussed: Functional Analysis, Part-Task Evaluation, and Operational Testing. Design and evaluation concepts are illustrated with examples from a prototype expert system interface for environmental control and life support systems for manned space platforms.

  18. Parcellating an Individual Subject's Cortical and Subcortical Brain Structures Using Snowball Sampling of Resting-State Correlations

    PubMed Central

    Wig, Gagan S.; Laumann, Timothy O.; Cohen, Alexander L.; Power, Jonathan D.; Nelson, Steven M.; Glasser, Matthew F.; Miezin, Francis M.; Snyder, Abraham Z.; Schlaggar, Bradley L.; Petersen, Steven E.

    2014-01-01

    We describe methods for parcellating an individual subject's cortical and subcortical brain structures using resting-state functional correlations (RSFCs). Inspired by approaches from social network analysis, we first describe the application of snowball sampling on RSFC data (RSFC-Snowballing) to identify the centers of cortical areas, subdivisions of subcortical nuclei, and the cerebellum. RSFC-Snowballing parcellation is then compared with parcellation derived from identifying locations where RSFC maps exhibit abrupt transitions (RSFC-Boundary Mapping). RSFC-Snowballing and RSFC-Boundary Mapping largely complement one another, but also provide unique parcellation information; together, the methods identify independent entities with distinct functional correlations across many cortical and subcortical locations in the brain. RSFC parcellation is relatively reliable within a subject scanned across multiple days, and while the locations of many area centers and boundaries appear to exhibit considerable overlap across subjects, there is also cross-subject variability—reinforcing the motivation to parcellate brains at the level of individuals. Finally, examination of a large meta-analysis of task-evoked functional magnetic resonance imaging data reveals that area centers defined by task-evoked activity exhibit correspondence with area centers defined by RSFC-Snowballing. This observation provides important evidence for the ability of RSFC to parcellate broad expanses of an individual's brain into functionally meaningful units. PMID:23476025

  19. Analysis of wave propagation and wavefront sensing in target-in-the-loop beam control systems

    NASA Astrophysics Data System (ADS)

    Vorontsov, Mikhail A.; Kolosov, Valeri V.

    2004-10-01

    Target-in-the-loop (TIL) wave propagation geometry represents perhaps the most challenging case for adaptive optics applications that are related to maximization of irradiance power density on extended, remotely located surfaces in the presence of dynamically changing refractive index inhomogeneities in the propagation medium. We introduce a TIL propagation model that uses a combination of the parabolic equation describing outgoing wave propagation and the equation describing evolution of the mutual intensity function (MIF) for the backscattered (returned) wave. The resulting evolution equation for the MIF is further simplified by the use of the smooth refractive index approximation. This approximation enables derivation of the transport equation for the returned wave brightness function, analyzed here using the method of characteristics (brightness function trajectories). The equations for the brightness function trajectories (ray equations) can be efficiently integrated numerically. We also consider wavefront sensors that perform sensing of speckle-averaged characteristics of the wavefront phase (TIL sensors). Analysis of the wavefront phase reconstructed from Shack-Hartmann TIL sensor measurements shows that an extended target introduces a phase modulation (target-induced phase) that cannot be easily separated from the atmospheric turbulence-related phase aberrations. We also show that wavefront sensing results depend on the extended target shape, surface roughness, and the outgoing beam intensity distribution on the target surface.

  20. Image Analysis of DNA Fiber and Nucleus in Plants.

    PubMed

    Ohmido, Nobuko; Wako, Toshiyuki; Kato, Seiji; Fukui, Kiichi

    2016-01-01

    Advances in cytology have led to the application of a wide range of visualization methods in plant genome studies. Image analysis methods are indispensable tools where morphology, density, and color play important roles in biological systems. Visualization and image analysis methods are useful techniques in the analyses of the detailed structure and function of extended DNA fibers (EDFs) and interphase nuclei. The EDF offers the highest spatial resolving power for revealing genome structure and can be used for physical mapping, especially for closely located genes and tandemly repeated sequences. On the other hand, analyzing nuclear DNA and proteins reveals nuclear structure and functions. In this chapter, we describe the image analysis protocol for quantitatively analyzing two types of plant genome material: EDFs and interphase nuclei.

  1. Functional profile of black spruce wetlands in Alaska

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Post, R.A.

    1996-09-01

    The profile describes the ecologic context and wetland functions of black spruce (Picea mariana) wetlands (BSWs) covering about 14 million ha of Alaska taiga. Ecologic descriptions include climate, permafrost, landforms, post-Pleistocene vegetation, fire, successional processes, black spruce community types and adaptations, and characteristics of BSWs. The profile describes human activities potentially affecting BSWs and identifies research literature and data gaps generally applicable to BSWs. Hydrologic, water quality, global biogeochemical, and ecologic functions of BSWs, as well as their socioeconomic uses, appear in the profile, along with potential functional indicators, expected sensitivities of functions to fill placement or wetland drainage, and potential mitigation strategies for impacts. Functional analysis separately considers ombrotrophic and minerotrophic BSWs where appropriate. Depending on trophic status, Alaska's BSWs perform several low-magnitude hydrologic (groundwater discharge and recharge, flow regulation, and erosion control) and ecologic (nutrient export, nutrient cycling, and food-chain support) functions and several substantial water quality (sediment retention, nutrient transformation, nutrient uptake, and contaminant removal), global biogeochemical (carbon cycling and storage), and ecologic (avian and mammalian habitat) functions. BSWs also provide important socioeconomic uses: harvest of wetland-dependent fish, wildlife, and plant resources and active winter recreation.

  2. Dynamics of modulated beams in spectral domain

    DOE PAGES

    Yampolsky, Nikolai A.

    2017-07-16

    General formalism for describing dynamics of modulated beams along linear beamlines is developed. We describe modulated beams with spectral distribution function which represents Fourier transform of the conventional beam distribution function in the 6-dimensional phase space. The introduced spectral distribution function is localized in some region of the spectral domain for nearly monochromatic modulations. It can be characterized with a small number of typical parameters such as the lowest order moments of the spectral distribution. We study evolution of the modulated beams in linear beamlines and find that characteristic spectral parameters transform linearly. The developed approach significantly simplifies analysis of various schemes proposed for seeding X-ray free electron lasers. We use this approach to study several recently proposed schemes and find the bandwidth of the output bunching in each case.

  3. Identification of functional corridors with movement characteristics of brown bears on the Kenai Peninsula, Alaska

    USGS Publications Warehouse

    Graves, T.A.; Farley, S.; Goldstein, M.I.; Servheen, C.

    2007-01-01

    We identified primary habitat and functional corridors across a landscape using Global Positioning System (GPS) collar locations of brown bears (Ursus arctos). After deriving density, speed, and angular deviation of movement, we classified landscape function for a group of animals with a cluster analysis. We described areas with high amounts of sinuous movement as primary habitat patches and areas with high amounts of very directional, fast movement as highly functional bear corridors. The time between bear locations and scale of analysis influenced the number and size of corridors identified. Bear locations should be collected at intervals ???6 h to correctly identify travel corridors. Our corridor identification technique will help managers move beyond the theoretical discussion of corridors and linkage zones to active management of landscape features that will preserve connectivity. ?? 2007 Springer Science+Business Media, Inc.

  4. Program for the analysis of time series. [by means of fast Fourier transform algorithm

    NASA Technical Reports Server (NTRS)

    Brown, T. J.; Brown, C. G.; Hardin, J. C.

    1974-01-01

    A digital computer program for the Fourier analysis of discrete time data is described. The program was designed to handle multiple channels of digitized data on general purpose computer systems. It is written, primarily, in a version of FORTRAN 2 currently in use on CDC 6000 series computers. Some small portions are written in CDC COMPASS, an assembler level code. However, functional descriptions of these portions are provided so that the program may be adapted for use on any facility possessing a FORTRAN compiler and random-access capability. Properly formatted digital data are windowed and analyzed by means of a fast Fourier transform algorithm to generate the following functions: (1) auto and/or cross power spectra, (2) autocorrelations and/or cross correlations, (3) Fourier coefficients, (4) coherence functions, (5) transfer functions, and (6) histograms.
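
    A modern Python equivalent of several of the outputs listed above (power spectra, cross-correlation, coherence, and transfer function), sketched with scipy.signal rather than the original CDC FORTRAN code; the sampling rate and test signals are assumptions.

    ```python
    import numpy as np
    from scipy import signal

    fs = 1000.0                          # assumed sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    x = np.random.randn(t.size)          # input channel (white noise)
    # Output channel: a filtered, noisy copy of the input (assumed relationship)
    b, a = signal.butter(4, 0.2)
    y = signal.lfilter(b, a, x) + 0.1 * np.random.randn(t.size)

    f, Pxx = signal.welch(x, fs, nperseg=1024)           # auto power spectrum
    _, Pyy = signal.welch(y, fs, nperseg=1024)
    _, Pxy = signal.csd(x, y, fs, nperseg=1024)          # cross power spectrum
    _, Cxy = signal.coherence(x, y, fs, nperseg=1024)    # coherence function
    H1 = Pxy / Pxx                                       # transfer-function estimate
    rxy = signal.correlate(y, x, mode="full") / x.size   # cross-correlation
    lags = signal.correlation_lags(y.size, x.size)

    print(f[np.argmax(Cxy)], np.abs(H1[:3]), lags[np.argmax(rxy)])
    ```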

  5. Structural study, NCA, FT-IR, FT-Raman spectral investigations, NBO analysis, thermodynamic functions of N-acetyl-l-phenylalanine.

    PubMed

    Raja, B; Balachandran, V; Revathi, B

    2015-03-05

    The FT-IR and FT-Raman spectra of N-acetyl-l-phenylalanine were recorded and analyzed. Natural bond orbital analysis has been carried out for various intramolecular interactions that are responsible for the stabilization of the molecule. HOMO-LUMO energy gap has been computed with the help of density functional theory. The statistical thermodynamic functions (heat capacity, entropy, vibrational partition function and Gibbs energy) were obtained for the range of temperature 100-1000K. The polarizability, first hyperpolarizability, anisotropy polarizability invariant has been computed using quantum chemical calculations. The infrared and Raman spectra were also predicted from the calculated intensities. Comparison of the experimental and theoretical spectra values provides important information about the ability of the computational method to describe the vibrational modes. Copyright © 2014 Elsevier B.V. All rights reserved.
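
    A small sketch of how the statistical thermodynamic functions mentioned above follow from harmonic vibrational frequencies over the 100-1000 K range; the wavenumbers are placeholders rather than the computed frequencies of N-acetyl-L-phenylalanine, and only vibrational contributions are included.

    ```python
    import numpy as np

    # Physical constants
    H = 6.62607015e-34; C = 2.99792458e10; KB = 1.380649e-23; R = 8.314462618

    # Illustrative harmonic wavenumbers in cm^-1 (NOT the actual frequencies of
    # N-acetyl-L-phenylalanine)
    wavenumbers = np.array([75.0, 210.0, 480.0, 1050.0, 1680.0, 3050.0])
    theta = H * C * wavenumbers / KB          # vibrational temperatures (K)

    def vib_thermo(T):
        x = theta / T
        q = np.prod(1.0 / (1.0 - np.exp(-x)))                      # partition function
        s = R * np.sum(x / np.expm1(x) - np.log1p(-np.exp(-x)))    # entropy, J/(mol K)
        cv = R * np.sum(x**2 * np.exp(x) / np.expm1(x)**2)         # heat capacity
        return q, s, cv

    for T in (100.0, 300.0, 500.0, 1000.0):
        print(T, vib_thermo(T))
    ```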

  6. Analysis of photographic X-ray images. [S-054 telescope on Skylab

    NASA Technical Reports Server (NTRS)

    Krieger, A. S.

    1977-01-01

    Some techniques used to extract quantitative data from the information contained in photographic images produced by grazing incidence soft X-ray optical systems are described. The discussion is focussed on the analysis of the data returned by the S-054 X-Ray Spectrographic Telescope Experiment on Skylab. The parameters of the instrument and the procedures used for its calibration are described. The technique used to convert photographic density to focal plane X-ray irradiance is outlined. The deconvolution of the telescope point response function from the image data is discussed. Methods of estimating the temperature, pressure, and number density of coronal plasmas are outlined.

  7. HRLSim: a high performance spiking neural network simulator for GPGPU clusters.

    PubMed

    Minkovich, Kirill; Thibeault, Corey M; O'Brien, Michael John; Nogin, Aleksey; Cho, Youngkwan; Srinivasa, Narayan

    2014-02-01

    Modeling of large-scale spiking neural models is an important tool in the quest to understand brain function and subsequently create real-world applications. This paper describes a spiking neural network simulator environment called HRL Spiking Simulator (HRLSim). This simulator is suitable for implementation on a cluster of general purpose graphical processing units (GPGPUs). Novel aspects of HRLSim are described and an analysis of its performance is provided for various configurations of the cluster. With the advent of inexpensive GPGPU cards and compute power, HRLSim offers an affordable and scalable tool for design, real-time simulation, and analysis of large-scale spiking neural networks.

  8. Causal transfer function analysis to describe closed loop interactions between cardiovascular and cardiorespiratory variability signals.

    PubMed

    Faes, L; Porta, A; Cucino, R; Cerutti, S; Antolini, R; Nollo, G

    2004-06-01

    Although the concept of transfer function is intrinsically related to an input-output relationship, the traditional and widely used estimation method merges both feedback and feedforward interactions between the two analyzed signals. This limitation may endanger the reliability of transfer function analysis in biological systems characterized by closed loop interactions. In this study, a method for estimating the transfer function between closed loop interacting signals was proposed and validated in the field of cardiovascular and cardiorespiratory variability. The two analyzed signals x and y were described by a bivariate autoregressive model, and the causal transfer function from x to y was estimated after imposing causality by setting to zero the model coefficients representative of the reverse effects from y to x. The method was tested in simulations reproducing linear open and closed loop interactions, showing a better adherence of the causal transfer function to the theoretical curves with respect to the traditional approach in presence of non-negligible reverse effects. It was then applied in ten healthy young subjects to characterize the transfer functions from respiration to heart period (RR interval) and to systolic arterial pressure (SAP), and from SAP to RR interval. In the first two cases, the causal and non-causal transfer function estimates were comparable, indicating that respiration, acting as exogenous signal, sets an open loop relationship upon SAP and RR interval. On the contrary, causal and traditional transfer functions from SAP to RR were significantly different, suggesting the presence of a considerable influence on the opposite causal direction. Thus, the proposed causal approach seems to be appropriate for the estimation of parameters, like the gain and the phase lag from SAP to RR interval, which have a large clinical and physiological relevance.
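
    A hedged sketch of the approach described above: fit a bivariate autoregressive model, zero the coefficients representing the reverse (y-to-x) path, and evaluate the causal transfer function from x to y in the frequency domain. The model order, test signals, and least-squares fitting routine are assumptions for illustration.

    ```python
    import numpy as np

    def fit_var(data, order):
        """Ordinary-least-squares fit of a bivariate AR model.

        data -- array of shape (n_samples, 2), columns = (x, y)
        Returns A of shape (order, 2, 2) with A[k, i, j] = effect of channel j
        at lag k+1 on channel i.
        """
        n, m = data.shape
        rows = [np.hstack([data[t - k] for k in range(1, order + 1)])
                for t in range(order, n)]
        X = np.asarray(rows)                    # (n-order, order*m)
        Y = data[order:]                        # (n-order, m)
        coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return coef.T.reshape(m, order, m).transpose(1, 0, 2)

    def causal_transfer_function(A, freqs):
        """Transfer function from x to y with the y->x path forced to zero."""
        A = A.copy()
        A[:, 0, 1] = 0.0                        # remove feedback from y onto x
        H = []
        for f in freqs:
            z = np.exp(-2j * np.pi * f * np.arange(1, A.shape[0] + 1))
            Ayx = np.sum(A[:, 1, 0] * z)
            Ayy = np.sum(A[:, 1, 1] * z)
            H.append(Ayx / (1.0 - Ayy))
        return np.asarray(H)

    # Illustrative data: y driven only by lagged x (open-loop ground truth),
    # so the causal gain should come out near 0.8 at all frequencies.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(2000)
    y = 0.8 * np.roll(x, 1) + 0.3 * rng.standard_normal(2000)
    A = fit_var(np.column_stack([x, y]), order=4)
    freqs = np.linspace(0.0, 0.5, 64)           # normalized frequency
    print(np.abs(causal_transfer_function(A, freqs))[:5])
    ```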

  9. Fraction number of trapped atoms and velocity distribution function in sub-recoil laser cooling scheme

    NASA Astrophysics Data System (ADS)

    Alekseev, V. A.; Krylova, D. D.

    1996-02-01

    The analytical investigation of Bloch equations is used to describe the main features of the 1D velocity selective coherent population trapping cooling scheme. For the initial stage of cooling the fraction of cooled atoms is derived in the case of a Gaussian initial velocity distribution. At very long times of interaction the fraction of cooled atoms and the velocity distribution function are described by simple analytical formulae and do not depend on the initial distribution. These results are in good agreement with those of Bardou, Bouchaud, Emile, Aspect and Cohen-Tannoudji based on statistical analysis in terms of Levy flights and with Monte-Carlo simulations of the process.

  10. An analysis of electronic document management in oncology care.

    PubMed

    Poulter, Thomas; Gannon, Brian; Bath, Peter A

    2012-06-01

    In this research in progress, a reference model for the use of electronic patient record (EPR) systems in oncology is described. The model, termed CICERO, comprises technical and functional components and emphasises usability, clinical safety and user acceptance. One of the functional components of the model, an electronic document and records management (EDRM) system, is monitored in the course of its deployment at a leading oncology centre in the UK. Specifically, the user requirements and design of the EDRM solution are described. The study is interpretative and forms part of a wider research programme to define and validate the CICERO model. Preliminary conclusions confirm the importance of a socio-technical perspective in Onco-EPR system design.

  11. Designing for human presence in space: An introduction to environmental control and life support systems

    NASA Technical Reports Server (NTRS)

    Wieland, Paul

    1994-01-01

    Human exploration and utilization of space requires habitats to provide appropriate conditions for working and living. These conditions are provided by environmental control and life support systems (ECLSS) that ensure appropriate atmosphere composition, pressure, and temperature; manage and distribute water, process waste matter, provide fire detection and suppression; and other functions as necessary. The functions that are performed by ECLSS are described and basic information necessary to design an ECLSS is provided. Technical and programmatic aspects of designing and developing ECLSS for space habitats are described including descriptions of technologies, analysis methods, test requirements, program organization, documentation requirements, and the requirements imposed by medical, mission, safety, and system needs. The design and development process is described from initial trade studies through system-level analyses to support operation. ECLSS needs for future space habitats are also described. Extensive listings of references and related works provide sources for more detailed information on each aspect of ECLSS design and development.

  12. Proteomic Analysis of the Arabidopsis Nucleolus Suggests Novel Nucleolar FunctionsD⃞

    PubMed Central

    Pendle, Alison F.; Clark, Gillian P.; Boon, Reinier; Lewandowska, Dominika; Lam, Yun Wah; Andersen, Jens; Mann, Matthias; Lamond, Angus I.; Brown, John W. S.; Shaw, Peter J.

    2005-01-01

    The eukaryotic nucleolus is involved in ribosome biogenesis and a wide range of other RNA metabolism and cellular functions. An important step in the functional analysis of the nucleolus is to determine the complement of proteins of this nuclear compartment. Here, we describe the first proteomic analysis of plant (Arabidopsis thaliana) nucleoli, in which we have identified 217 proteins. This allows a direct comparison of the proteomes of an important nuclear structure between two widely divergent species: human and Arabidopsis. The comparison identified many common proteins, plant-specific proteins, proteins of unknown function found in both proteomes, and proteins that were nucleolar in plants but nonnucleolar in human. Seventy-two proteins were expressed as GFP fusions and 87% showed nucleolar or nucleolar-associated localization. In a striking and unexpected finding, we have identified six components of the postsplicing exon-junction complex (EJC) involved in mRNA export and nonsense-mediated decay (NMD)/mRNA surveillance. This association was confirmed by GFP-fusion protein localization. These results raise the possibility that in plants, nucleoli may have additional functions in mRNA export or surveillance. PMID:15496452

  13. Impact of ontology evolution on functional analyses.

    PubMed

    Groß, Anika; Hartung, Michael; Prüfer, Kay; Kelso, Janet; Rahm, Erhard

    2012-10-15

    Ontologies are used in the annotation and analysis of biological data. As knowledge accumulates, ontologies and annotation undergo constant modifications to reflect this new knowledge. These modifications may influence the results of statistical applications such as functional enrichment analyses that describe experimental data in terms of ontological groupings. Here, we investigate to what degree modifications of the Gene Ontology (GO) impact these statistical analyses for both experimental and simulated data. The analysis is based on new measures for the stability of result sets and considers different ontology and annotation changes. Our results show that past changes in the GO are non-uniformly distributed over different branches of the ontology. Considering the semantic relatedness of significant categories in analysis results allows a more realistic stability assessment for functional enrichment studies. We observe that the results of term-enrichment analyses tend to be surprisingly stable despite changes in ontology and annotation.

  14. Pyrcca: Regularized Kernel Canonical Correlation Analysis in Python and Its Applications to Neuroimaging.

    PubMed

    Bilenko, Natalia Y; Gallant, Jack L

    2016-01-01

    In this article we introduce Pyrcca, an open-source Python package for performing canonical correlation analysis (CCA). CCA is a multivariate analysis method for identifying relationships between sets of variables. Pyrcca supports CCA with or without regularization, and with or without linear, polynomial, or Gaussian kernelization. We first use an abstract example to describe Pyrcca functionality. We then demonstrate how Pyrcca can be used to analyze neuroimaging data. Specifically, we use Pyrcca to implement cross-subject comparison in a natural movie functional magnetic resonance imaging (fMRI) experiment by finding a data-driven set of functional response patterns that are similar across individuals. We validate this cross-subject comparison method in Pyrcca by predicting responses to novel natural movies across subjects. Finally, we show how Pyrcca can reveal retinotopic organization in brain responses to natural movies without the need for an explicit model.
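
    For readers without the package at hand, a minimal linear CCA can be sketched directly in numpy via whitening and a singular value decomposition; this is not the Pyrcca implementation, and the ridge term and toy data are assumptions.

    ```python
    import numpy as np

    def cca(X, Y, n_components=2, reg=1e-6):
        """Minimal linear CCA via whitening + SVD (not the Pyrcca implementation).

        reg adds a small ridge term to the covariance matrices, mirroring the
        regularized variant described in the record.
        """
        X = X - X.mean(axis=0)
        Y = Y - Y.mean(axis=0)
        n = X.shape[0]
        Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
        Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
        Sxy = X.T @ Y / n

        def inv_sqrt(S):
            w, V = np.linalg.eigh(S)
            return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

        K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
        U, s, Vt = np.linalg.svd(K)
        Wx = inv_sqrt(Sxx) @ U[:, :n_components]     # canonical weights for X
        Wy = inv_sqrt(Syy) @ Vt[:n_components].T     # canonical weights for Y
        return Wx, Wy, s[:n_components]              # weights + canonical correlations

    # Toy example: two views sharing one latent signal, so the first canonical
    # correlation should be high.
    rng = np.random.default_rng(1)
    z = rng.standard_normal((500, 1))
    X = z @ rng.standard_normal((1, 5)) + 0.5 * rng.standard_normal((500, 5))
    Y = z @ rng.standard_normal((1, 4)) + 0.5 * rng.standard_normal((500, 4))
    print(cca(X, Y)[2])
    ```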

  15. Pyrcca: Regularized Kernel Canonical Correlation Analysis in Python and Its Applications to Neuroimaging

    PubMed Central

    Bilenko, Natalia Y.; Gallant, Jack L.

    2016-01-01

    In this article we introduce Pyrcca, an open-source Python package for performing canonical correlation analysis (CCA). CCA is a multivariate analysis method for identifying relationships between sets of variables. Pyrcca supports CCA with or without regularization, and with or without linear, polynomial, or Gaussian kernelization. We first use an abstract example to describe Pyrcca functionality. We then demonstrate how Pyrcca can be used to analyze neuroimaging data. Specifically, we use Pyrcca to implement cross-subject comparison in a natural movie functional magnetic resonance imaging (fMRI) experiment by finding a data-driven set of functional response patterns that are similar across individuals. We validate this cross-subject comparison method in Pyrcca by predicting responses to novel natural movies across subjects. Finally, we show how Pyrcca can reveal retinotopic organization in brain responses to natural movies without the need for an explicit model. PMID:27920675

  16. Morphological analysis of dendrites and spines by hybridization of ridge detection with twin support vector machine.

    PubMed

    Wang, Shuihua; Chen, Mengmeng; Li, Yang; Shao, Ying; Zhang, Yudong; Du, Sidan; Wu, Jane

    2016-01-01

    Dendritic spines are described as neuronal protrusions. The morphology of dendrites and dendritic spines is strongly related to their function and plays an important role in understanding brain function. Quantitative analysis of dendrites and dendritic spines is essential to an understanding of the formation and function of the nervous system. However, highly efficient tools for the quantitative analysis of dendrites and dendritic spines are currently lacking. In this paper we propose a novel three-step cascaded algorithm, RTSVM, which is composed of ridge detection as the curvature structure identifier for backbone extraction, boundary location based on differences in density, Hu moments as features, and Twin Support Vector Machine (TSVM) classifiers for spine classification. Our data demonstrate that this newly developed algorithm performs better than other available techniques in terms of detection accuracy and false alarm rate. This algorithm can be used effectively in neuroscience research.

  17. Perceived Self-Efficacy: A Concept Analysis for Symptom Management in Patients With Cancer.

    PubMed

    White, Lynn L; Cohen, Marlene Z; Berger, Ann M; Kupzyk, Kevin A; Swore-Fletcher, Barbara A; Bierman, Philip J

    2017-12-01

    Perceived self-efficacy (PSE) for symptom management plays a key role in outcomes for patients with cancer, such as quality of life, functional status, symptom distress, and healthcare use. Definition of the concept is necessary for use in research and to guide the development of interventions to facilitate PSE for symptom management in patients with cancer. This analysis will describe the concept of PSE for symptom management in patients with cancer. A database search was performed for related publications from 2006-2016. Landmark publications published prior to 2006 that informed the concept analysis were included. Greater PSE for symptom management predicts improved performance outcomes, including functional health status, cognitive function, and disease status. Clarification of the concept of PSE for symptom management will accelerate the progress of self-management research and allow for comparison of research data and intervention development.

  18. Management Auditing. Evaluation of the Marine Corps Task Analysis Program. Technical Report No. 5.

    ERIC Educational Resources Information Center

    Hemphill, John M., Jr.; Yoder, Dale

    The management audit is described for possible application as an extension of the mission of the Office of Manpower Utilization (OMU) of the U.S. Marine Corps. The present mission of OMU is viewed as a manpower research program to conduct task analysis of Marine Corps occupational fields. Purpose of the analyses is to improve the functional areas…

  19. Forest landscape analysis and design: a process for developing and implementing land management objectives for landscape patterns.

    Treesearch

    Nancy Diaz; Dean Apostol

    1992-01-01

    This publication presents a Landscape Design and Analysis Process, along with some simple methods and tools for describing landscapes and their function. The information is qualitative in nature and highlights basic concepts, but does not address landscape ecology in great depth. Readers are encouraged to consult the list of selected references in Chapter 2 if they...

  20. Instruments and techniques for the analysis of wheelchair propulsion and upper extremity involvement in patients with spinal cord injuries: current concept review

    PubMed Central

    Dellabiancia, Fabio; Porcellini, Giuseppe; Merolla, Giovanni

    2013-01-01

    Summary The correct functionality of the upper limbs is an essential condition for the autonomy of people with disabilities, especially for those in wheelchair. In this review we focused on the biomechanics of wheelchair propulsion and we described the instrumental analysis of techniques for the acquisition of wheelchair propulsion. PMID:24367774

  1. Mapping the Strategic Thinking of Public Relations Managers in a Crisis Situation: An Illustrative Example Using Conjoint Analysis.

    ERIC Educational Resources Information Center

    Bronn, Peggy Simcic; Olson, Erik L.

    1999-01-01

    Illustrates the operationalization of the conjoint analysis multivariate technique for the study of the public relations function within strategic decision making in a crisis situation. Finds that what the theory describes as the strategic way of handling a crisis is also the way each of the managers who were evaluated would prefer to conduct…

  2. Optical transfer function in corneal topography for clinical contrast sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Bende, Thomas; Jean, Benedikt J.; Oltrup, Theo

    2000-06-01

    Customized ablation aiming to optimize visual acuity in refractive surgery requires objective data on the corneal surface, such as contrast sensitivity. Fast ray tracing, using high-resolution 3-D elevation data in conjunction with Snell's law, describes the refraction of the incident rays and the resulting image on a 'virtual retina.' A retroprojection leads to a 'surface quality map.' For objective contrast sensitivity measurement, a sine (or cosine) wave of different frequencies is used for a calculated projection in analogy to the clinical contrast sensitivity charts. The projection on the individual corneal surface is analyzed for the Modulation Transfer Function (MTF) and the Phase Shift Function (PSF) as a function of frequency. The PSF, not yet clinically used, is a parameter to determine even minimal corneal tilt. The resulting corneal aberration map (CAM) as described here and applied to a 4.5 D PRK (OZD equals 6.5 mm) reveals that the area of minimal aberration measures only 4.2 mm. The CAM can likewise be used to describe the 'quality' of a laser system's ablation pattern based upon the area of minimal optical aberrations. The CAM only describes surface aberration with high resolution, an advantage over wavefront sensing, which measures all accumulated optical aberrations, including the changing ones of the lens during accommodation and the transient ones due to lens aging and early cataract formation.

  3. Group Process: A Systematic Analysis.

    ERIC Educational Resources Information Center

    Roark, Albert E.; Radl, Myrna C.

    1984-01-01

    Identifies components of group process and describes leader functions. Discusses personal elements, focus of interaction/psychological distance, group development, content, quality of interaction, and self-reflective/meaning attribution, illustrated by a case study of a group of persons (N=5) arrested for drunk driving. (JAC)

  4. Drosophila Melanogaster as an Experimental Organism.

    ERIC Educational Resources Information Center

    Rubin, Gerald M.

    1988-01-01

    Discusses the role of the fruit fly in genetics research requiring a multidisciplinary approach. Describes embryological and genetic methods used in the experimental analysis of this organism. Outlines the use of Drosophila in the study of the development and function of the nervous system. (RT)

  5. [Application of the elliptic fourier functions to the description of avian egg shape].

    PubMed

    Ávila, Dennis Denis

    2014-12-01

    Egg shape is difficult to quantify because there is no exact formula to describe its geometry. Here I describe a simple algorithm to characterize and compare egg shapes using Fourier functions. These functions can delineate any closed contour and have previously been applied to describe several biological objects. I describe, step by step, the process of data acquisition and processing and the use of the SHAPE software to extract function coefficients in a case study. I compared egg shapes in three bird species representing different reproductive strategies: Cuban Parakeet (Aratinga euops), Royal Tern (Thalasseus maximus) and Cuban Blackbird (Dives atroviolaceus). Using 73 digital pictures of eggs kept in Cuban scientific collections, I calculated Fourier descriptors with 4, 6, 8, 16 and 20 harmonics. Descriptors were reduced by Principal Component Analysis, and the scores of the components that accounted for 90% of the variance were used in a Linear Discriminant Function to assess whether eggs could be differentiated according to their shapes. Using four harmonics, the first five components accounted for 97% of the shape variance; more harmonics diluted the variance, increasing to eight the number of components needed to explain most of the variation. Convex polygons in the discriminant space showed a clear separation between species, allowing reliable discrimination (classification errors between 7-15%). Misclassifications were related to species-specific egg shape variability. In the case study, A. euops eggs were perfectly classified, but for the other species, misclassification errors ranged from 5 to 29%, depending on the number of harmonics and components used. The proposed algorithm, despite its apparent mathematical complexity, offers many advantages for describing egg shape and allows a deeper understanding of the factors related to this variable.
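
    A sketch of the elliptic Fourier descriptor computation underlying the approach described above (the Kuhl-Giardina formulation); the sampled outline is a toy ellipse rather than a digitized egg, and the resulting coefficients would feed the principal component and discriminant analyses described in the record.

    ```python
    import numpy as np

    def elliptic_fourier_coeffs(contour, n_harmonics=4):
        """Elliptic Fourier descriptors (Kuhl & Giardina) of a closed 2-D contour.

        contour -- array of shape (m, 2) of ordered boundary points.
        Returns an (n_harmonics, 4) array of (a_n, b_n, c_n, d_n).
        """
        d = np.diff(np.vstack([contour, contour[:1]]), axis=0)   # close the contour
        dt = np.hypot(d[:, 0], d[:, 1])
        t = np.concatenate([[0.0], np.cumsum(dt)])
        T = t[-1]
        coeffs = np.zeros((n_harmonics, 4))
        for n in range(1, n_harmonics + 1):
            const = T / (2.0 * n**2 * np.pi**2)
            phi = 2.0 * np.pi * n * t / T
            dcos = np.cos(phi[1:]) - np.cos(phi[:-1])
            dsin = np.sin(phi[1:]) - np.sin(phi[:-1])
            coeffs[n - 1] = const * np.array([
                np.sum(d[:, 0] / dt * dcos), np.sum(d[:, 0] / dt * dsin),
                np.sum(d[:, 1] / dt * dcos), np.sum(d[:, 1] / dt * dsin)])
        return coeffs

    # Toy "egg" outline: an ellipse sampled at 200 points; the first harmonic
    # should be approximately (4.5, 0, 0, 3.2).
    s = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
    outline = np.column_stack([4.5 * np.cos(s), 3.2 * np.sin(s)])
    print(elliptic_fourier_coeffs(outline, n_harmonics=4)[0])
    ```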

  6. Cognition and procedure representational requirements for predictive human performance models

    NASA Technical Reports Server (NTRS)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that described procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulative and executable representation of aircrew and procedures that is generally applicable to crew/procedure task-analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods including procedural backtracking with concurrent search, temporal reasoning, and constraint checking for partial ordering of procedures. Finally, the representation is being linked to models of human decision making processes that include heuristic, propositional and prescriptive judgement models that are sensitive to the procedural content in which the valuative functions are being performed.

  7. Intelligent neural network and fuzzy logic control of industrial and power systems

    NASA Astrophysics Data System (ADS)

    Kuljaca, Ognjen

    The main role played by neural network and fuzzy logic intelligent control algorithms today is to identify and compensate unknown nonlinear system dynamics. There are a number of methods developed, but often the stability analysis of neural network and fuzzy control systems was not provided. This work will meet those problems for the several algorithms. Some more complicated control algorithms included backstepping and adaptive critics will be designed. Nonlinear fuzzy control with nonadaptive fuzzy controllers is also analyzed. An experimental method for determining describing function of SISO fuzzy controller is given. The adaptive neural network tracking controller for an autonomous underwater vehicle is analyzed. A novel stability proof is provided. The implementation of the backstepping neural network controller for the coupled motor drives is described. Analysis and synthesis of adaptive critic neural network control is also provided in the work. Novel tuning laws for the system with action generating neural network and adaptive fuzzy critic are given. Stability proofs are derived for all those control methods. It is shown how these control algorithms and approaches can be used in practical engineering control. Stability proofs are given. Adaptive fuzzy logic control is analyzed. Simulation study is conducted to analyze the behavior of the adaptive fuzzy system on the different environment changes. A novel stability proof for adaptive fuzzy logic systems is given. Also, adaptive elastic fuzzy logic control architecture is described and analyzed. A novel membership function is used for elastic fuzzy logic system. The stability proof is proffered. Adaptive elastic fuzzy logic control is compared with the adaptive nonelastic fuzzy logic control. The work described in this dissertation serves as foundation on which analysis of particular representative industrial systems will be conducted. Also, it gives a good starting point for analysis of learning abilities of adaptive and neural network control systems, as well as for the analysis of the different algorithms such as elastic fuzzy systems.

  8. The use of dwell time cross-correlation functions to study single-ion channel gating kinetics.

    PubMed Central

    Ball, F G; Kerry, C J; Ramsey, R L; Sansom, M S; Usherwood, P N

    1988-01-01

    The derivation of cross-correlation functions from single-channel dwell (open and closed) times is described. Simulation of single-channel data for simple gating models, alongside theoretical treatment, is used to demonstrate the relationship of cross-correlation functions to underlying gating mechanisms. It is shown that time irreversibility of gating kinetics may be revealed in cross-correlation functions. Application of cross-correlation function analysis to data derived from the locust muscle glutamate receptor-channel provides evidence for multiple gateway states and time reversibility of gating. A model for the gating of this channel is used to show the effect of omission of brief channel events on cross-correlation functions. PMID:2462924
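
    One plausible formulation of a dwell-time cross-correlation function is sketched below: the correlation between each open time and the k-th following shut time, evaluated over a range of lags. The simulated dwell times and the Pearson normalization are assumptions, not the estimator used in the paper.

    ```python
    import numpy as np

    def dwell_time_cross_correlation(open_times, shut_times, max_lag=5):
        """Correlation between an open dwell time and the k-th following shut
        dwell time, for lags k = 0..max_lag (one plausible formulation;
        normalization conventions vary)."""
        open_times = np.asarray(open_times, dtype=float)
        shut_times = np.asarray(shut_times, dtype=float)
        n = min(len(open_times), len(shut_times))
        out = []
        for k in range(max_lag + 1):
            o = open_times[: n - k]
            s = shut_times[k:n]
            out.append(np.corrcoef(o, s)[0, 1])
        return np.asarray(out)

    # Simulated dwell times with a built-in negative open/shut correlation at lag 0
    rng = np.random.default_rng(2)
    mode = rng.random(5000) < 0.5          # hidden "gateway" state per event pair
    opens = rng.exponential(np.where(mode, 1.0, 5.0))
    shuts = rng.exponential(np.where(mode, 5.0, 1.0))
    print(dwell_time_cross_correlation(opens, shuts))
    ```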

  9. Preparation of fosmid libraries and functional metagenomic analysis of microbial community DNA.

    PubMed

    Martínez, Asunción; Osburne, Marcia S

    2013-01-01

    One of the most important challenges in contemporary microbial ecology is to assign a functional role to the large number of novel genes discovered through large-scale sequencing of natural microbial communities that lack similarity to genes of known function. Functional screening of metagenomic libraries, that is, screening environmental DNA clones for the ability to confer an activity of interest to a heterologous bacterial host, is a promising approach for bridging the gap between metagenomic DNA sequencing and functional characterization. Here, we describe methods for isolating environmental DNA and constructing metagenomic fosmid libraries, as well as methods for designing and implementing successful functional screens of such libraries. © 2013 Elsevier Inc. All rights reserved.

  10. Multivariate meta-analysis for non-linear and other multi-parameter associations

    PubMed Central

    Gasparrini, A; Armstrong, B; Kenward, M G

    2012-01-01

    In this paper, we formalize the application of multivariate meta-analysis and meta-regression to synthesize estimates of multi-parameter associations obtained from different studies. This modelling approach extends the standard two-stage analysis used to combine results across different sub-groups or populations. The most straightforward application is for the meta-analysis of non-linear relationships, described for example by regression coefficients of splines or other functions, but the methodology easily generalizes to any setting where complex associations are described by multiple correlated parameters. The modelling framework of multivariate meta-analysis is implemented in the package mvmeta within the statistical environment R. As an illustrative example, we propose a two-stage analysis for investigating the non-linear exposure–response relationship between temperature and non-accidental mortality using time-series data from multiple cities. Multivariate meta-analysis represents a useful analytical tool for studying complex associations through a two-stage procedure. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22807043
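
    The paper implements random-effects multivariate meta-analysis in the R package mvmeta; as a hedged sketch of the underlying two-stage idea only, the following Python code pools per-study coefficient vectors (for example, spline coefficients) by fixed-effect generalized least squares, ignoring between-study heterogeneity.

    ```python
    import numpy as np

    def fixed_effect_mvmeta(betas, covs):
        """Fixed-effect multivariate meta-analysis by generalized least squares.

        betas: list of per-study coefficient vectors (e.g., spline coefficients).
        covs:  list of their estimated covariance matrices.
        Returns the pooled coefficients and their covariance. Random-effects
        extensions (as in the mvmeta package) are not reproduced here.
        """
        k = len(betas[0])
        precision_sum = np.zeros((k, k))
        weighted_sum = np.zeros(k)
        for b, S in zip(betas, covs):
            W = np.linalg.inv(S)            # study weight = inverse covariance
            precision_sum += W
            weighted_sum += W @ b
        pooled_cov = np.linalg.inv(precision_sum)
        pooled_beta = pooled_cov @ weighted_sum
        return pooled_beta, pooled_cov

    # Toy example: two studies, two coefficients each.
    betas = [np.array([0.10, 0.30]), np.array([0.20, 0.25])]
    covs = [np.diag([0.01, 0.02]), np.diag([0.02, 0.01])]
    print(fixed_effect_mvmeta(betas, covs))
    ```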

  11. Integration of a satellite ground support system based on analysis of the satellite ground support domain

    NASA Technical Reports Server (NTRS)

    Pendley, R. D.; Scheidker, E. J.; Levitt, D. S.; Myers, C. R.; Werking, R. D.

    1994-01-01

    This analysis defines a complete set of ground support functions based on those practiced in real space flight operations during the on-orbit phase of a mission. These functions are mapped against ground support functions currently in use by NASA and DOD. Software components to provide these functions can be hosted on RISC-based work stations and integrated to provide a modular, integrated ground support system. Such modular systems can be configured to provide as much ground support functionality as desired. This approach to ground systems has been widely proposed and prototyped both by government institutions and commercial vendors. The combined set of ground support functions we describe can be used as a standard to evaluate candidate ground systems. This approach has also been used to develop a prototype of a modular, loosely-integrated ground support system, which is discussed briefly. A crucial benefit to a potential user is that all the components are flight-qualified, thus giving high confidence in their accuracy and reliability.

  12. Integration of a satellite ground support system based on analysis of the satellite ground support domain

    NASA Astrophysics Data System (ADS)

    Pendley, R. D.; Scheidker, E. J.; Levitt, D. S.; Myers, C. R.; Werking, R. D.

    1994-11-01

    This analysis defines a complete set of ground support functions based on those practiced in real space flight operations during the on-orbit phase of a mission. These functions are mapped against ground support functions currently in use by NASA and DOD. Software components to provide these functions can be hosted on RISC-based work stations and integrated to provide a modular, integrated ground support system. Such modular systems can be configured to provide as much ground support functionality as desired. This approach to ground systems has been widely proposed and prototyped both by government institutions and commercial vendors. The combined set of ground support functions we describe can be used as a standard to evaluate candidate ground systems. This approach has also been used to develop a prototype of a modular, loosely-integrated ground support system, which is discussed briefly. A crucial benefit to a potential user is that all the components are flight-qualified, thus giving high confidence in their accuracy and reliability.

  13. Organizational Perspectives on Rapid Response Team Structure, Function, and Cost: A Qualitative Study.

    PubMed

    Smith, Patricia L; McSweeney, Jean

    Understanding how an organization determines structure and function of a rapid response team (RRT), as well as cost evaluation and implications, can provide foundational knowledge to guide decisions about RRTs. The objectives were to (1) identify influencing factors in organizational development of RRT structure and function and (2) describe evaluation of RRT costs. Using a qualitative, ethnographic design, nurse executives and experts in 15 moderate-size hospitals were interviewed to explore their decision-making processes in determining RRT structure and function. Face-to-face interviews were audio recorded, transcribed verbatim, and verified for accuracy. Using content analysis and constant comparison, interview data were analyzed. Demographic data were analyzed using descriptive statistics. The sample included 27 participants from 15 hospitals in 5 south-central states. They described a variety of RRT responders and functions, with the majority of hospitals having a critical care charge nurse attending all RRT calls for assistance. Others described a designated RRT nurse with primary RRT duties as responder to all RRT calls. Themes of RRT development from the data included influencers, decision processes, and thoughts about cost. It is important to understand how hospitals determine optimal structure and function to enhance support of quality nursing care. Determining the impact of an RRT on costs and benefits is vital in balancing patient safety and limited resources. Future research should focus on clarifying differences between team structure and function in outcomes as well as the most effective means to estimate costs and benefits.

  14. Using the Saccharomyces Genome Database (SGD) for analysis of genomic information

    PubMed Central

    Skrzypek, Marek S.; Hirschman, Jodi

    2011-01-01

    Analysis of genomic data requires access to software tools that place the sequence-derived information in the context of biology. The Saccharomyces Genome Database (SGD) integrates functional information about budding yeast genes and their products with a set of analysis tools that facilitate exploring their biological details. This unit describes how the various types of functional data available at SGD can be searched, retrieved, and analyzed. Starting with the guided tour of the SGD Home page and Locus Summary page, this unit highlights how to retrieve data using YeastMine, how to visualize genomic information with GBrowse, how to explore gene expression patterns with SPELL, and how to use Gene Ontology tools to characterize large-scale datasets. PMID:21901739

  15. The function of prehistoric lithic tools: a combined study of use-wear analysis and FTIR microspectroscopy.

    PubMed

    Nunziante Cesaro, Stella; Lemorini, Cristina

    2012-02-01

    The combined application of use-wear analysis and FTIR microspectroscopy to flint and obsidian tools from the archaeological sites of Masseria Candelaro (Foggia, Italy) and Sant'Anna di Oria (Brindisi, Italy), aimed at clarifying their functional use, is described. A very high percentage of the tools excavated at the former site showed spectroscopically detectable residues on their working edges. The identification of micro-deposits is based on comparison with a large number of replicas studied under the same experimental conditions. In almost all cases the FTIR data confirmed the interpretations from use-wear analysis and added details about the material processed and the working procedures. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. TraceContract: A Scala DSL for Trace Analysis

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Havelund, Klaus

    2011-01-01

    In this paper we describe TRACECONTRACT, an API for trace analysis, implemented in the SCALA programming language. We argue that for certain forms of trace analysis the best weapon is a high level programming language augmented with constructs for temporal reasoning. A trace is a sequence of events, which may for example be generated by a running program, instrumented appropriately to generate events. The API supports writing properties in a notation that combines an advanced form of data parameterized state machines with temporal logic. The implementation utilizes SCALA's support for defining internal Domain Specific Languages (DSLs). Furthermore SCALA's combination of object oriented and functional programming features, including partial functions and pattern matching, makes it an ideal host language for such an API.
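
    TRACECONTRACT itself is a Scala API; purely to illustrate the kind of data-parameterized trace property it targets, the following Python analogue checks "every open(f) is eventually followed by close(f)" over an event trace. The event names and checker structure are assumptions and do not reflect the TRACECONTRACT notation.

    ```python
    # Python analogue (not the Scala TraceContract API) of checking a simple
    # data-parameterized temporal property over an event trace:
    # "every open(f) must eventually be followed by close(f)".

    def check_open_close(trace):
        open_files = set()
        errors = []
        for i, (event, f) in enumerate(trace):
            if event == "open":
                open_files.add(f)
            elif event == "close":
                if f not in open_files:
                    errors.append(f"event {i}: close({f}) without matching open")
                open_files.discard(f)
        errors.extend(f"end of trace: open({f}) never closed" for f in sorted(open_files))
        return errors

    trace = [("open", "a.txt"), ("open", "b.txt"), ("close", "a.txt")]
    print(check_open_close(trace))   # reports that b.txt was never closed
    ```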

  17. Comparison of analysis and flight test data for a drone aircraft with active flutter suppression

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Pototzky, A. S.

    1981-01-01

    A drone aircraft equipped with an active flutter suppression system is considered with emphasis on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are given for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. The mathematical models are included and existing analytical techniques are described as well as an alternative analytical technique for obtaining closed-loop results.

  18. A user`s guide to LUGSAN II. A computer program to calculate and archive lug and sway brace loads for aircraft-carried stores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, W.N.

    1998-03-01

    LUG and Sway brace ANalysis (LUGSAN) II is an analysis and database computer program that is designed to calculate store lug and sway brace loads for aircraft captive carriage. LUGSAN II combines the rigid body dynamics code, SWAY85, with a Macintosh Hypercard database to function both as an analysis and archival system. This report describes the LUGSAN II application program, which operates on the Macintosh System (Hypercard 2.2 or later) and includes function descriptions, layout examples, and sample sessions. Although this report is primarily a user`s manual, a brief overview of the LUGSAN II computer code is included with suggestedmore » resources for programmers.« less

  19. A stowing and deployment strategy for large membrane space systems on the example of Gossamer-1

    NASA Astrophysics Data System (ADS)

    Seefeldt, Patric

    2017-09-01

    Deployment systems for innovative space applications such as solar sails require a technique for controlled and autonomous deployment in space. The deployment process has a strong impact on the design and sizing of the mechanisms and structure. Using the example of the design implemented in the Gossamer-1 project of the German Aerospace Center (DLR), such a stowing and deployment process is analyzed. It is based on a combination of zig-zag folding and coiling of triangular sail segments spanned between crossed booms. The deployment geometry and the forces introduced by the mechanism are explored in order to reveal how the loads are transferred through the membranes to structural components such as the booms. The folding geometry and force progressions are described by compositions of an inverse trigonometric function with the corresponding trigonometric function itself. If these compositions are evaluated over several periods of the trigonometric function, a non-smooth oscillating curve results; depending on the trigonometric function, such curves are often vividly described as zig-zag or sawtooth functions. The developed functions are applied to the Gossamer-1 design. The deployment geometry shows a tendency for the loads to be transferred along the catheti of the sail segments and therefore mainly along the boom axes. The load introduced by the spool deployment mechanism is described, and by combining the deployment geometry with that load, a prediction of the deployment load progression is obtained. The mathematical description of the stowing and deployment geometry, as well as of the forces imposed by the mechanism, provides an understanding of how exactly the membrane deploys and through which edges the deployment forces are transferred. The mathematical analysis also gives an impression of sensitive parameters that could be influenced by manufacturing tolerances or unsymmetrical deployment of the sail segments. While the mathematical model was applied to the design of the Gossamer-1 hardware, it allows the analysis of other geometries. This is of particular interest because Gossamer-1 investigated deployment technology on a relatively small scale of 5 m × 5 m, while currently considered solar sail missions require sails that are about one order of magnitude larger.
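
    The kind of function composition referred to above can be illustrated numerically: composing an inverse trigonometric function with the trigonometric function itself over several periods produces a non-smooth, piecewise-linear curve. The short sketch below (plain numpy, not the Gossamer-1 load equations) shows the resulting triangle wave and its alternating slope.

    ```python
    import numpy as np

    # Composing an inverse trigonometric function with the trigonometric function
    # itself over several periods yields a non-smooth, piecewise-linear curve:
    # arcsin(sin(x)) is a triangle ("zig-zag") wave.
    x = np.linspace(0.0, 4.0 * np.pi, 1001)
    triangle = np.arcsin(np.sin(x))    # oscillates linearly between -pi/2 and +pi/2
    print(round(triangle.min(), 3), round(triangle.max(), 3))    # ~ -1.571, 1.571
    # The slope alternates between +1 and -1, i.e., the derivative is discontinuous:
    print(np.unique(np.round(np.diff(triangle) / np.diff(x))))   # -> [-1.  1.]
    ```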

  20. Diagnosing and improving functioning in interdisciplinary health care teams.

    PubMed

    Blackmore, Gail; Persaud, D David

    2012-01-01

    Interdisciplinary teams play a key role in the delivery of health care. Team functioning can positively or negatively impact the effective and efficient delivery of health care services as well as the personal well-being of group members. Additionally, teams must be able and willing to work together to achieve team goals within a climate that reflects commitment to team goals, accountability, respect, and trust. Not surprisingly, dysfunctional team functioning can limit the success of interdisciplinary health care teams. The first step in improving dysfunctional team function is to conduct an analysis based on criteria necessary for team success, and this article provides meaningful criteria for doing such an analysis. These are the following: a common team goal, the ability and willingness to work together to achieve team goals, decision making, communication, and team member relationships. High-functioning interdisciplinary teams must exhibit features of good team function in all key domains. If a team functions well in some domains and needs to improve in others, targeted strategies are described that can be used to improve team functioning.

  1. Dissociation and recombination of positive holes in minerals

    NASA Technical Reports Server (NTRS)

    Freund, Friedemann; Batllo, Francois; Freund, Minoru M.

    1990-01-01

    The formation mechanisms of positive holes - electronic defects in the oxygen sublattice - are described, with attention given to detecting the positive surface charge of minerals carrying these holes. Charge distribution analysis (CDA), which measures dielectric polarization in an inhomogeneous field, is presented. CDA can be applied to the detection of the peroxide/superoxide functionality caused by positive holes on the surface. It is demonstrated with obsidian that the measurements provide data on O(-) mobility as a function of surface-charge carrier density and on O(-) generation as a function of temperature.

  2. Differences between quadratic equations and functions: Indonesian pre-service secondary mathematics teachers’ views

    NASA Astrophysics Data System (ADS)

    Aziz, T. A.; Pramudiani, P.; Purnomo, Y. W.

    2018-01-01

    The difference between quadratic equations and quadratic functions as perceived by Indonesian pre-service secondary mathematics teachers (N = 55) enrolled at a private university in Jakarta was investigated. Analyses of participants’ written responses and interviews were conducted consecutively. Participants’ written responses highlighted differences between quadratic equations and functions by referring to their general terms, main characteristics, processes, and geometrical aspects. However, they showed several obstacles in describing the differences, such as inappropriate constraints and improper interpretations. Implications of the study are discussed.

  3. Taxonomic and functional patterns of macrobenthic communities on a high-Arctic shelf: A case study from the Laptev Sea

    NASA Astrophysics Data System (ADS)

    Kokarev, V. N.; Vedenin, A. A.; Basin, A. B.; Azovsky, A. I.

    2017-11-01

    Studies of the functional structure of high-Arctic ecosystems are scarce. We used data on benthic macrofauna from a 500-km latitudinal transect in the eastern Laptev Sea, from the Lena delta to the continental shelf break, to describe spatial patterns in species composition and in taxonomic and functional structure in relation to environmental factors. Both the taxonomy-based approach and Biological Traits Analysis (BTA) yielded similar results and showed a general depth-related gradient in benthic diversity and composition. This congruence between the taxonomic and functional dimensions of community organization suggests that the same environmental factors (primarily riverine input and the regime of sedimentation) have a similar effect on both community structure and functioning. BTA also revealed a distinct functional structure at the stations situated in the eastern Lena valley, with dominance of motile, burrowing sub-surface deposit-feeders and an absence of sedentary tube-dwelling forms. The overall spatial distribution of benthic assemblages corresponds well to that described there in preceding decades, evidencing the long-term stability of the bottom ecosystem. The strong linear relationship between species and trait diversity, however, indicates low functional redundancy, which potentially makes the ecosystem susceptible to species loss or structural shifts.

  4. Functions and requirements document for interim store solidified high-level and transuranic waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith-Fewell, M.A., Westinghouse Hanford

    1996-05-17

    The functions, requirements, interfaces, and architectures contained within the Functions and Requirements (F&R) Document are based on the information currently contained within the TWRS Functions and Requirements database. The database also documents the set of technically defensible functions and requirements associated with the solidified waste interim storage mission. The F&R Document provides a snapshot in time of the technical baseline for the project. The F&R Document is the product of functional analysis, requirements allocation and architectural structure definition. The technical baseline described in this document is traceable to the TWRS function 4.2.4.1, Interim Store Solidified Waste, and its related requirements, architecture, and interfaces.

  5. A draft annotation and overview of the human genome

    PubMed Central

    Wright, Fred A; Lemon, William J; Zhao, Wei D; Sears, Russell; Zhuo, Degen; Wang, Jian-Ping; Yang, Hee-Yung; Baer, Troy; Stredney, Don; Spitzner, Joe; Stutz, Al; Krahe, Ralf; Yuan, Bo

    2001-01-01

    Background The recent draft assembly of the human genome provides a unified basis for describing genomic structure and function. The draft is sufficiently accurate to provide useful annotation, enabling direct observations of previously inferred biological phenomena. Results We report here a functionally annotated human gene index placed directly on the genome. The index is based on the integration of public transcript, protein, and mapping information, supplemented with computational prediction. We describe numerous global features of the genome and examine the relationship of various genetic maps with the assembly. In addition, initial sequence analysis reveals highly ordered chromosomal landscapes associated with paralogous gene clusters and distinct functional compartments. Finally, these annotation data were synthesized to produce observations of gene density and number that accord well with historical estimates. Such a global approach had previously been described only for chromosomes 21 and 22, which together account for 2.2% of the genome. Conclusions We estimate that the genome contains 65,000-75,000 transcriptional units, with exon sequences comprising 4%. The creation of a comprehensive gene index requires the synthesis of all available computational and experimental evidence. PMID:11516338

  6. Spatial memory tasks in rodents: what do they model?

    PubMed

    Morellini, Fabio

    2013-10-01

    The analysis of spatial learning and memory in rodents is commonly used to investigate the mechanisms underlying certain forms of human cognition and to model their dysfunction in neuropsychiatric and neurodegenerative diseases. Proper interpretation of rodent behavior in terms of spatial memory and as a model of human cognitive functions is only possible if various navigation strategies and factors controlling the performance of the animal in a spatial task are taken into consideration. The aim of this review is to describe the experimental approaches that are being used for the study of spatial memory in rats and mice and the way that they can be interpreted in terms of general memory functions. After an introduction to the classification of memory into various categories and respective underlying neuroanatomical substrates, I explain the concept of spatial memory and its measurement in rats and mice by analysis of their navigation strategies. Subsequently, I describe the most common paradigms for spatial memory assessment with specific focus on methodological issues relevant for the correct interpretation of the results in terms of cognitive function. Finally, I present recent advances in the use of spatial memory tasks to investigate episodic-like memory in mice.

  7. Step 1: Human System Integration (HSI) FY05 Pilot-Technology Interface Requirements for Contingency Management

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document defines technology interface requirements for Contingency Management. The definition was developed through a review of Contingency Management-related HSI requirements documents, standards, and recommended practices, and technology concepts in use by the Contingency Management Work Package were considered. Beginning with the HSI high-level functional requirements for Contingency Management and the Contingency Management technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of system failures and associated contingency procedures, and (2) the control capability needed by the pilot to obtain system status and procedure information. Fundamentally, these requirements provide the candidate Contingency Management technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Contingency Management operations and functions should interface with the pilot to provide the necessary Contingency Management functionality to the UA-pilot system. Requirements and guidelines are partitioned into two categories: (1) Health and Status, and (2) Contingency Management. Each requirement is stated and is supported with a rationale and associated reference(s).

  8. Step 1: Human System Integration (HSI) FY05 Pilot-Technology Interface Requirements for Collision Avoidance

    NASA Technical Reports Server (NTRS)

    2007-01-01

    This document defines technology human interface requirements for Collision Avoidance (CA). The definition was developed through a review of CA-related HSI requirements documents, standards, and recommended practices, and technology concepts in use by the Access 5 CA work package were considered. Beginning with the HSI high-level functional requirement for CA and the CA technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of CA system status, and (2) the control capability needed by the pilot to obtain CA information and effect an avoidance maneuver. Fundamentally, these requirements provide the candidate CA technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how CA operations and functions should interface with the pilot to provide the necessary CA functionality to the UA-pilot system. Requirements and guidelines for CA are partitioned into four categories: (1) General, (2) Alerting, (3) Guidance, and (4) Cockpit Display of Traffic Information. Each requirement is stated and is supported with a rationale and associated reference(s).

  9. Functional metabolomics as a tool to analyze Mediator function and structure in plants.

    PubMed

    Davoine, Celine; Abreu, Ilka N; Khajeh, Khalil; Blomberg, Jeanette; Kidd, Brendan N; Kazan, Kemal; Schenk, Peer M; Gerber, Lorenz; Nilsson, Ove; Moritz, Thomas; Björklund, Stefan

    2017-01-01

    Mediator is a multiprotein transcriptional co-regulator complex composed of four modules: Head, Middle, Tail, and Kinase. It conveys signals from promoter-bound transcriptional regulators to RNA polymerase II and thus plays an essential role in eukaryotic gene regulation. We describe subunit localization and activities of Mediator in Arabidopsis through metabolome and transcriptome analyses from a set of Mediator mutants. Functional metabolomic analysis based on the metabolite profiles of Mediator mutants using multivariate statistical analysis and heat-map visualization shows that different subunit mutants display distinct metabolite profiles, which cluster according to the reported localization of the corresponding subunits in yeast. Based on these results, we suggest localization of previously unassigned plant Mediator subunits to specific modules. We also describe novel roles for individual subunits in development, and demonstrate changes in gene expression patterns and specific metabolite levels in med18 and med25, which can explain their phenotypes. We find that med18 displays levels of phytoalexins normally found in wild type plants only after exposure to pathogens. Our results indicate that different Mediator subunits are involved in specific signaling pathways that control developmental processes and tolerance to pathogen infections.

  10. Step 1: Human System Integration Pilot-Technology Interface Requirements for Weather Management

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document defines technology interface requirements for Hazardous Weather Avoidance. Technology concepts in use by the Access 5 Weather Management Work Package were considered. Beginning with the Human System Integration (HSI) high-level functional requirement for Hazardous Weather Avoidance, and Hazardous Weather Avoidance technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of hazardous weather, and (2) the control capability needed by the pilot to obtain hazardous weather information. Fundamentally, these requirements provide the candidate Hazardous Weather Avoidance technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Hazardous Weather Avoidance operations and functions should interface with the pilot to provide the necessary Weather Management functionality to the UA-pilot system. Requirements and guidelines for Hazardous Weather Avoidance are partitioned into four categories: (1) Planning En Route, (2) Encountering Hazardous Weather En Route, (3) Planning to Destination, and (4) Diversion Planning Alternate Airport. Each requirement is stated and is supported with a rationale and associated reference(s).

  11. Analysis of Mission Effectiveness: Modern System Architecture Tools for Project Developers

    DTIC Science & Technology

    2017-12-01

    operator input and scripted instructions to describe low-level flow. Note that the case study in Chapter IV describes one pass through evaluation ... capability of the sensors. A constraint on the case study is that each sensor type must cover the entire operations area. Cost is a function of ... This case study focuses on the first recursive refinement phase completed in a multi-phase effort to demonstrate the effects

  12. Simulation of multiple scattering in a medium with an anisotropic scattering pattern

    NASA Astrophysics Data System (ADS)

    Kuzmin, V. L.; Val'kov, A. Yu.

    2017-03-01

    Multiple backscattering from layers with various thicknesses, including the case of half-space, is numerically simulated and a comparative analysis is performed for systems with the anisotropy of scattering described by the Henyey-Greenstein and Rayleigh-Gans phase functions. It is shown that the intensity of backscattering depends on the form of the phase function; the difference between the intensities obtained within the two models increases with anisotropy.
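
    For reference, the Henyey-Greenstein model mentioned above has a closed-form phase function and a standard inverse-CDF sampling rule for the scattering-angle cosine, sketched below in Python. The anisotropy value is arbitrary, and the Rayleigh-Gans phase function and the layer geometry of the simulations are not reproduced here.

    ```python
    import numpy as np

    def hg_phase(cos_theta, g):
        """Henyey-Greenstein phase function, normalized over solid angle."""
        return (1.0 - g**2) / (4.0 * np.pi * (1.0 + g**2 - 2.0 * g * cos_theta) ** 1.5)

    def sample_hg_cos_theta(g, rng, size):
        """Standard inverse-CDF sampling of the HG scattering-angle cosine."""
        xi = rng.random(size)
        if abs(g) < 1e-6:                        # isotropic limit
            return 2.0 * xi - 1.0
        frac = (1.0 - g**2) / (1.0 - g + 2.0 * g * xi)
        return (1.0 + g**2 - frac**2) / (2.0 * g)

    rng = np.random.default_rng(1)
    g = 0.8                                      # arbitrary anisotropy parameter
    mu = sample_hg_cos_theta(g, rng, 200_000)
    print(mu.mean())                             # mean cosine should be close to g
    ```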

  13. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    PubMed

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg.

  14. Pulmonary function and adverse cardiovascular outcomes: Can cardiac function explain the link?

    PubMed

    Burroughs Peña, Melissa S; Dunning, Allison; Schulte, Phillip J; Durheim, Michael T; Kussin, Peter; Checkley, William; Velazquez, Eric J

    2016-12-01

    The complex interaction between pulmonary function, cardiac function and adverse cardiovascular events has only been partially described. We sought to describe the association of pulmonary function with left heart structure and function, all-cause mortality and incident cardiovascular hospitalization. This study is a retrospective analysis of patients evaluated in a single tertiary care medical center. We used multivariable linear regression analyses to examine the relationships of FVC and FEV1 with left ventricular ejection fraction (LVEF), left ventricular internal dimension in systole and diastole (LVIDS, LVIDD) and left atrial diameter, adjusting for baseline characteristics, right ventricular function and lung hyperinflation. We also used Cox proportional hazards models to examine the relationships of FVC and FEV1 with all-cause mortality and cardiac hospitalization. A total of 1807 patients were included in this analysis with a median age of 61 years and 50% were female. Decreased FVC and FEV1 were both associated with decreased LVEF. In individuals with FVC less than 2.75 L, decreased FVC was associated with increased all-cause mortality after adjusting for left and right heart echocardiographic variables (hazard ratio [HR] 0.49, 95% CI 0.29, 0.82). Decreased FVC was associated with increased cardiac hospitalization after adjusting for left heart size (HR 0.80, 95% CI 0.67, 0.96), even in patients with normal LVEF (HR 0.75, 95% CI 0.57, 0.97). In a tertiary care center reduced pulmonary function was associated with adverse cardiovascular events, a relationship that is not fully explained by left heart remodeling or right heart dysfunction. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Quantitative Protein Localization Signatures Reveal an Association between Spatial and Functional Divergences of Proteins

    PubMed Central

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-01-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg. PMID:24603469

  16. Functional connectivity decreases in autism in emotion, self, and face circuits identified by Knowledge-based Enrichment Analysis.

    PubMed

    Cheng, Wei; Rolls, Edmund T; Zhang, Jie; Sheng, Wenbo; Ma, Liang; Wan, Lin; Luo, Qiang; Feng, Jianfeng

    2017-03-01

    A powerful new method is described called Knowledge based functional connectivity Enrichment Analysis (KEA) for interpreting resting state functional connectivity, using circuits that are functionally identified using search terms with the Neurosynth database. The method derives its power by focusing on neural circuits, sets of brain regions that share a common biological function, instead of trying to interpret single functional connectivity links. This provides a novel way of investigating how task- or function-related networks have resting state functional connectivity differences in different psychiatric states, provides a new way to bridge the gap between task and resting-state functional networks, and potentially helps to identify brain networks that might be treated. The method was applied to interpreting functional connectivity differences in autism. Functional connectivity decreases at the network circuit level in 394 patients with autism compared with 473 controls were found in networks involving the orbitofrontal cortex, anterior cingulate cortex, middle temporal gyrus cortex, and the precuneus, in networks that are implicated in the sense of self, face processing, and theory of mind. The decreases were correlated with symptom severity. Copyright © 2017. Published by Elsevier Inc.

  17. Wheel Unloading of Rail Vehicles Due to Track Twist

    DOT National Transportation Integrated Search

    1986-02-01

    An analysis is presented describing the effect that track twist has on the loads carried by the wheels of a rail car. Wheel unloading is determined as a function of the difference in crosslevel between the truck centers of the car. The different vehi...

  18. Computer-Assisted Microscopy in Science Teaching and Research.

    ERIC Educational Resources Information Center

    Radice, Gary P.

    1997-01-01

    Describes a technological approach to teaching the relationships between biological form and function. Computer-assisted image analysis was integrated into a microanatomy course. Students spend less time memorizing and more time observing, measuring, and interpreting, building technical and analytical skills. Appendices list hardware and software…

  19. User's guide to SILVAH: stand analysis, prescription, and management simulator program for hardwood stands of the Alleghenies.

    Treesearch

    David A. Marquis; Richard L. Ernst

    1992-01-01

    Describes the purpose and function of the SILVAH computer program in general terms; provides detailed instructions on use of the program; and provides information on program organization, data formats, and the basis of processing algorithms.

  20. The Fred S. Keller School.

    ERIC Educational Resources Information Center

    Twyman, Janet S.

    1998-01-01

    Describes the Fred S. Keller School, one of several schools operating as a Comprehensive Application of Behavior Analysis to Schooling Program. The school functions as a cybernetic system of education in which the individualized instruction of each student influences the behavior of the entire education community. (CR)

  1. How Much Security Does Your Library Need?

    ERIC Educational Resources Information Center

    Banerjee, Kyle

    2003-01-01

    Explains how to keep library systems healthy and functioning by taking sensible security measures. Examines why hackers would target library systems and how library systems are compromised. Describes tools that can help, including: firewalls; antivirus software; alarms; network analysis tools; and encryption. Identifies several strategies for…

  2. Studying the Therapeutic Process by Observing Clinicians' In-Session Behaviour.

    PubMed

    Montaño-Fidalgo, Montserrat; Ruiz, Elena M; Calero-Elvira, Ana; Froján-Parga, María Xesús

    2015-01-01

    This paper presents a further step in the use and validation of a systematic, functional-analytic method of describing psychologists' verbal behaviour during therapy. We observed recordings from 92 clinical sessions of 19 adults (14 women and 5 men of Caucasian origin, with ages ranging from 19 to 51 years) treated by nine cognitive-behavioural therapists (eight women and one man, Caucasian as well, with ages ranging from 25 to 48 years). The therapists' verbal behaviour was codified and then classified according to its possible functionality. A cluster analysis of the data, followed by a discriminant analysis, showed that the therapists' verbal behaviour tended to aggregate around four types of session differentiated by their clinical objective (assessment, explanation, treatment and consolidation). These results confirm the validity of our method and enable us to further describe clinical phenomena by distinguishing psychologists' classes of clinically relevant activities. Specific learning mechanisms may be responsible for clinical change within each class. These issues should be analysed more closely when explaining therapeutic phenomena and when developing more effective forms of clinical intervention. We described therapists' verbal behaviour in a focused fashion so as to develop new research methods that evaluate psychological work moment by moment. We performed a cluster analysis in order to evaluate how the therapists' verbal behaviour was distributed throughout the intervention. A discriminant analysis gave us further information about the statistical significance and possible nature of the clusters we observed. The therapists' verbal behaviour depended on current clinical objectives and could be classified into four classes of clinically relevant activities: evaluation, explanation, treatment and consolidation. Some of the therapist's verbalizations were more important than others when carrying out these clinically relevant activities. The distribution of the therapists' verbal behaviour across classes may provide us with clues regarding the functionality of their in-session verbal behaviour. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Novel Analytic Methods Needed for Real-Time Continuous Core Body Temperature Data

    PubMed Central

    Hertzberg, Vicki; Mac, Valerie; Elon, Lisa; Mutic, Nathan; Mutic, Abby; Peterman, Katherine; Tovar-Aguilar, J. Antonio; Economos, Jeannie; Flocks, Joan; McCauley, Linda

    2017-01-01

    Affordable measurement of core body temperature, Tc, in a continuous, real-time fashion is now possible. With this advance comes a new data analysis paradigm for occupational epidemiology. We characterize issues arising after obtaining Tc data over 188 workdays for 83 participating farmworkers, a population vulnerable to effects of rising temperatures due to climate change. We describe a novel approach to these data using smoothing and functional data analysis. This approach highlights different data aspects compared to describing Tc at a single time point or summaries of the time course into an indicator function (e.g., did Tc ever exceed 38°C, the threshold limit value for occupational heat exposure). Participants working in ferneries had significantly higher Tc at some point during the workday compared to those working in nurseries, despite a shorter workday for fernery participants. Our results typify the challenges and opportunities in analyzing big data streams from real-time physiologic monitoring. PMID:27756853
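
    As a hedged sketch of the smoothing-plus-functional-summary approach described above (not the study's actual analysis), the following Python code smooths each worker's Tc curve and contrasts a functional summary of the group mean curve with a simple threshold indicator; the filter settings and toy data are assumptions.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    def smooth_tc(tc_series, window=31, polyorder=3):
        """Smooth one worker's core temperature curve (Savitzky-Golay filter)."""
        return savgol_filter(np.asarray(tc_series, float), window, polyorder)

    def group_pointwise_mean(curves):
        """Pointwise mean across workers on a common time grid."""
        return np.mean(np.vstack(curves), axis=0)

    # Toy data: 5 workers, Tc sampled once per minute over an 8-hour shift.
    rng = np.random.default_rng(2)
    minutes = np.arange(480)
    workers = [37.0 + 0.8 * np.sin(minutes / 480 * np.pi) + rng.normal(0, 0.1, 480)
               for _ in range(5)]
    smoothed = [smooth_tc(w) for w in workers]
    mean_curve = group_pointwise_mean(smoothed)
    # Functional summary (whole curve) vs. a single indicator (did Tc exceed 38 C?).
    print(mean_curve.max(), (mean_curve > 38.0).any())
    ```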

  4. G protein-coupled receptor internalization assays in the high-content screening format.

    PubMed

    Haasen, Dorothea; Schnapp, Andreas; Valler, Martin J; Heilker, Ralf

    2006-01-01

    High-content screening (HCS), a combination of fluorescence microscopic imaging and automated image analysis, has become a frequently applied tool to study test compound effects in cellular disease-modeling systems. This chapter describes the measurement of G protein-coupled receptor (GPCR) internalization in the HCS format using a high-throughput, confocal cellular imaging device. GPCRs are the most successful group of therapeutic targets on the pharmaceutical market. Accordingly, the search for compounds that interfere with GPCR function in a specific and selective way is a major focus of the pharmaceutical industry today. This chapter describes methods for the ligand-induced internalization of GPCRs labeled previously with either a fluorophore-conjugated ligand or an antibody directed against an N-terminal tag of the GPCR. Both labeling techniques produce robust assay formats. Complementary to other functional GPCR drug discovery assays, internalization assays enable a pharmacological analysis of test compounds. We conclude that GPCR internalization assays represent a valuable medium/high-throughput screening format to determine the cellular activity of GPCR ligands.

  5. Measuring the spatial resolution of an optical system in an undergraduate optics laboratory

    NASA Astrophysics Data System (ADS)

    Leung, Calvin; Donnelly, T. D.

    2017-06-01

    Two methods of quantifying the spatial resolution of a camera are described, performed, and compared, with the objective of designing an imaging-system experiment for students in an undergraduate optics laboratory. With the goal of characterizing the resolution of a typical digital single-lens reflex (DSLR) camera, we motivate, introduce, and show agreement between traditional test-target contrast measurements and the technique of using Fourier analysis to obtain the modulation transfer function (MTF). The advantages and drawbacks of each method are compared. Finally, we explore the rich optical physics at work in the camera system by calculating the MTF as a function of wavelength and f-number. For example, we find that the Canon 40D demonstrates better spatial resolution at short wavelengths, in accordance with scalar diffraction theory, but is not diffraction-limited, being significantly affected by spherical aberration. The experiment and data analysis routines described here can be built and written in an undergraduate optics lab setting.
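
    The Fourier-analysis route can be sketched as follows: given a measured (here simulated) line-spread function, the MTF is the normalized magnitude of its Fourier transform. The pixel pitch, Gaussian blur width, and MTF50 readout below are illustrative assumptions, not the article's measurement procedure.

    ```python
    import numpy as np

    def mtf_from_lsf(lsf, pixel_pitch_mm):
        """MTF as the normalized magnitude of the FFT of a line-spread function."""
        lsf = np.asarray(lsf, float)
        lsf = lsf / lsf.sum()                                  # normalize area to 1
        mtf = np.abs(np.fft.rfft(lsf))
        freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)    # cycles per mm
        return freqs, mtf / mtf[0]

    # Toy example: a Gaussian LSF with sigma = 2 pixels and a 5 micron pixel pitch.
    x = np.arange(-64, 64)
    lsf = np.exp(-0.5 * (x / 2.0) ** 2)
    freqs, mtf = mtf_from_lsf(lsf, pixel_pitch_mm=0.005)
    print(freqs[mtf > 0.5].max())    # approximate MTF50 in cycles/mm
    ```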

  6. Uniform, optimal signal processing of mapped deep-sequencing data.

    PubMed

    Kumar, Vibhor; Muratani, Masafumi; Rayan, Nirmala Arul; Kraus, Petra; Lufkin, Thomas; Ng, Huck Hui; Prabhakar, Shyam

    2013-07-01

    Despite their apparent diversity, many problems in the analysis of high-throughput sequencing data are merely special cases of two general problems, signal detection and signal estimation. Here we adapt formally optimal solutions from signal processing theory to analyze signals of DNA sequence reads mapped to a genome. We describe DFilter, a detection algorithm that identifies regulatory features in ChIP-seq, DNase-seq and FAIRE-seq data more accurately than assay-specific algorithms. We also describe EFilter, an estimation algorithm that accurately predicts mRNA levels from as few as 1-2 histone profiles (R ∼0.9). Notably, the presence of regulatory motifs in promoters correlates more with histone modifications than with mRNA levels, suggesting that histone profiles are more predictive of cis-regulatory mechanisms. We show by applying DFilter and EFilter to embryonic forebrain ChIP-seq data that regulatory protein identification and functional annotation are feasible despite tissue heterogeneity. The mathematical formalism underlying our tools facilitates integrative analysis of data from virtually any sequencing-based functional profile.

  7. Novel Analytic Methods Needed for Real-Time Continuous Core Body Temperature Data.

    PubMed

    Hertzberg, Vicki; Mac, Valerie; Elon, Lisa; Mutic, Nathan; Mutic, Abby; Peterman, Katherine; Tovar-Aguilar, J Antonio; Economos, Eugenia; Flocks, Joan; McCauley, Linda

    2016-10-18

    Affordable measurement of core body temperature (Tc) in a continuous, real-time fashion is now possible. With this advance comes a new data analysis paradigm for occupational epidemiology. We characterize issues arising after obtaining Tc data over 188 workdays for 83 participating farmworkers, a population vulnerable to effects of rising temperatures due to climate change. We describe a novel approach to these data using smoothing and functional data analysis. This approach highlights different data aspects compared with describing Tc at a single time point or summaries of the time course into an indicator function (e.g., did Tc ever exceed 38 °C, the threshold limit value for occupational heat exposure). Participants working in ferneries had significantly higher Tc at some point during the workday compared with those working in nurseries, despite a shorter workday for fernery participants. Our results typify the challenges and opportunities in analyzing big data streams from real-time physiologic monitoring. © The Author(s) 2016.

  8. Software analysis in the semantic web

    NASA Astrophysics Data System (ADS)

    Taylor, Joshua; Hall, Robert T.

    2013-05-01

    Many approaches in software analysis, particularly dynamic malware analysis, benefit greatly from the use of linked data and other Semantic Web technology. In this paper, we describe AIS, Inc.'s Semantic Extractor (SemEx) component from the Malware Analysis and Attribution through Genetic Information (MAAGI) effort, funded under DARPA's Cyber Genome program. The SemEx generates OWL-based semantic models of high and low level behaviors in malware samples from system call traces generated by AIS's introspective hypervisor, IntroVirtTM. Within MAAGI, these semantic models were used by modules that cluster malware samples by functionality, and construct "genealogical" malware lineages. Herein, we describe the design, implementation, and use of the SemEx, as well as the C2DB, an OWL ontology used for representing software behavior and cyber-environments.

  9. An Optical Method for the In-Vivo Characterization of the Biomechanical Response of the Right Ventricle.

    PubMed

    Soltani, A; Lahti, J; Järvelä, K; Curtze, S; Laurikka, J; Hokka, M; Kuokkala, V-T

    2018-05-01

    The intraoperative in-vivo mechanical function of the left ventricle has been studied thoroughly using echocardiography in the past. However, due to technical and anatomical issues, the ultrasound technology cannot easily be focused on the right side of the heart during open-heart surgery, and the function of the right ventricle during the intervention remains largely unexplored. We used optical imaging and digital image correlation for the characterization of right ventricle motion and deformation during open-heart surgery. This work is a pilot study focusing on one patient only, with the aim of establishing the framework for long-term research. These experiments show that optical imaging and the analysis of the images can be used to obtain parameters for describing the mechanical functioning of the heart similar to those provided by the ultrasound technology, and in part with higher accuracy. This work describes the optical-imaging-based method to characterize the mechanical response of the heart in-vivo, and offers new insight into the mechanical function of the right ventricle.

  10. Propagation of Computational Uncertainty Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2007-01-01

    This paper describes the use of formally designed experiments to aid in the error analysis of a computational experiment. A method is described by which the underlying code is approximated with relatively low-order polynomial graduating functions represented by truncated Taylor series approximations to the true underlying response function. A resource-minimal approach is outlined by which such graduating functions can be estimated from a minimum number of case runs of the underlying computational code. Certain practical considerations are discussed, including ways and means of coping with high-order response functions. The distributional properties of prediction residuals are presented and discussed. A practical method is presented for quantifying that component of the prediction uncertainty of a computational code that can be attributed to imperfect knowledge of independent variable levels. This method is illustrated with a recent assessment of uncertainty in computational estimates of Space Shuttle thermal and structural reentry loads attributable to ice and foam debris impact on ascent.
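
    A hedged sketch of the general approach is given below: fit a low-order polynomial graduating function to a small designed set of runs of the underlying code, then propagate assumed input uncertainty through the inexpensive surrogate by Monte Carlo. The toy "code", the quadratic model form, and the input standard deviations are assumptions, not the paper's Shuttle debris analysis.

    ```python
    import numpy as np

    def expensive_code(x1, x2):
        """Stand-in for the underlying computational code (illustrative only)."""
        return 3.0 + 1.5 * x1 - 0.8 * x2 + 0.4 * x1 * x2 + 0.2 * x1**2

    def fit_quadratic_surrogate(X, y):
        """Least-squares fit of a full quadratic graduating function in two factors."""
        x1, x2 = X[:, 0], X[:, 1]
        A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef

    def surrogate(coef, x1, x2):
        return (coef[0] + coef[1] * x1 + coef[2] * x2
                + coef[3] * x1 * x2 + coef[4] * x1**2 + coef[5] * x2**2)

    # A small designed set of code runs (3x3 factorial in [-1, 1]^2).
    levels = np.array([-1.0, 0.0, 1.0])
    X = np.array([(a, b) for a in levels for b in levels])
    coef = fit_quadratic_surrogate(X, expensive_code(X[:, 0], X[:, 1]))

    # Propagate assumed uncertainty in the independent variables through the surrogate.
    rng = np.random.default_rng(3)
    x1 = rng.normal(0.0, 0.2, 100_000)      # assumed uncertainty in factor 1
    x2 = rng.normal(0.0, 0.1, 100_000)      # assumed uncertainty in factor 2
    out = surrogate(coef, x1, x2)
    print(out.mean(), out.std())            # response uncertainty due to input uncertainty
    ```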

  11. Preisach modeling of temperature-dependent ferroelectric response of piezoceramics at sub-switching regime

    NASA Astrophysics Data System (ADS)

    Ochoa, Diego Alejandro; García, Jose Eduardo

    2016-04-01

    The Preisach model is a classical method for describing nonlinear behavior in hysteretic systems. According to this model, a hysteretic system contains a collection of simple bistable units which are characterized by an internal field and a coercive field. This set of bistable units exhibits a statistical distribution that depends on these fields as parameters. Thus, nonlinear response depends on the specific distribution function associated with the material. This model is satisfactorily used in this work to describe the temperature-dependent ferroelectric response in PZT- and KNN-based piezoceramics. A distribution function expanded in Maclaurin series considering only the first terms in the internal field and the coercive field is proposed. Changes in coefficient relations of a single distribution function allow us to explain the complex temperature dependence of hard piezoceramic behavior. A similar analysis based on the same form of the distribution function shows that the KNL-NTS properties soften around its orthorhombic to tetragonal phase transition.
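
    A minimal numerical sketch of the classical Preisach construction described above is given below: a weighted collection of relay hysterons on the half-plane alpha >= beta, driven by a sub-switching sinusoidal input. The Gaussian weight used here is an illustrative assumption and is not the Maclaurin-series distribution function proposed in the paper.

    ```python
    import numpy as np

    class PreisachModel:
        """Classical Preisach model: weighted sum of relay hysterons with
        switch-up threshold alpha >= switch-down threshold beta."""

        def __init__(self, n=60, sigma=0.4):
            grid = np.linspace(-1.0, 1.0, n)
            A, B = np.meshgrid(grid, grid, indexing="ij")
            mask = A >= B                        # Preisach half-plane
            self.alpha, self.beta = A[mask], B[mask]
            # Assumed Gaussian weight over the half-plane (illustrative only).
            self.weight = np.exp(-(self.alpha**2 + self.beta**2) / (2 * sigma**2))
            self.weight /= self.weight.sum()
            self.state = -np.ones_like(self.alpha)   # all hysterons start "down"

        def apply(self, u):
            self.state[u >= self.alpha] = 1.0        # switch up
            self.state[u <= self.beta] = -1.0        # switch down
            return float(np.sum(self.weight * self.state))

    model = PreisachModel()
    drive = 0.8 * np.sin(np.linspace(0.0, 4.0 * np.pi, 400))   # sub-switching cycle
    response = [model.apply(u) for u in drive]
    print(min(response), max(response))   # output lags the drive: a hysteresis loop
    ```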

  12. Scientific Visualization Using the Flow Analysis Software Toolkit (FAST)

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Kelaita, Paul G.; Mccabe, R. Kevin; Merritt, Fergus J.; Plessel, Todd C.; Sandstrom, Timothy A.; West, John T.

    1993-01-01

    Over the past few years the Flow Analysis Software Toolkit (FAST) has matured into a useful tool for visualizing and analyzing scientific data on high-performance graphics workstations. Originally designed for visualizing the results of fluid dynamics research, FAST has demonstrated its flexibility by being used in several other areas of scientific research. These research areas include earth and space sciences, acid rain and ozone modelling, and automotive design, just to name a few. This paper describes the current status of FAST, including the basic concepts, architecture, existing functionality and features, and some of the known applications for which FAST is being used. A few of the applications, by both NASA and non-NASA agencies, are outlined in more detail. Described in these outlines are the goals of each visualization project, the techniques or 'tricks' used to produce the desired results, and custom modifications to FAST, if any, done to further enhance the analysis. Some of the future directions for FAST are also described.

  13. FieldTrip: Open Source Software for Advanced Analysis of MEG, EEG, and Invasive Electrophysiological Data

    PubMed Central

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages. PMID:21253357

  14. Directional pair distribution function for diffraction line profile analysis of atomistic models

    PubMed Central

    Leonardi, Alberto; Leoni, Matteo; Scardi, Paolo

    2013-01-01

    The concept of the directional pair distribution function is proposed to describe line broadening effects in powder patterns calculated from atomistic models of nano-polycrystalline microstructures. The approach provides at the same time a description of the size effect for domains of any shape and a detailed explanation of the strain effect caused by the local atomic displacement. The latter is discussed in terms of different strain types, also accounting for strain field anisotropy and grain boundary effects. The results can in addition be directly read in terms of traditional line profile analysis, such as that based on the Warren–Averbach method. PMID:23396818

  15. Isolation of Mouse Hair Follicle Bulge Stem Cells and Their Functional Analysis in a Reconstitution Assay.

    PubMed

    Zheng, Ying; Hsieh, Jen-Chih; Escandon, Julia; Cotsarelis, George

    2016-01-01

    The hair follicle (HF) is a dynamic structure readily accessible within the skin, and contains various pools of stem cells that have a broad regenerative potential during normal homeostasis and in response to injury. Recent discoveries demonstrating the multipotent capabilities of hair follicle stem cells and the easy access to skin tissue make the HF an attractive source for isolating stem cells and their subsequent application in tissue engineering and regenerative medicine. Here, we describe the isolation and purification of hair follicle bulge stem cells from mouse skin, and hair reconstitution assays that allow the functional analysis of multipotent stem cells.

  16. The Multivariate Temporal Response Function (mTRF) Toolbox: A MATLAB Toolbox for Relating Neural Signals to Continuous Stimuli

    PubMed Central

    Crosse, Michael J.; Di Liberto, Giovanni M.; Bednar, Adam; Lalor, Edmund C.

    2016-01-01

    Understanding how brains process sensory signals in natural environments is one of the key goals of twenty-first century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter—often referred to as a temporal response function—that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application. PMID:27965557
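
    The toolbox itself is implemented in MATLAB; the sketch below is a language-neutral illustration in Python/numpy of the core idea, a forward temporal response function estimated by ridge-regularized regression on time-lagged copies of the stimulus. The stimulus, "true" filter, lag count, and ridge parameter are all invented for illustration and are not taken from the toolbox.

    ```python
    import numpy as np

    # Minimal sketch of a forward temporal response function (TRF) estimated by
    # ridge (regularized linear) regression: the neural response is modelled as
    # a weighted sum of time-lagged copies of a stimulus feature.

    rng = np.random.default_rng(0)
    fs = 100                                     # sampling rate in Hz (assumed)
    n = 2000
    stimulus = rng.standard_normal(n)            # 1-D stimulus feature (simulated)
    true_trf = np.hanning(20)                    # made-up "ground truth" filter
    response = np.convolve(stimulus, true_trf, mode="full")[:n]
    response += 0.5 * rng.standard_normal(n)     # additive noise

    def lagged_design(x, n_lags):
        """Stack time-lagged copies of x into a design matrix (n_samples, n_lags)."""
        X = np.zeros((len(x), n_lags))
        for k in range(n_lags):
            X[k:, k] = x[: len(x) - k]
        return X

    n_lags = 30                  # lags 0-290 ms at fs = 100 Hz (assumed range)
    lam = 10.0                   # ridge parameter; in practice tuned, e.g. by cross-validation
    X = lagged_design(stimulus, n_lags)
    w = np.linalg.solve(X.T @ X + lam * np.eye(n_lags), X.T @ response)
    print("estimated TRF peak lag (samples):", int(np.argmax(w)))
    ```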

  17. MetAMOS: a modular and open source metagenomic assembly and analysis pipeline

    PubMed Central

    2013-01-01

    We describe MetAMOS, an open source and modular metagenomic assembly and analysis pipeline. MetAMOS represents an important step towards fully automated metagenomic analysis, starting with next-generation sequencing reads and producing genomic scaffolds, open-reading frames and taxonomic or functional annotations. MetAMOS can aid in reducing assembly errors, commonly encountered when assembling metagenomic samples, and improves taxonomic assignment accuracy while also reducing computational cost. MetAMOS can be downloaded from: https://github.com/treangen/MetAMOS. PMID:23320958

  18. Representing energy efficiency diagnosis strategies in cognitive work analysis.

    PubMed

    Hilliard, Antony; Jamieson, Greg A

    2017-03-01

    This article describes challenges encountered in applying Jens Rasmussen's Cognitive Work Analysis (CWA) framework to the practice of energy efficiency Monitoring & Targeting (M&T). Eight theoretical issues encountered in the analysis are described with respect to Rasmussen's work, along with the modeling solutions we adopted. We grappled with how to usefully apply Work Domain Analysis (WDA) to analyze categories of domains with secondary purposes and no ideal grain of decomposition. This difficulty encouraged us to pursue Control Task (ConTA) and Strategies (StrA) analysis, which are under-explored as bases for interface design. In ConTA we found M&T was best represented by two interlinked work functions: one controlling energy, the other maintaining knowledge representations. From StrA, we identified a popular representation-dependent strategy and inferred the information required to diagnose faults in system performance and knowledge representation. This article presents and discusses excerpts from our analysis, and outlines their application to diagnosis support tools. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Pitfalls in Persuasion: How Do Users Experience Persuasive Techniques in a Web Service?

    NASA Astrophysics Data System (ADS)

    Segerståhl, Katarina; Kotro, Tanja; Väänänen-Vainio-Mattila, Kaisa

    Persuasive technologies are designed by utilizing a variety of interactive techniques that are believed to promote target behaviors. This paper describes a field study in which the aim was to discover possible pitfalls of persuasion, i.e., situations in which persuasive techniques do not function as expected. The study investigated persuasive functionality of a web service targeting weight loss. A qualitative online questionnaire was distributed through the web service and a total of 291 responses were extracted for interpretative analysis. The Persuasive Systems Design model (PSD) was used for supporting systematic analysis of persuasive functionality. Pitfalls were identified through situations that evoked negative user experiences. The primary pitfalls discovered were associated with manual logging of eating and exercise behaviors, appropriateness of suggestions and source credibility issues related to social facilitation. These pitfalls, when recognized, can be addressed in design by applying functional and facilitative persuasive techniques in meaningful combinations.

  20. The Fate of the Method of 'Paradigms' in Paleobiology.

    PubMed

    Rudwick, Martin J S

    2017-11-02

    An earlier article described the mid-twentieth century origins of the method of "paradigms" in paleobiology, as a way of making testable hypotheses about the functional morphology of extinct organisms. The present article describes the use of "paradigms" through the 1970s and, briefly, to the end of the century. After I had proposed the paradigm method to help interpret the ecological history of brachiopods, my students developed it in relation to that and other invertebrate phyla, notably in Euan Clarkson's analysis of vision in trilobites. David Raup's computer-aided "theoretical morphology" was then combined with my functional or adaptive emphasis, in Adolf Seilacher's tripartite "constructional morphology." Stephen Jay Gould, who had strongly endorsed the method, later switched to criticizing the "adaptationist program" he claimed it embodied. Although the explicit use of paradigms in paleobiology had declined by the end of the century, the method was tacitly subsumed into functional morphology as "biomechanics."

  1. Ground Systems Development Environment (GSDE) interface requirements analysis: Operations scenarios

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Phillips, John

    1991-01-01

    This report is a preliminary assessment of the functional and data interface requirements for the link between the GSDE GS/SPF (Amdahl) and the Space Station Control Center (SSCC) and Space Station Training Facility (SSTF) Integration, Verification, and Test Environments (IVTE's). These interfaces will be involved in ground software development of both the control center and the simulation and training systems. Our understanding of the configuration management (CM) interface and the expected functional characteristics of the Amdahl-IVTE interface is described. A set of assumptions and questions that need to be considered and resolved in order to complete the interface functional and data requirements definitions is presented. A listing of information items defined to describe software configuration items in the GSDE CM system is included. The report also includes listings of standard reports of CM information and of CM-related tools in the GSDE.

  2. Decomposition-aggregation stability analysis. [for large scale dynamic systems with application to spinning Skylab control system

    NASA Technical Reports Server (NTRS)

    Siljak, D. D.; Weissenberger, S.; Cuk, S. M.

    1973-01-01

    This report presents the development and description of the decomposition aggregation approach to stability investigations of high dimension mathematical models of dynamic systems. The high dimension vector differential equation describing a large dynamic system is decomposed into a number of lower dimension vector differential equations which represent interconnected subsystems. Then a method is described by which the stability properties of each subsystem are aggregated into a single vector Liapunov function, representing the aggregate system model, consisting of subsystem Liapunov functions as components. A linear vector differential inequality is then formed in terms of the vector Liapunov function. The matrix of the model, which reflects the stability properties of the subsystems and the nature of their interconnections, is analyzed to conclude over-all system stability characteristics. The technique is applied in detail to investigate the stability characteristics of a dynamic model of a hypothetical spinning Skylab.
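
    A hedged sketch of the aggregation step, in assumed notation: each subsystem i contributes a scalar Liapunov function v_i(x_i), and the aggregate model is a linear vector differential inequality whose matrix collects subsystem stability margins (diagonal terms) and interconnection strengths (off-diagonal terms).

    ```latex
    \dot{v}(t) \;\le\; W\,v(t), \qquad
    v(t) = \bigl(v_1(x_1(t)),\,\ldots,\,v_N(x_N(t))\bigr)^{\mathsf T}
    ```

    Overall stability of the interconnected system then follows from conditions on W alone, for example when -W is an M-matrix, so that the aggregate inequality admits a decaying comparison solution; the report's exact conditions may be stated differently.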

  3. Estimation of automobile-driver describing function from highway tests using the double steering wheel

    NASA Technical Reports Server (NTRS)

    Delp, P.; Crossman, E. R. F. W.; Szostak, H.

    1972-01-01

    The automobile-driver describing function for lateral position control was estimated for three subjects from frequency response analysis of straight road test results. The measurement procedure employed an instrumented full size sedan with known steering response characteristics, and equipped with a lateral lane position measuring device based on video detection of white stripe lane markings. Forcing functions were inserted through a servo driven double steering wheel coupling the driver to the steering system proper. Random appearing, Gaussian, and transient time functions were used. The quasi-linear models fitted to the random appearing input frequency response characterized the driver as compensating for lateral position error in a proportional, derivative, and integral manner. Similar parameters were fitted to the Gabor transformed frequency response of the driver to transient functions. A fourth term corresponding to response to lateral acceleration was determined by matching the time response histories of the model to the experimental results. The time histories show evidence of pulse-like nonlinear behavior during extended response to step transients which appear as high frequency remnant power.
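
    The quasi-linear structure described above can be summarized, in assumed notation, as a steering command built from proportional, derivative, and integral action on lateral position error plus a lateral-acceleration term; the study's exact parameterization, and any effective time delay, are not given in the abstract, so this is only an illustrative form.

    ```latex
    \delta_{\mathrm{sw}}(s) \;=\; \Bigl(K_p + K_d\,s + \tfrac{K_i}{s}\Bigr)\,Y_e(s) \;+\; K_a\,A_y(s)
    ```

    Here Y_e(s) denotes the lateral position error, A_y(s) the lateral acceleration, and the gains are fitted to the measured frequency responses; the pulse-like nonlinear behavior noted for step transients falls outside this quasi-linear form and appears as remnant.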

  4. NASA Project Planning and Control Handbook

    NASA Technical Reports Server (NTRS)

    Moreland, Robert; Claunch, Cathy L.

    2016-01-01

    This handbook provides an overview of the fundamental principles and explains the functions and products that go into project planning and control. The 2010 Interim Results of the NASA Program Planning and Control (PPC) Study identified seven categories of activities for PPC, and those provide the basis for the seven functions described in this handbook. This handbook maps out the interfaces and interactions between PPC functions, as well as their external interfaces. This integration of information and products within and between functions is necessary to form the whole picture of how a project is progressing. The handbook descriptions are meant to facilitate consistent, common, and comprehensive approaches for providing valued analysis, assessment, and evaluation focused on the project level at NASA. The handbook also describes activities in terms of function rather than the job title or the specific person or organization responsible for the activity, which could differ by Center or size of a project. This handbook is primarily guidance for project planning and control; however, the same principles apply to programs and generally apply to institutional planning and control.

  5. The 747 primary flight control systems reliability and maintenance study

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The major operational characteristics of the 747 Primary Flight Control Systems (PFCS) are described. Results of reliability analysis for separate control functions are presented. The analysis makes use of a NASA computer program which calculates the reliability of redundant systems. Costs for maintaining the 747 PFCS in airline service are assessed. The reliabilities and costs will provide a baseline for use in trade studies of future flight control system design.

  6. A Bayesian Approach to a Multiple-Group Latent Class-Profile Analysis: The Timing of Drinking Onset and Subsequent Drinking Behaviors among U.S. Adolescents

    ERIC Educational Resources Information Center

    Chung, Hwan; Anthony, James C.

    2013-01-01

    This article presents a multiple-group latent class-profile analysis (LCPA) by taking a Bayesian approach in which a Markov chain Monte Carlo simulation is employed to achieve more robust estimates for latent growth patterns. This article describes and addresses a label-switching problem that involves the LCPA likelihood function, which has…

  7. Heroic struggles, criminals and scientific breakthroughs: ADHD and the medicalization of child behaviour in Australian newsprint media 1999–2009

    PubMed Central

    Harwood, Valerie; Jones, Sandra; Bonney, Andrew; McMahon, Samantha

    2017-01-01

    There is a dearth of scholarly analysis and critique of the Australian newsprint media’s role in the medicalization of child behaviour. To begin to redress this lack, this paper analyses newsprint media’s use of metaphors that re/describe and construct realities of ADHD with a medicalizing effect. The interdisciplinary team used the Factiva™ database to locate and review 453 articles published in Australian national and metropolitan newspapers during the decade 1999–2009. Data analysis involved generating statistical descriptions of the dataset according to attributes such as date, state, newspaper title and author name. This was followed by inductive analysis of article content. Content analysis revealed pervasive and striking use of metaphor in newsprint media reporting of ADHD content, especially when describing health professionals, educators, parents and children. While the metaphors deployed were varied, this diversity was underscored by a common functionality that increased the risk that child behaviour would be explained using medicalized knowledge. We contend that these metaphors collectively and coherently functioned to simplify and delimit meanings of children’s health and behaviour to favour depictions that medicalize problems of children and childhood. PMID:28532327

  8. MIPS: a database for genomes and protein sequences

    PubMed Central

    Mewes, H. W.; Frishman, D.; Güldener, U.; Mannhaupt, G.; Mayer, K.; Mokrejs, M.; Morgenstern, B.; Münsterkötter, M.; Rudd, S.; Weil, B.

    2002-01-01

    The Munich Information Center for Protein Sequences (MIPS-GSF, Neuherberg, Germany) continues to provide genome-related information in a systematic way. MIPS supports both national and European sequencing and functional analysis projects, develops and maintains automatically generated and manually annotated genome-specific databases, develops systematic classification schemes for the functional annotation of protein sequences, and provides tools for the comprehensive analysis of protein sequences. This report updates the information on the yeast genome (CYGD), the Neurospora crassa genome (MNCDB), the databases for the comprehensive set of genomes (PEDANT genomes), the database of annotated human EST clusters (HIB), the database of complete cDNAs from the DHGP (German Human Genome Project), as well as the project specific databases for the GABI (Genome Analysis in Plants) and HNB (Helmholtz–Netzwerk Bioinformatik) networks. The Arabidopsis thaliana database (MATDB), the database of mitochondrial proteins (MITOP) and our contribution to the PIR International Protein Sequence Database have been described elsewhere [Schoof et al. (2002) Nucleic Acids Res., 30, 91–93; Scharfe et al. (2000) Nucleic Acids Res., 28, 155–158; Barker et al. (2001) Nucleic Acids Res., 29, 29–32]. All databases described, the protein analysis tools provided and the detailed descriptions of our projects can be accessed through the MIPS World Wide Web server (http://mips.gsf.de). PMID:11752246

  9. MIPS: a database for genomes and protein sequences.

    PubMed

    Mewes, H W; Frishman, D; Güldener, U; Mannhaupt, G; Mayer, K; Mokrejs, M; Morgenstern, B; Münsterkötter, M; Rudd, S; Weil, B

    2002-01-01

    The Munich Information Center for Protein Sequences (MIPS-GSF, Neuherberg, Germany) continues to provide genome-related information in a systematic way. MIPS supports both national and European sequencing and functional analysis projects, develops and maintains automatically generated and manually annotated genome-specific databases, develops systematic classification schemes for the functional annotation of protein sequences, and provides tools for the comprehensive analysis of protein sequences. This report updates the information on the yeast genome (CYGD), the Neurospora crassa genome (MNCDB), the databases for the comprehensive set of genomes (PEDANT genomes), the database of annotated human EST clusters (HIB), the database of complete cDNAs from the DHGP (German Human Genome Project), as well as the project specific databases for the GABI (Genome Analysis in Plants) and HNB (Helmholtz-Netzwerk Bioinformatik) networks. The Arabidopsis thaliana database (MATDB), the database of mitochondrial proteins (MITOP) and our contribution to the PIR International Protein Sequence Database have been described elsewhere [Schoof et al. (2002) Nucleic Acids Res., 30, 91-93; Scharfe et al. (2000) Nucleic Acids Res., 28, 155-158; Barker et al. (2001) Nucleic Acids Res., 29, 29-32]. All databases described, the protein analysis tools provided and the detailed descriptions of our projects can be accessed through the MIPS World Wide Web server (http://mips.gsf.de).

  10. An application programming interface for CellNetAnalyzer.

    PubMed

    Klamt, Steffen; von Kamp, Axel

    2011-08-01

    CellNetAnalyzer (CNA) is a MATLAB toolbox providing computational methods for studying structure and function of metabolic and cellular signaling networks. In order to allow non-experts to use these methods easily, CNA provides GUI-based interactive network maps as a means of parameter input and result visualization. However, with the availability of high-throughput data, there is a need to make CNA's functionality also accessible in batch mode for automatic data processing. Furthermore, as some algorithms of CNA are of general relevance for network analysis it would be desirable if they could be called as sub-routines by other applications. For this purpose, we developed an API (application programming interface) for CNA allowing users (i) to access the content of network models in CNA, (ii) to use CNA's network analysis capabilities independent of the GUI, and (iii) to interact with the GUI to facilitate the development of graphical plugins. Here we describe the organization of network projects in CNA and the application of the new API functions to these projects. This includes the creation of network projects from scratch, loading and saving of projects and scenarios, and the application of the actual analysis methods. Furthermore, API functions for the import/export of metabolic models in SBML format and for accessing the GUI are described. Lastly, two example applications demonstrate the use and versatile applicability of CNA's API. CNA is freely available for academic use and can be downloaded from http://www.mpi-magdeburg.mpg.de/projects/cna/cna.html. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  11. Application of the International Classification of Functioning, Disability and Health (ICF) to people with dysphagia following non-surgical head and neck cancer management.

    PubMed

    Nund, Rebecca L; Scarinci, Nerina A; Cartmill, Bena; Ward, Elizabeth C; Kuipers, Pim; Porceddu, Sandro V

    2014-12-01

    The International Classification of Functioning, Disability, and Health (ICF) is an internationally recognized framework which allows its user to describe the consequences of a health condition on an individual in the context of their environment. With growing recognition that dysphagia can have broad ranging physical and psychosocial impacts, the aim of this paper was to identify the ICF domains and categories that describe the full functional impact of dysphagia following non-surgical head and neck cancer (HNC) management, from the perspective of the person with dysphagia. A secondary analysis was conducted on previously published qualitative study data which explored the lived experiences of dysphagia of 24 individuals with self-reported swallowing difficulties following HNC management. Categories and sub-categories identified by the qualitative analysis were subsequently mapped to the ICF using the established linking rules to develop a set of ICF codes relevant to the impact of dysphagia following HNC management. The 69 categories and sub-categories that had emerged from the qualitative analysis were successfully linked to 52 ICF codes. The distribution of these codes across the ICF framework revealed that the components of Body Functions, Activities and Participation, and Environmental Factors were almost equally represented. The findings confirm that the ICF is a valuable framework for representing the complexity and multifaceted impact of dysphagia following HNC. This list of ICF codes, which reflect the diverse impact of dysphagia associated with HNC on the individual, can be used to guide more holistic assessment and management for this population.

  12. IDENTIFICATION OF NICOTINAMIDE MONONUCLEOTIDE DEAMIDASE OF THE BACTERIAL PYRIDINE NUCLEOTIDE CYCLE REVEALS A NOVEL BROADLY CONSERVED AMIDOHYDROLASE FAMILY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galeazzi, Luca; Bocci, Paolo; Amici, Adolfo

    2011-09-27

    The pyridine nucleotide cycle (PNC) is a network of salvage and recycling routes maintaining homeostasis of the NAD(P) cofactor pool in the cell. Nicotinamide mononucleotide (NMN) deamidase (EC 3.5.1.42), one of the key enzymes of the bacterial PNC, was originally described in Enterobacteria, but the corresponding gene eluded identification for over 30 years. A genomics-based reconstruction of NAD metabolism across hundreds of bacterial species suggested that the NMN deamidase reaction is the only possible way of nicotinamide salvage in the marine bacterium Shewanella oneidensis. This prediction was verified via purification of native NMN deamidase from S. oneidensis followed by the identification of the respective gene, termed pncC. Enzymatic characterization of the PncC protein, as well as phenotype analysis of deletion mutants, confirmed its proposed biochemical and physiological function in S. oneidensis. Of the three PncC homologs present in E. coli, NMN deamidase activity was confirmed only for the recombinant purified product of the ygaD gene. A comparative analysis at the level of sequence and three-dimensional structure, which is available for one of the PncC family members, shows no homology with any previously described amidohydrolases. Multiple alignment analysis of functional and non-functional PncC homologs, together with NMN docking experiments, allowed us to tentatively identify the active site area and conserved residues therein. The observed broad phylogenomic distribution of predicted functional PncCs in the bacterial kingdom is consistent with a possible role in detoxification of NMN resulting from NAD utilization by DNA ligase.

  13. Mutations of Vasopressin Receptor 2 Including Novel L312S Have Differential Effects on Trafficking.

    PubMed

    Tiulpakov, Anatoly; White, Carl W; Abhayawardana, Rekhati S; See, Heng B; Chan, Audrey S; Seeber, Ruth M; Heng, Julian I; Dedov, Ivan; Pavlos, Nathan J; Pfleger, Kevin D G

    2016-08-01

    Nephrogenic syndrome of inappropriate antidiuresis (NSIAD) is a genetic disease first described in 2 unrelated male infants with severe symptomatic hyponatremia. Despite undetectable arginine vasopressin levels, patients have inappropriately concentrated urine resulting in hyponatremia, hypoosmolality, and natriuresis. Here, we describe and functionally characterize a novel vasopressin type 2 receptor (V2R) gain-of-function mutation. An L312S substitution in the seventh transmembrane domain was identified in a boy presenting with water-induced hyponatremic seizures at the age of 5.8 years. We show that, compared with wild-type V2R, the L312S mutation results in the constitutive production of cAMP, indicative of the gain-of-function NSIAD profile. Interestingly, like the previously described F229V and I130N NSIAD-causing mutants, this appears to both occur in the absence of notable constitutive β-arrestin2 recruitment and can be reduced by the inverse agonist Tolvaptan. In addition, to understand the effect of various V2R substitutions on the full receptor "life-cycle," we have used and further developed a bioluminescence resonance energy transfer intracellular localization assay using multiple localization markers validated with confocal microscopy. This allowed us to characterize differences in the constitutive and ligand-induced localization and trafficking profiles of the novel L312S mutation as well as for previously described V2R gain-of-function mutants (NSIAD; R137C and R137L), loss-of-function mutants (nephrogenic diabetes insipidus; R137H, R181C, and M311V), and a putative silent V266A V2R polymorphism. In doing so, we describe differences in trafficking between unique V2R substitutions, even at the same amino acid position, therefore highlighting the value of full and thorough characterization of receptor function beyond simple signaling pathway analysis.

  14. Analysis of laser shock experiments on precompressed samples using a quartz reference and application to warm dense hydrogen and helium

    DOE PAGES

    Brygoo, Stephanie; Millot, Marius; Loubeyre, Paul; ...

    2015-11-16

    Megabar (1 Mbar = 100 GPa) laser shocks on precompressed samples allow reaching unprecedented high densities and moderately high temperatures of ~10³–10⁴ K. We describe in this paper a complete analysis framework for the velocimetry (VISAR) and pyrometry (SOP) data produced in these experiments. Since the precompression increases the initial density of both the sample of interest and the quartz reference for pressure-density, reflectivity, and temperature measurements, we describe analytical corrections based on available experimental data on warm dense silica and density-functional-theory based molecular dynamics computer simulations. Finally, using our improved analysis framework, we report a re-analysis of previously published data on warm dense hydrogen and helium, compare the newly inferred pressure, density, and temperature data with the most advanced equation of state models, and provide updated reflectivity values.

  15. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  16. New method to incorporate Type B uncertainty into least-squares procedures in radionuclide metrology.

    PubMed

    Han, Jubong; Lee, K B; Lee, Jong-Man; Park, Tae Soon; Oh, J S; Oh, Pil-Jei

    2016-03-01

    We discuss a new method to incorporate Type B uncertainty into least-squares procedures. The new method is based on an extension of the likelihood function from which a conventional least-squares function is derived. The extended likelihood function is the product of the original likelihood function with additional PDFs (probability density functions) that characterize the Type B uncertainties. The PDFs describe one's incomplete knowledge of correction factors, which are treated as nuisance parameters. We use the extended likelihood function to make point and interval estimations of parameters in essentially the same way as the conventional least-squares function is used. Since the nuisance parameters are not of interest and should be prevented from appearing in the final result, we eliminate them by using the profile likelihood. As an example, we present a case study of a linear regression analysis with a common component of Type B uncertainty. In this example we compare the analysis results obtained using our procedure with those from conventional methods. Copyright © 2015. Published by Elsevier Ltd.
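
    A compact sketch of the construction described above, with assumed symbols: the likelihood of the parameters of interest θ is multiplied by PDFs for the nuisance correction factors ν that encode the Type B uncertainties, and the nuisance parameters are then removed by profiling before point and interval estimation.

    ```latex
    L_{\mathrm{ext}}(\theta,\nu \mid y) \;=\; L(\theta,\nu \mid y)\,\prod_{j} p_j(\nu_j),
    \qquad
    L_{\mathrm{prof}}(\theta) \;=\; \max_{\nu}\, L_{\mathrm{ext}}(\theta,\nu \mid y)
    ```

    The quantity -2 ln L_prof(θ) then plays the role of the conventional least-squares function.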

  17. Systematic inference of functional phosphorylation events in yeast metabolism.

    PubMed

    Chen, Yu; Wang, Yonghong; Nielsen, Jens

    2017-07-01

    Protein phosphorylation is a post-translational modification that affects proteins by changing their structure and conformation in a rapid and reversible way, and it is an important mechanism for metabolic regulation in cells. Phosphoproteomics enables high-throughput identification of phosphorylation events on metabolic enzymes, but identifying functional phosphorylation events still requires more detailed biochemical characterization. Therefore, development of computational methods for investigating unknown functions of a large number of phosphorylation events identified by phosphoproteomics has received increased attention. We developed a mathematical framework that describes the relationship between the phosphorylation level of a metabolic enzyme and the corresponding flux through the enzyme. Using this framework, it is possible to quantitatively estimate the contribution of phosphorylation events to flux changes. We showed that phosphorylation regulation analysis, combined with a systematic workflow and correlation analysis, can be used for inference of functional phosphorylation events in steady and dynamic conditions, respectively. Using this analysis, we assigned functionality to phosphorylation events of 17 metabolic enzymes in the yeast Saccharomyces cerevisiae, among which 10 are novel. Phosphorylation regulation analysis can not only be extended to the inference of other functional post-translational modifications but can also serve as a promising scaffold for multi-omics data integration in systems biology. Matlab codes for flux balance analysis in this study are available in Supplementary material. yhwang@ecust.edu.cn or nielsenj@chalmers.se. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  18. AAC Best Practice Using Automated Language Activity Monitoring.

    ERIC Educational Resources Information Center

    Hill, Katya; Romich, Barry

    This brief paper describes automated language activity monitoring (LAM), an augmentative and alternative communication (AAC) methodology for the collection, editing, and analysis of language data in structured or natural situations with people who have severe communication disorders. The LAM function records each language event (letters, words,…

  19. Diffusion of Super-Gaussian Profiles

    ERIC Educational Resources Information Center

    Rosenberg, C.-J.; Anderson, D.; Desaix, M.; Johannisson, P.; Lisak, M.

    2007-01-01

    The present analysis describes an analytically simple and systematic approximation procedure for modelling the free diffusive spreading of initially super-Gaussian profiles. The approach is based on a self-similar ansatz for the evolution of the diffusion profile, and the parameter functions involved in the modelling are determined by suitable…

  20. Nominal Group as Qualifier to "Someone"

    ERIC Educational Resources Information Center

    Sujatna, Eva Tuckyta Sari; Wahyuni, Sri

    2017-01-01

    The paper titled "Nominal Group as Qualifier to 'Someone'" investigated types of qualifiers which are embedded to the head "someone" in a nominal group. This research was conducted in the light of Systemic Functional Linguistics analysis. The data was analyzed, classified then described using descriptive qualitative method.…

  1. Student Financial Aid Delivery System.

    ERIC Educational Resources Information Center

    O'Neal, John R.; Carpenter, Catharine A.

    1983-01-01

    Ohio University's use of computer programing for the need analysis and internal accounting functions in financial aid is described. A substantial improvement of services resulted, with 6,000-10,000 students and the offices of financial aid, bursar, registration, student records, housing, admissions, and controller assisted in the process. Costs…

  2. The role of ecological dynamics in analysing performance in team sports.

    PubMed

    Vilar, Luís; Araújo, Duarte; Davids, Keith; Button, Chris

    2012-01-01

    Performance analysis is a subdiscipline of sports sciences, and one approach, notational analysis, has been used to objectively audit and describe behaviours of performers during different subphases of play, providing additional information for practitioners to improve future sports performance. Recent criticisms of these methods have suggested the need for a sound theoretical rationale to explain performance behaviours, not just describe them. The aim of this article was to show how ecological dynamics provides a valid theoretical explanation of performance in team sports by explaining the formation of successful and unsuccessful patterns of play, based on symmetry-breaking processes emerging from functional interactions between players and the performance environment. We offer the view that ecological dynamics is an upgrade to more operational methods of performance analysis that merely document statistics of competitive performance. In support of our arguments, we refer to exemplar data on competitive performance in team sports that have revealed functional interpersonal interactions between attackers and defenders, based on variations in the spatial positioning of performers relative to each other in critical performance areas, such as the scoring zones. Implications of this perspective are also considered for practice task design and sport development programmes.

  3. Alignment error envelopes for single particle analysis.

    PubMed

    Jensen, G J

    2001-01-01

    To determine the structure of a biological particle to high resolution by electron microscopy, image averaging is required to combine information from different views and to increase the signal-to-noise ratio. Starting from the number of noiseless views necessary to resolve features of a given size, four general factors are considered that increase the number of images actually needed: (1) the physics of electron scattering introduces shot noise, (2) thermal motion and particle inhomogeneity cause the scattered electrons to describe a mixture of structures, (3) the microscope system fails to usefully record all the information carried by the scattered electrons, and (4) image misalignment leads to information loss through incoherent averaging. The compound effect of factors 2-4 is approximated by the product of envelope functions. The problem of incoherent image averaging is developed in detail through derivation of five envelope functions that account for small errors in 11 "alignment" parameters describing particle location, orientation, defocus, magnification, and beam tilt. The analysis provides target error tolerances for single particle analysis to near-atomic (3.5 Å) resolution, and this prospect is shown to depend critically on image quality, defocus determination, and microscope alignment. Copyright 2001 Academic Press.
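
    In assumed notation, the compounding of factors 2-4 can be written as a product of resolution-dependent envelope functions attenuating the ideal signal,

    ```latex
    F_{\mathrm{obs}}(s) \;=\; F_{\mathrm{ideal}}(s)\,\prod_{k} E_k(s), \qquad 0 \le E_k(s) \le 1
    ```

    so that, at spatial frequency s, roughly 1/[∏_k E_k(s)]² times more images are needed to recover the signal-to-noise ratio of the noiseless case. The five envelopes derived in the paper for small errors in the eleven alignment parameters are not reproduced here.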

  4. A half century of scalloping in the work habits of the United States Congress.

    PubMed Central

    Critchfield, Thomas S; Haley, Rebecca; Sabo, Benjamin; Colbert, Jorie; Macropoulis, Georgette

    2003-01-01

    It has been suggested that the work environment of the United States Congress bears similarity to a fixed-interval reinforcement schedule. Consistent with this notion, Weisberg and Waldrop (1972) described a positively accelerating pattern in annual congressional bill production (selected years from 1947 to 1968) that is reminiscent of the scalloped response pattern often attributed to fixed-interval schedules, but their analysis is now dated and does not bear on the functional relations that might yield scalloping. The present study described annual congressional bill production over a period of 52 years and empirically evaluated predictions derived from four hypotheses about the mechanisms that underlie scalloping. Scalloping occurred reliably in every year. The data supported several predictions about congressional productivity based on fixed-interval schedule performance, but did not consistently support any of three alternative accounts. These findings argue for the external validity of schedule-controlled operant behavior as measured in the laboratory. The present analysis also illustrates a largely overlooked role for applied behavior analysis: that of shedding light on the functional properties of behavior in uncontrolled settings of considerable interest to the public. PMID:14768667

  5. Functional linear models to test for differences in prairie wetland hydraulic gradients

    USGS Publications Warehouse

    Greenwood, Mark C.; Sojda, Richard S.; Preston, Todd M.; Swayne, David A.; Yang, Wanhong; Voinov, A.A.; Rizzoli, A.; Filatova, T.

    2010-01-01

    Functional data analysis provides a framework for analyzing multiple time series measured frequently in time, treating each series as a continuous function of time. Functional linear models are used to test for effects on hydraulic gradient functional responses collected from three types of land use in Northeastern Montana at fourteen locations. Penalized regression-splines are used to estimate the underlying continuous functions based on the discretely recorded (over time) gradient measurements. Permutation methods are used to assess the statistical significance of effects. A method for accommodating missing observations in each time series is described. Hydraulic gradients may be an initial and fundamental ecosystem process that responds to climate change. We suggest other potential uses of these methods for detecting evidence of climate change.
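
    Not the authors' implementation; a minimal Python/numpy sketch of the underlying smoothing step, fitting one simulated gradient series with a penalized regression spline (a truncated-power cubic basis with a ridge penalty on the knot coefficients). The knot count and smoothing parameter are illustrative choices.

    ```python
    import numpy as np

    # Minimal penalized regression-spline smoother for one hydraulic-gradient
    # series (illustrative data and tuning; not the authors' implementation).

    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0.0, 1.0, 150))                         # observation times
    y = np.sin(2 * np.pi * t) + 0.2 * rng.standard_normal(t.size)   # simulated gradients

    def penalized_spline(t, y, n_knots=20, lam=1.0):
        """Fit a truncated-power-basis cubic spline with a ridge penalty on the
        knot coefficients and return the fitted values."""
        knots = np.quantile(t, np.linspace(0, 1, n_knots + 2)[1:-1])
        # Design matrix: global cubic polynomial plus truncated cubic terms at each knot.
        X = np.column_stack([np.ones_like(t), t, t**2, t**3] +
                            [np.clip(t - k, 0.0, None) ** 3 for k in knots])
        # Penalize only the knot coefficients, not the global polynomial part.
        D = np.diag([0.0] * 4 + [1.0] * n_knots)
        beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
        return X @ beta

    fitted = penalized_spline(t, y)
    print("residual std:", np.std(y - fitted))
    ```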

  6. Transforming user needs into functional requirements for an antibiotic clinical decision support system: explicating content analysis for system design.

    PubMed

    Bright, T J

    2013-01-01

    Many informatics studies use content analysis to generate functional requirements for system development. Explicating this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor of content analysis as applied in informatics studies. Our objective was to describe a user-centered approach for transforming emergent themes derived from focus group data into functional requirements for informatics solutions, and to illustrate these methods through the development of an antibiotic clinical decision support system (CDS). The approach consisted of five steps: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from the coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link them with the identified informatics solutions and functional requirements. The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II identified five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements of an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated the required data elements for each informatics solution. This study presents one example of explicating content analysis of focus group data and of the process of deriving functional requirements from narrative data. The illustrated five-step method was used to develop an antibiotic CDS system addressing unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains.

  7. What variables influence the ability of an AFO to improve function and when are they indicated?

    PubMed

    Malas, Bryan S

    2011-05-01

    Children with spina bifida often present with functional deficits of the lower limb associated with neurosegmental lesion levels and require orthotic management. The most used orthosis for children with spina bifida is the ankle-foot orthosis (AFO). The AFO improves ambulation and reduces energy cost while walking. Despite the apparent benefits of using an AFO, limited evidence documents the influence of factors predicting the ability of an AFO to improve function and when they are indicated. These variables include AFO design, footwear, AFO-footwear combination, and data acquisition. When these variables are not adequately considered in clinical decision-making, there is a risk the AFO will be abandoned prematurely or the patient's stability, function, and safety compromised. The purposes of this study are to (1) describe the functional deficits based on lesion levels; (2) identify and describe variables that influence the ability of an AFO to control deformities; and (3) describe what variables are indicated for the AFO to control knee flexion during stance, hyperpronation, and valgus stress at the knee. A selective literature review was undertaken searching MEDLINE and Cochrane databases using terms related to "orthosis" and "spina bifida." Based on previous studies and gait analysis data, suggestions can be made regarding material selection/geometric configuration, sagittal alignment, footplate length, and trim lines of an AFO for reducing knee flexion, hyperpronation, and valgus stress at the knee. Further research is required to determine what variables allow an AFO to improve function.

  8. Variation in trait trade-offs allows differentiation among predefined plant functional types: implications for predictive ecology.

    PubMed

    Verheijen, Lieneke M; Aerts, Rien; Bönisch, Gerhard; Kattge, Jens; Van Bodegom, Peter M

    2016-01-01

    Plant functional types (PFTs) aggregate the variety of plant species into a small number of functionally different classes. We examined to what extent plant traits, which reflect species' functional adaptations, can capture functional differences between predefined PFTs and which traits optimally describe these differences. We applied Gaussian kernel density estimation to determine probability density functions for individual PFTs in an n-dimensional trait space and compared predicted PFTs with observed PFTs. All possible combinations of 1-6 traits from a database with 18 different traits (total of 18 287 species) were tested. A variety of trait sets had approximately similar performance, and 4-5 traits were sufficient to classify up to 85% of the species into PFTs correctly, whereas this was 80% for a bioclimatically defined tree PFT classification. Well-performing trait sets included combinations of correlated traits that are considered functionally redundant within a single plant strategy. This analysis quantitatively demonstrates how structural differences between PFTs are reflected in functional differences described by particular traits. Differentiation between PFTs is possible despite large overlap in plant strategies and traits, showing that PFTs are differently positioned in multidimensional trait space. This study therefore provides the foundation for important applications for predictive ecology. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.
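
    A minimal sketch of the classification idea, assuming simulated trait data and two hypothetical PFT classes: fit a Gaussian kernel density per predefined PFT in trait space (scipy.stats.gaussian_kde) and assign each species to the PFT with the highest estimated density. The trait values, class names, and dimensionality are invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Sketch: classify species into predefined plant functional types (PFTs) by
    # fitting a Gaussian kernel density per PFT in an n-dimensional trait space
    # and assigning each species to the PFT with the highest density.

    rng = np.random.default_rng(2)
    n_traits = 3
    pft_means = {"PFT_A": np.array([0.0, 0.0, 0.0]),
                 "PFT_B": np.array([1.5, -1.0, 0.5])}

    # gaussian_kde expects data shaped (n_dims, n_samples).
    train = {name: rng.normal(mu, 1.0, size=(200, n_traits)).T
             for name, mu in pft_means.items()}
    kdes = {name: gaussian_kde(data) for name, data in train.items()}

    # Classify held-out "species" by maximum class density.
    test = rng.normal(pft_means["PFT_B"], 1.0, size=(50, n_traits))
    densities = np.vstack([kdes[name](test.T) for name in kdes])   # (n_pft, n_test)
    labels = np.array(list(kdes))[np.argmax(densities, axis=0)]
    print("fraction assigned to PFT_B:", np.mean(labels == "PFT_B"))
    ```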

  9. A protein domain-centric approach for the comparative analysis of human and yeast phenotypically relevant mutations

    PubMed Central

    2013-01-01

    Background The body of disease mutations with known phenotypic relevance continues to increase and is expected to do so even faster with the advent of new experimental techniques such as whole-genome sequencing coupled with disease association studies. However, genomic association studies are limited by the molecular complexity of the phenotype being studied and the population size needed to have adequate statistical power. One way to circumvent this problem, which is critical for the study of rare diseases, is to study the molecular patterns emerging from functional studies of existing disease mutations. Current gene-centric analyses to study mutations in coding regions are limited by their inability to account for the functional modularity of the protein. Previous studies of the functional patterns of known human disease mutations have shown a significant tendency to cluster at protein domain positions, namely position-based domain hotspots of disease mutations. However, the limited number of known disease mutations remains the main factor hindering the advancement of mutation studies at a functional level. In this paper, we address this problem by incorporating mutations known to be disruptive of phenotypes in other species. Focusing on two evolutionarily distant organisms, human and yeast, we describe the first inter-species analysis of mutations of phenotypic relevance at the protein domain level. Results The results of this analysis reveal that phenotypic mutations from yeast cluster at specific positions on protein domains, a characteristic previously revealed to be displayed by human disease mutations. We found over one hundred domain hotspots in yeast with approximately 50% in the exact same domain position as known human disease mutations. Conclusions We describe an analysis using protein domains as a framework for transferring functional information by studying domain hotspots in human and yeast and relating phenotypic changes in yeast to diseases in human. This first-of-a-kind study of phenotypically relevant yeast mutations in relation to human disease mutations demonstrates the utility of a multi-species analysis for advancing the understanding of the relationship between genetic mutations and phenotypic changes at the organismal level. PMID:23819456

  10. The CLAIR model: Extension of Brodmann areas based on brain oscillations and connectivity.

    PubMed

    Başar, Erol; Düzgün, Aysel

    2016-05-01

    Since the beginning of the last century, the localization of brain function has been represented by Brodmann areas, maps of the anatomic organization of the brain. They are used to broadly represent cortical structures with their given sensory-cognitive functions. In recent decades, the analysis of brain oscillations has become important in the correlation of brain functions. Moreover, spectral connectivity can provide further information on the dynamic connectivity between various structures. In addition, brain responses are dynamic in nature and structural localization is almost impossible, according to Luria (1966). Therefore, brain functions are very difficult to localize; hence, a combined analysis of oscillation and event-related coherences is required. In this study, a model termed "CLAIR" is described to enrich and possibly replace the concept of the Brodmann areas. A CLAIR model with optimum function may take several years to develop, but this study sets out to lay its foundation. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  11. Large-Scale Interaction Profiling of Protein Domains Through Proteomic Peptide-Phage Display Using Custom Peptidomes.

    PubMed

    Seo, Moon-Hyeong; Nim, Satra; Jeon, Jouhyun; Kim, Philip M

    2017-01-01

    Protein-protein interactions are essential to cellular functions and signaling pathways. We recently combined bioinformatics and custom oligonucleotide arrays to construct custom-made peptide-phage libraries for screening peptide-protein interactions, an approach we call proteomic peptide-phage display (ProP-PD). In this chapter, we describe protocols for phage display for the identification of natural peptide binders for a given protein. We finally describe deep sequencing for the analysis of the proteomic peptide-phage display.

  12. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.

  13. Multilayer motif analysis of brain networks

    NASA Astrophysics Data System (ADS)

    Battiston, Federico; Nicosia, Vincenzo; Chavez, Mario; Latora, Vito

    2017-04-01

    In the last decade, network science has shed new light both on the structural (anatomical) and on the functional (correlations in the activity) connectivity among the different areas of the human brain. The analysis of brain networks has made it possible to detect the central areas of a neural system and to identify its building blocks by looking at overabundant small subgraphs, known as motifs. However, network analysis of the brain has so far mainly focused on anatomical and functional networks as separate entities. The recently developed mathematical framework of multi-layer networks allows us to perform an analysis of the human brain where the structural and functional layers are considered together. In this work, we describe how to classify the subgraphs of a multiplex network, and we extend the motif analysis to networks with an arbitrary number of layers. We then extract multi-layer motifs in brain networks of healthy subjects by considering networks with two layers, anatomical and functional, respectively, obtained from diffusion and functional magnetic resonance imaging. Results indicate that subgraphs in which the presence of a physical connection between brain areas (links at the structural layer) coexists with a non-trivial positive correlation in their activities are statistically overabundant. Finally, we investigate the existence of a reinforcement mechanism between the two layers by looking at how the probability of finding a link in one layer depends on the intensity of the connection in the other one. Showing that functional connectivity is non-trivially constrained by the underlying anatomical network, our work contributes to a better understanding of the interplay between structure and function in the human brain.
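
    A minimal sketch, restricted to node pairs (the simplest two-layer subgraphs) and simulated adjacency matrices: classify each pair by its link pattern in the structural and functional layers and compare the "link in both layers" count with the expectation under independent layers. The paper's method generalizes this to larger motifs and to an arbitrary number of layers.

    ```python
    import numpy as np

    # Classify node pairs of a two-layer (structural + functional) network by
    # their link pattern in each layer; adjacency matrices are simulated.

    rng = np.random.default_rng(3)
    n = 50
    struct = np.triu(rng.random((n, n)) < 0.10, k=1)      # structural layer (upper triangle)
    func = np.triu(rng.random((n, n)) < 0.15, k=1)        # thresholded functional layer
    func |= struct & (rng.random((n, n)) < 0.5)           # inject structure-function overlap

    pairs = n * (n - 1) // 2
    both = np.sum(struct & func)
    only_s = np.sum(struct & ~func)
    only_f = np.sum(~struct & func)
    expected_both = struct.sum() * func.sum() / pairs     # independent-layers expectation
    print({"both": int(both), "struct_only": int(only_s), "func_only": int(only_f)})
    print("observed/expected for 'both':", both / expected_both)
    ```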

  14. [Importance of stimulation of the areas involved in the mathematical processing: effects on neurodevelopment].

    PubMed

    Arch-Tirado, Emilio; Lino-González, Ana Luisa; Alfaro-Rodríguez, Alfonso

    2013-01-01

    This paper aims to discuss and analyze the role of mathematics in neurodevelopment, for which it reviews the historical, ontogenetic, and physiological bases involved. The methodology is a deductive analysis, tracing the use of mathematics from ancient cultures to the specialization of brain regions. Sensory perceptions are useful for the acquisition and development of cortical functions; thus, sensory stimulation is essential for the maturation of specialized neurologic functions.

  15. The Technologist Function in Fields Related to Radiology: Tasks in Radiation Therapy and Diagnostic Ultrasound. Research Report No. 9; Relating Technologist Tasks in Diagnostic Radiology, Ultrasound and Radiation Therapy. Research Report No. 10.

    ERIC Educational Resources Information Center

    Gilpatrick, Eleanor

    The two research reports included in this document describe the application of the Health Services Mobility Study (HSMS) task analysis method to two technologist functions and examine the interrelationships of these tasks with those in diagnostic radiology. (The HSMS method includes processes for using the data for designing job ladders, for…

  16. Parcellating an individual subject's cortical and subcortical brain structures using snowball sampling of resting-state correlations.

    PubMed

    Wig, Gagan S; Laumann, Timothy O; Cohen, Alexander L; Power, Jonathan D; Nelson, Steven M; Glasser, Matthew F; Miezin, Francis M; Snyder, Abraham Z; Schlaggar, Bradley L; Petersen, Steven E

    2014-08-01

    We describe methods for parcellating an individual subject's cortical and subcortical brain structures using resting-state functional correlations (RSFCs). Inspired by approaches from social network analysis, we first describe the application of snowball sampling on RSFC data (RSFC-Snowballing) to identify the centers of cortical areas, subdivisions of subcortical nuclei, and the cerebellum. RSFC-Snowballing parcellation is then compared with parcellation derived from identifying locations where RSFC maps exhibit abrupt transitions (RSFC-Boundary Mapping). RSFC-Snowballing and RSFC-Boundary Mapping largely complement one another, but also provide unique parcellation information; together, the methods identify independent entities with distinct functional correlations across many cortical and subcortical locations in the brain. RSFC parcellation is relatively reliable within a subject scanned across multiple days, and while the locations of many area centers and boundaries appear to exhibit considerable overlap across subjects, there is also cross-subject variability, reinforcing the motivation to parcellate brains at the level of individuals. Finally, examination of a large meta-analysis of task-evoked functional magnetic resonance imaging data reveals that area centers defined by task-evoked activity exhibit correspondence with area centers defined by RSFC-Snowballing. This observation provides important evidence for the ability of RSFC to parcellate broad expanses of an individual's brain into functionally meaningful units. © The Author 2013. Published by Oxford University Press.
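
    A generic snowball-sampling pass over a simulated correlation matrix, written in Python/numpy; this illustrates only the sampling idea, not the RSFC-Snowballing procedure's exact zone definitions or area-center criteria.

    ```python
    import numpy as np

    # Starting from a seed node, each "zone" adds the strongest correlates of the
    # nodes reached in the previous zone; nodes reached repeatedly from nearby
    # seeds could then be taken as candidate area centers. Data are simulated.

    rng = np.random.default_rng(4)
    n = 200
    ts = rng.standard_normal((n, 500))                  # fake node time series
    corr = np.corrcoef(ts)

    def snowball(corr, seed, n_zones=3, top_k=5):
        reached, frontier = {seed}, {seed}
        for _ in range(n_zones):
            nxt = set()
            for node in frontier:
                order = np.argsort(corr[node])[::-1]    # strongest correlates first
                nxt.update(int(j) for j in order[1:top_k + 1])  # skip self-correlation
            frontier = nxt - reached
            reached |= nxt
        return reached

    print("nodes reached from seed 0:", len(snowball(corr, 0)))
    ```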

  17. Constructing general partial differential equations using polynomial and neural networks.

    PubMed

    Zjavka, Ladislav; Pedrycz, Witold

    2016-01-01

    Sum fraction terms can approximate multi-variable functions on the basis of discrete observations, replacing a partial differential equation definition with polynomial elementary data relation descriptions. Artificial neural networks commonly transform the weighted sum of inputs to describe overall similarity relationships of trained and new testing input patterns. Differential polynomial neural networks form a new class of neural networks, which construct and solve an unknown general partial differential equation of a function of interest with selected substitution relative terms using non-linear multi-variable composite polynomials. The layers of the network generate simple and composite relative substitution terms whose convergent series combinations can describe partial dependent derivative changes of the input variables. This regression is based on trained generalized partial derivative data relations, decomposed into a multi-layer polynomial network structure. The sigmoidal function, commonly used as a nonlinear activation of artificial neurons, may transform some polynomial items together with the parameters, with the aim of improving the ability of the polynomial derivative term series to approximate complicated periodic functions, since simple low-order polynomials cannot fully capture the complete cycles. The similarity analysis facilitates substitutions for differential equations or can form dimensional units from data samples to describe real-world problems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Effect of Display Color on Pilot Performance and Describing Functions

    NASA Technical Reports Server (NTRS)

    Chase, Wendell D.

    1997-01-01

    A study has been conducted with the full-spectrum, calligraphic, computer-generated display system to determine the effect of chromatic content of the visual display upon pilot performance during the landing approach maneuver. This study utilizes a new digital chromatic display system, which has previously been shown to improve the perceived fidelity of out-the-window display scenes, and presents the results of an experiment designed to determine the effects of display color content by the measurement of both vertical approach performance and pilot-describing functions. This method was selected to more fully explore the effects of visual color cues used by the pilot. Two types of landing approaches were made: dynamic and frozen range, with either a landing approach scene or a perspective array display. The landing approach scene was presented with either red runway lights and blue taxiway lights or with the colors reversed, and the perspective array with red lights, blue lights, or red and blue lights combined. The vertical performance measures obtained in this experiment indicated that the pilots performed best with the blue and red/blue displays, and worst with the red displays. The describing-function system analysis showed more variation with the red displays. The crossover frequencies were lowest with the red displays and highest with the combined red/blue displays, which provided the best overall tracking performance. Describing-function performance measures, vertical performance measures, and pilot opinion support the hypothesis that specific colors in displays can influence the pilots' control characteristics during the final approach.
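
    Pilot describing functions of this kind are commonly estimated from time histories of tracking error and control output using cross-spectral methods. The sketch below is a generic, hedged example of that approach with SciPy, not the processing chain used in this experiment; the segment length and the simple 0 dB crossover search are illustrative choices that assume the low-frequency gain starts above 0 dB.

    ```python
    import numpy as np
    from scipy import signal

    def describing_function_estimate(error, control, fs, nperseg=1024):
        """Quasi-linear pilot describing function via the H1 estimator: the
        error-to-control cross-spectrum divided by the error auto-spectrum
        (Welch averaging). Returns frequency, gain in dB, and phase in degrees."""
        f, S_ee = signal.welch(error, fs=fs, nperseg=nperseg)
        _, S_ec = signal.csd(error, control, fs=fs, nperseg=nperseg)
        H = S_ec / S_ee                                   # complex frequency response
        return f, 20 * np.log10(np.abs(H)), np.degrees(np.angle(H))

    def crossover_frequency(f, gain_db):
        """First frequency at which the estimated gain falls through 0 dB
        (assumes the gain is above 0 dB at low frequency)."""
        return f[np.argmax(gain_db < 0.0)]
    ```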

  19. Analysis of Average Telomere Length in Human Telomeric Protein Knockout Cells Generated by CRISPR/Cas9.

    PubMed

    Xu, Jun; Songyang, Zhou; Liu, Dan; Kim, Hyeung

    2017-01-01

    Telomeres play an important role in ensuring the integrity of the genome. Telomere shortening can lead to loss of genetic information and trigger DNA damage responses. Cultured mammalian cells have served as critical model systems for studying the function of telomere binding proteins and telomerase. Tremendous heterogeneity can be observed both between species and within a single cell population. Recent advances in genome editing (such as the development of the CRISPR/Cas9 platform) have further enabled researchers to carry out loss-of-function analysis of how disrupting key players in telomere maintenance affects telomere length regulation. Here we describe the steps to be carried out in order to analyze the average length of telomeres in CRISPR-engineered human knockout (KO) cells (TRF analysis).

  20. [Construction and application of a spatial analysis database of geoherbs based on 3S technology].

    PubMed

    Guo, Lan-ping; Huang, Lu-qi; Lv, Dong-mei; Shao, Ai-juan; Wang, Jian

    2007-09-01

    In this paper, the structure, data sources, and data codes of "the spatial analysis database of geoherbs" based on 3S technology are introduced, and the essential functions of the database, such as data management, remote sensing, spatial interpolation, spatial statistics, spatial analysis, and development, are described. Finally, two examples of database usage are given: one is the classification and calculation of the NDVI index from remote sensing images in the geoherbal area of Atractylodes lancea; the other is an adaptation analysis of A. lancea. These examples indicate that "the spatial analysis database of geoherbs" has a bright prospect in the spatial analysis of geoherbs.

  1. Structure of force networks in tapped particulate systems of disks and pentagons. II. Persistence analysis.

    PubMed

    Kondic, L; Kramár, M; Pugnaloni, Luis A; Carlevaro, C Manuel; Mischaikow, K

    2016-06-01

    In the companion paper [Pugnaloni et al., Phys. Rev. E 93, 062902 (2016), doi:10.1103/PhysRevE.93.062902], we use classical measures based on force probability density functions (PDFs), as well as Betti numbers (quantifying the number of components, related to force chains, and loops), to describe the force networks in tapped systems of disks and pentagons. In the present work, we focus on the use of persistence analysis, which allows us to describe these networks in much more detail. This approach allows us not only to describe but also to quantify the differences between the force networks in different realizations of a system, in different parts of the considered domain, or in different systems. We show that persistence analysis clearly distinguishes the systems that are very difficult or impossible to differentiate using other means. One important finding is that the differences in force networks between disks and pentagons are most apparent when loops are considered: the quantities describing properties of the loops may differ significantly even if other measures (properties of components, Betti numbers, force PDFs, or the stress tensor) do not distinguish clearly or at all the investigated systems.
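
    A minimal way to reproduce the component/loop counting mentioned above is to treat the force network as a weighted graph and track Betti numbers while sweeping a force threshold. The sketch below (NetworkX, with a hypothetical `force` edge attribute) yields only Betti curves, a much coarser summary than the persistence diagrams used in the paper.

    ```python
    import networkx as nx

    def betti_numbers(G):
        """For a planar force network viewed as a graph: beta0 = connected components,
        beta1 = independent loops (cycle rank) = E - V + beta0."""
        beta0 = nx.number_connected_components(G)
        beta1 = G.number_of_edges() - G.number_of_nodes() + beta0
        return beta0, beta1

    def betti_curves(weighted_G, thresholds):
        """Keep only edges whose 'force' attribute meets each threshold and record
        (threshold, beta0, beta1); isolated grains are retained as nodes."""
        curves = []
        for t in thresholds:
            H = nx.Graph()
            H.add_nodes_from(weighted_G)
            H.add_edges_from((u, v) for u, v, w in weighted_G.edges(data="force") if w >= t)
            curves.append((t, *betti_numbers(H)))
        return curves
    ```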

  2. Analysis of Expandability and Modifiability of Computer Configuration Concepts for ATC : Volume I, Distributed Concept

    DOT National Transportation Integrated Search

    1979-11-01

    The questions of expandability and modifiability of a 1990-era Air Traffic Control (ATC) system are addressed. Two strawman systems are described at the functional level: a Baseline System, which represents the ATC system as it might be just after th...

  3. Phylogenetic and Protein Sequence Analysis of Bacterial Chemoreceptors.

    PubMed

    Ortega, Davi R; Zhulin, Igor B

    2018-01-01

    Identifying chemoreceptors in sequenced bacterial genomes, revealing their domain architecture, inferring their evolutionary relationships, and comparing them to chemoreceptors of known function become important steps in genome annotation and chemotaxis research. Here, we describe bioinformatics procedures that enable such analyses, using two closely related bacterial genomes as examples.

  4. The genomic landscape of rapid, repeated evolutionary rescue from toxic pollution in wild fish

    USDA-ARS?s Scientific Manuscript database

    Here we describe evolutionary rescue from intense pollution via multiple modes of selection in killifish populations from 4 urban estuaries of the US eastern seaboard. Comparative transcriptomics and analysis of 384 whole genome sequences show that the functioning of a receptor-based signaling pathw...

  5. African-American Interpersonal Relationships: Dating Preferences.

    ERIC Educational Resources Information Center

    Lang, Gina M.; And Others

    Dating values are salient issues with regard to interpersonal relationships. They provide the basis for the relationship between two people that may help determine how the family will ultimately function. This study is a preliminary analysis that attempted to describe how dating preferences of African-Americans differ with respect to gender and…

  6. Chaparral & Fire Ecology: Role of Fire in Seed Germination.

    ERIC Educational Resources Information Center

    Steele, Nancy L. C.; Keeley, Jon E.

    1991-01-01

    An activity that incorporates the concepts of plant structure and function and ecology is described. Students investigate the reasons why some California chaparral seeds germinate only after a fire has burned the surrounding chaparral. The procedure, discussion and analysis questions, expected results, potential problems, and additional activities…

  7. Theoretical Definition of Instructor Role in Computer-Managed Instruction.

    ERIC Educational Resources Information Center

    McCombs, Barbara L.; Dobrovolny, Jacqueline L.

    This report describes the results of a theoretical analysis of the ideal role functions of the Computer Managed Instruction (CMI) instructor. Concepts relevant to instructor behavior are synthesized from both cognitive and operant learning theory perspectives, and the roles allocated to instructors by seven large-scale operational CMI systems are…

  8. Intrapersonal and Familial Effects of Child Sexual Abuse on Female Partners of Male Survivors

    ERIC Educational Resources Information Center

    Jacob, Christine M. Anderson; Veach, Patricia McCarthy

    2005-01-01

    Intrapersonal and familial effects of childhood sexual abuse (CSA) were investigated by interviewing 10 female partners of male survivors. Consensual qualitative research analysis (C. Hill, B. Thompson, & E. Nutt Williams, 1997) yielded 13 domains describing male partner, female partner, couple, and family functioning. Findings concerning…

  9. Literacy Instruction in Canadian Child Care Centers

    ERIC Educational Resources Information Center

    Perlman, Michal; Fletcher, Brooke A.

    2008-01-01

    The purpose of this study was to describe literacy instruction in child care centers, examine aspects of child care center quality that may predict such instruction, and provide a limited analysis of whether literacy instruction impacts children's concurrent pre-academic functioning. Staff and children in 103 classrooms serving preschool-age…

  10. Equity and Entrepreneurialism: The Impact of Tax Increment Financing on School Finance.

    ERIC Educational Resources Information Center

    Weber, Rachel

    2003-01-01

    Describes tax increment financing (TIF), an entrepreneurial strategy with significant fiscal implications for overlapping taxing jurisdictions that provide these functions. Statistical analysis of TIF's impact on the finances of one Illinois county's school districts indicates that municipal use of TIF depletes the property tax revenues of schools…

  11. English for Airport Ground Staff

    ERIC Educational Resources Information Center

    Cutting, Joan

    2012-01-01

    This article describes part of a European Commission Leonardo project that aimed to design a multimedia course for English language learners seeking work as ground staff in European airports. The structural-functional analysis of the dialogues written from the course showed that, across the four trades explored (security guards, ground handlers,…

  12. Validated Test Method 1316: Liquid-Solid Partitioning as a Function of Liquid-to-Solid Ratio in Solid Materials Using a Parallel Batch Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  13. Counselor and Student at Talk: A Case Study.

    ERIC Educational Resources Information Center

    He, Agnes Weiyun; Keating, Elizabeth

    1991-01-01

    Explores ways in which expert and novice roles are constituted and maintained in an academic counseling encounter. Characterizes the meeting as a socializing, problem-solving event and uses functional linguistics and discourse analysis to describe how the counselor and student mark stance through linguistic choices such as polarity, modality,…

  14. Une Unite Discursive Restreinte: le Titre (A Restricted Discourse Unit: The Title).

    ERIC Educational Resources Information Center

    Vigner, Gerard

    1980-01-01

    Describes the functions, specific uses, syntactic structure, and typographical characteristics of titles, discussing examples from newspapers, books, films, and scientific journals. Analysis of the semantic relationship between title and text is followed by the description of various instructional techniques for the production of titles and the…

  15. Through the Looking Glass: Symmetry in Behavioral Principles?

    ERIC Educational Resources Information Center

    Marr, M. Jackson

    2006-01-01

    In this article, the author discusses and presents seven possibilities that describe how symmetry principles are reflected in behavior analysis. First, if there are apparently no functional distinctions to be made between positive and negative reinforcement, then reinforcer effectiveness (by various measures) is invariant under a simple inversion…

  16. Burst and Principal Components Analyses of MEA Data for 16 Chemicals Describe at Least Three Effects Classes.

    EPA Science Inventory

    Microelectrode arrays (MEAs) detect drug and chemical induced changes in neuronal network function and have been used for neurotoxicity screening. As a proof-of-concept, the current study assessed the utility of analytical "fingerprinting" using Principal Components Analysis (P...

  17. Decision-Making Theory Applied to Architectural Programming: Some Research Implications.

    ERIC Educational Resources Information Center

    Green, Meg

    The implications of delineating and determining the sequence of programming decisions are shown in the selection of building committee membership. The role relationships of client and architect are discussed in terms of decision-making function. Decision tables are described as aids in problem analysis. Other topics include information and…

  18. Further Evaluation of Emerging Speech in Children with Developmental Disabilities: Training Verbal Behavior

    ERIC Educational Resources Information Center

    Kelley, M. E.; Shillingsburg, M.A.; Castro, M. J.; Addison, L. R.; LaRue, R. H., Jr.

    2007-01-01

    Many effective language-training programs are conceptually based on Skinner's (1957) analysis of verbal behavior. Skinner described several elementary verbal operants, including mands, tacts, intraverbals, and echoics. According to Skinner, responses of the same topography may actually be functionally independent. Previous…

  19. Learning in the Making: A Comparative Case Study of Three Makerspaces

    ERIC Educational Resources Information Center

    Sheridan, Kimberly M.; Halverson, Erica Rosenfeld; Litts, Breanne K.; Brahms, Lisa; Jacobs-Priebe, Lynette; Owens, Trevor

    2014-01-01

    Through a comparative case study, Sheridan and colleagues explore how makerspaces may function as learning environments. Drawing on field observations, interviews, and analysis of artifacts, videos, and other documents, the authors describe features of three makerspaces and how participants learn and develop through complex design and making…

  20. The TROPOMI Telescope

    NASA Astrophysics Data System (ADS)

    Nijkerk, David; van Venrooy, Bart; Van Doorn, Peter; Henselmans, Rens; Draaisma, Folkert; Hoogstrate, André

    2017-11-01

    In this paper, we discuss the two-mirror pushbroom telescope for TROPOMI. Using freeform optics, it has unprecedented resolution. The complete cycle of freeform optical design, analysis, manufacturing, metrology and functional test on a breadboard setup is described, focusing on the specific complexities concerning freeforms. The TROPOMI flight telescope will be manufactured in summer 2012.

  1. Infant Sign Training and Functional Analysis

    ERIC Educational Resources Information Center

    Normand, Matthew P.; Machado, Mychal A.; Hustyi, Kristin M.; Morley, Allison J.

    2011-01-01

    We taught manual signs to typically developing infants using a reversal design and caregiver-nominated stimuli. We delivered the stimuli on a time-based schedule during baseline. During the intervention, we used progressive prompting and reinforcement, described by Thompson et al. (2004, 2007), to establish mands. Following sign training, we…

  2. Analysis of autostereoscopic three-dimensional images using multiview wavelets.

    PubMed

    Saveljev, Vladimir; Palchikova, Irina

    2016-08-10

    We propose that multiview wavelets can be used in processing multiview images. The reference functions for the synthesis/analysis of multiview images are described. The synthesized binary images were observed experimentally as three-dimensional visual images. The symmetric multiview B-spline wavelets are proposed. The locations recognized in the continuous wavelet transform correspond to the layout of the test objects. The proposed wavelets can be applied to the multiview, integral, and plenoptic images.

  3. Stretched hydrogen molecule from a constrained-search density-functional perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valone, Steven M; Levy, Mel

    2009-01-01

    Constrained-search density functional theory gives valuable insights into the fundamentals of density functional theory. It provides exact results and bounds on the ground- and excited-state density functionals. An important advantage of the theory is that it gives guidance in the construction of functionals. Here, the authors engage constrained-search theory to explore issues associated with the functional behavior of 'stretched bonds' in molecular hydrogen. A constrained search is performed with familiar valence bond wavefunctions ordinarily used to describe molecular hydrogen. The effective one-electron Hamiltonian is computed and compared to the corresponding uncorrelated Hartree-Fock effective Hamiltonian. Analysis of the functional suggests the need to construct different functionals for the same density and to allow a competition among these functions. As a result, the correlation energy functional is composed explicitly of energy gaps from the different functionals.

  4. Polarimetric receiver in the forty gigahertz band: new instrument for the Q-U-I joint Tenerife experiment

    NASA Astrophysics Data System (ADS)

    Villa, Enrique; Cano, Juan L.; Aja, Beatriz; Terán, J. Vicente; de la Fuente, Luisa; Mediavilla, Ángel; Artal, Eduardo

    2018-03-01

    This paper describes the analysis, design and characterization of a polarimetric receiver developed for covering the 35 to 47 GHz frequency band in the new instrument aimed at completing the ground-based Q-U-I Joint Tenerife Experiment. This experiment is designed to measure polarization in the Cosmic Microwave Background. The described high frequency instrument is a HEMT-based array composed of 29 pixels. A thorough analysis of the behaviour of the proposed receiver, based on electronic phase switching, is presented for a noise-like linearly polarized input signal, simultaneously obtaining the I, Q and U Stokes parameters of the input signal. Wideband subsystems are designed, assembled and characterized for the polarimeter. Their performances are described, showing appropriate results within the 35-to-47 GHz frequency band. Functionality tests are performed at room and cryogenic temperatures with adequate results for both temperature conditions, which validate the receiver concept and performance.
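
    For reference, the Stokes parameters such a receiver ultimately delivers can be written directly in terms of two orthogonal complex field components. The sketch below states only the textbook relations and says nothing about the phase-switching electronics described in the paper; the function name and the assumption of calibrated, sampled field components are illustrative.

    ```python
    import numpy as np

    def stokes_from_orthogonal_fields(Ex, Ey):
        """Time-averaged I, Q, U Stokes parameters from two orthogonal complex
        field components (V is omitted, as the instrument targets linear polarization)."""
        I = np.mean(np.abs(Ex) ** 2 + np.abs(Ey) ** 2)
        Q = np.mean(np.abs(Ex) ** 2 - np.abs(Ey) ** 2)
        U = 2.0 * np.mean(np.real(Ex * np.conj(Ey)))
        return I, Q, U
    ```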

  5. Watershed analysis of the Salmon River watershed, Washington : hydrology

    USGS Publications Warehouse

    Bidlake, William R.

    2003-01-01

    The U.S. Geological Survey analyzed selected hydrologic conditions as part of a watershed analysis of the Salmon River watershed, Washington, conducted by the Quinault Indian Nation. The selected hydrologic conditions were analyzed according to a framework of hydrologic key questions that were identified for the watershed. The key questions were posed to better understand the natural, physical, and biological features of the watershed that control hydrologic responses; to better understand current streamflow characteristics, including peak and low flows; to describe any evidence that forest harvesting and road construction have altered frequency and magnitude of peak and low flows within the watershed; to describe what is currently known about the distribution and extent of wetlands and any impacts of land management activities on wetlands; and to describe how hydrologic monitoring within the watershed might help to detect future hydrologic change, to preserve critical ecosystem functions, and to protect public and private property.

  6. GAT: a graph-theoretical analysis toolbox for analyzing between-group differences in large-scale structural and functional brain networks.

    PubMed

    Hosseini, S M Hadi; Hoeft, Fumiko; Kesler, Shelli R

    2012-01-01

    In recent years, graph theoretical analyses of neuroimaging data have increased our understanding of the organization of large-scale structural and functional brain networks. However, tools for pipeline application of graph theory for analyzing the topology of brain networks are still lacking. In this report, we describe the development of a graph-analysis toolbox (GAT) that facilitates analysis and comparison of structural and functional brain networks. GAT provides a graphical user interface (GUI) that facilitates construction and analysis of brain networks, comparison of regional and global topological properties between networks, analysis of network hubs and modules, and analysis of resilience of the networks to random failure and targeted attacks. Area under a curve (AUC) and functional data analyses (FDA), in conjunction with permutation testing, are employed for testing the differences in network topologies; these analyses are less sensitive to the thresholding process. We demonstrated the capabilities of GAT by investigating the differences in the organization of regional gray-matter correlation networks in survivors of acute lymphoblastic leukemia (ALL) and healthy matched controls (CON). The results revealed an alteration in small-world characteristics of the brain networks in the ALL survivors, an observation that confirms our hypothesis of widespread neurobiological injury in ALL survivors. Along with demonstrating the capabilities of GAT, this is the first report of altered large-scale structural brain networks in ALL survivors.
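
    The kind of network construction and small-world comparison that GAT automates can be sketched with NetworkX as follows. The density-based threshold, the random-graph null model, and the function names are illustrative assumptions rather than GAT's implementation, and the path-length step assumes the thresholded graph is connected.

    ```python
    import numpy as np
    import networkx as nx

    def graph_from_correlations(corr, density=0.15):
        """Binary graph keeping the strongest correlations at a fixed edge density,
        mirroring one point of the threshold sweep such toolboxes perform."""
        n = corr.shape[0]
        iu = np.triu_indices(n, k=1)
        weights = corr[iu]
        cutoff = np.sort(weights)[-max(1, int(density * len(weights)))]
        G = nx.Graph()
        G.add_nodes_from(range(n))
        for (i, j), w in zip(zip(*iu), weights):
            if w >= cutoff:
                G.add_edge(int(i), int(j))
        return G

    def small_world_sigma(G, n_rand=20, seed=0):
        """Clustering and path length of G relative to degree-matched random graphs;
        sigma > 1 is commonly read as small-world organization."""
        C = nx.average_clustering(G)
        L = nx.average_shortest_path_length(G)        # assumes G is connected
        rng = np.random.default_rng(seed)
        degrees = [d for _, d in G.degree()]
        Cr, Lr = [], []
        for _ in range(n_rand):
            R = nx.expected_degree_graph(degrees, seed=int(rng.integers(10**9)), selfloops=False)
            if nx.is_connected(R):                    # skip disconnected null samples
                Cr.append(nx.average_clustering(R))
                Lr.append(nx.average_shortest_path_length(R))
        if not Cr:
            return C, L, float("nan")
        return C, L, (C / np.mean(Cr)) / (L / np.mean(Lr))
    ```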

  7. Extracting a shape function for a signal with intra-wave frequency modulation.

    PubMed

    Hou, Thomas Y; Shi, Zuoqiang

    2016-04-13

    In this paper, we develop an effective and robust adaptive time-frequency analysis method for signals with intra-wave frequency modulation. To handle this kind of signals effectively, we generalize our data-driven time-frequency analysis by using a shape function to describe the intra-wave frequency modulation. The idea of using a shape function in time-frequency analysis was first proposed by Wu (Wu 2013 Appl. Comput. Harmon. Anal. 35, 181-199. (doi:10.1016/j.acha.2012.08.008)). A shape function could be any smooth 2π-periodic function. Based on this model, we propose to solve an optimization problem to extract the shape function. By exploring the fact that the shape function is a periodic function with respect to its phase function, we can identify certain low-rank structure of the signal. This low-rank structure enables us to extract the shape function from the signal. Once the shape function is obtained, the instantaneous frequency with intra-wave modulation can be recovered from the shape function. We demonstrate the robustness and efficiency of our method by applying it to several synthetic and real signals. One important observation is that this approach is very stable to noise perturbation. By using the shape function approach, we can capture the intra-wave frequency modulation very well even for noise-polluted signals. In comparison, existing methods such as empirical mode decomposition/ensemble empirical mode decomposition seem to have difficulty in capturing the intra-wave modulation when the signal is polluted by noise. © 2016 The Author(s).
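
    A simplified stand-in for the shape-function extraction described above is a least-squares fit of a truncated Fourier series in the phase variable, assuming the phase is already known; the actual method solves an optimization problem that also recovers the phase and exploits the low-rank structure, which this sketch does not attempt.

    ```python
    import numpy as np

    def fit_shape_function(values, phase, n_harmonics=8):
        """Least-squares fit of a 2*pi-periodic shape function s(theta), written as a
        truncated Fourier series, to signal samples with known phase values."""
        def basis(theta):
            cols = [np.ones_like(theta)]
            for k in range(1, n_harmonics + 1):
                cols.append(np.cos(k * theta))
                cols.append(np.sin(k * theta))
            return np.column_stack(cols)

        coeffs, *_ = np.linalg.lstsq(basis(phase), values, rcond=None)
        return coeffs, lambda theta: basis(np.asarray(theta)) @ coeffs
    ```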

  8. GADRAS Detector Response Function.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Dean J.; Harding, Lee; Thoreson, Gregory G

    2014-11-01

    The Gamma Detector Response and Analysis Software (GADRAS) applies a Detector Response Function (DRF) to compute the output of gamma-ray and neutron detectors when they are exposed to radiation sources. The DRF is fundamental to the ability to perform forward calculations (i.e., computation of the response of a detector to a known source), as well as the ability to analyze spectra to deduce the types and quantities of radioactive material to which the detectors are exposed. This document describes how gamma-ray spectra are computed and the significance of response function parameters that define characteristics of particular detectors.

  9. Umbral Calculus and Holonomic Modules in Positive Characteristic

    NASA Astrophysics Data System (ADS)

    Kochubei, Anatoly N.

    2006-03-01

    In the framework of analysis over local fields of positive characteristic, we develop algebraic tools for introducing and investigating various polynomial systems. In this survey paper we describe a function field version of umbral calculus developed on the basis of a relation of binomial type satisfied by the Carlitz polynomials. We consider modules over the Weyl-Carlitz ring, a function field counterpart of the Weyl algebra. It is shown that some basic objects of function field arithmetic, like the Carlitz module, Thakur's hypergeometric polynomials, and analogs of binomial coefficients arising in the positive characteristic version of umbral calculus, generate holonomic modules.

  10. SIENA Customer Problem Statement and Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L. Sauer; R. Clay; C. Adams

    2000-08-01

    This document describes the problem domain and functional requirements of the SIENA framework. The software requirements and system architecture of SIENA are specified in separate documents (called SIENA Software Requirement Specification and SIENA Software Architecture, respectively). While this version of the document currently describes the problems and captures the requirements within the Analysis domain (concentrating on finite element models), it is our intention to subsequently expand this document to describe problems and capture requirements from the Design and Manufacturing domains. In addition, SIENA is designed to be extendible to support and integrate elements from the other domains (see the SIENA Software Architecture document).

  11. Exponential Family Functional data analysis via a low-rank model.

    PubMed

    Li, Gen; Huang, Jianhua Z; Shen, Haipeng

    2018-05-08

    In many applications, non-Gaussian data such as binary or count are observed over a continuous domain and there exists a smooth underlying structure for describing such data. We develop a new functional data method to deal with this kind of data when the data are regularly spaced on the continuous domain. Our method, referred to as Exponential Family Functional Principal Component Analysis (EFPCA), assumes the data are generated from an exponential family distribution, and the matrix of the canonical parameters has a low-rank structure. The proposed method flexibly accommodates not only the standard one-way functional data, but also two-way (or bivariate) functional data. In addition, we introduce a new cross validation method for estimating the latent rank of a generalized data matrix. We demonstrate the efficacy of the proposed methods using a comprehensive simulation study. The proposed method is also applied to a real application of the UK mortality study, where data are binomially distributed and two-way functional across age groups and calendar years. The results offer novel insights into the underlying mortality pattern. © 2018, The International Biometric Society.

  12. Mass Spectrometry Analysis of Spatial Protein Networks by Colocalization Analysis (COLA).

    PubMed

    Mardakheh, Faraz K

    2017-01-01

    A major challenge in systems biology is comprehensive mapping of protein interaction networks. Crucially, such interactions are often dynamic in nature, necessitating methods that can rapidly mine the interactome across varied conditions and treatments to reveal change in the interaction networks. Recently, we described a fast mass spectrometry-based method to reveal functional interactions in mammalian cells on a global scale, by revealing spatial colocalizations between proteins (COLA) (Mardakheh et al., Mol Biosyst 13:92-105, 2017). As protein localization and function are inherently linked, significant colocalization between two proteins is a strong indication for their functional interaction. COLA uses rapid complete subcellular fractionation, coupled with quantitative proteomics to generate a subcellular localization profile for each protein quantified by the mass spectrometer. Robust clustering is then applied to reveal significant similarities in protein localization profiles, indicative of colocalization.

  13. Shape information from a critical point analysis of calculated electron density maps: application to DNA-drug systems

    NASA Astrophysics Data System (ADS)

    Leherte, L.; Allen, F. H.; Vercauteren, D. P.

    1995-04-01

    A computational method is described for mapping the volume within the DNA double helix accessible to a groove-binding antibiotic, netropsin. Topological critical point analysis is used to locate maxima in electron density maps reconstructed from crystallographically determined atomic coordinates. The peaks obtained in this way are represented as ellipsoids with axes related to local curvature of the electron density function. Combining the ellipsoids produces a single electron density function which can be probed to estimate effective volumes of the interacting species. Close complementarity between host and ligand in this example shows the method to be a good representation of the electron density function at various resolutions, while at the atomic level the ellipsoid method gives results which are in close agreement with those from the conventional spherical van der Waals approach.
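
    The peak-plus-ellipsoid representation can be imitated on a gridded density map with standard tools: locate local maxima, then use the eigen-decomposition of a finite-difference Hessian to orient and scale an ellipsoid at each peak. The sketch below (NumPy/SciPy, 3-D grid assumed, mean-density cutoff chosen arbitrarily) is an illustrative approximation, not the topological critical point analysis used in the paper.

    ```python
    import numpy as np
    from scipy import ndimage

    def density_peaks_as_ellipsoids(rho, spacing=1.0):
        """Represent local maxima of a 3-D density grid as ellipsoids whose axes follow
        the local curvature (eigenvectors of the finite-difference Hessian)."""
        local_max = (rho == ndimage.maximum_filter(rho, size=3)) & (rho > rho.mean())
        grads = np.gradient(rho, spacing)
        hessian = np.empty(rho.shape + (3, 3))
        for i, gi in enumerate(grads):
            second = np.gradient(gi, spacing)
            for j in range(3):
                hessian[..., i, j] = second[j]
        peaks = []
        for idx in zip(*np.nonzero(local_max)):
            evals, evecs = np.linalg.eigh(hessian[idx])
            # at a maximum the eigenvalues are negative; axis length ~ 1/sqrt(|curvature|)
            axes = 1.0 / np.sqrt(np.abs(evals) + 1e-12)
            peaks.append({"index": idx, "value": rho[idx], "axes": axes, "directions": evecs})
        return peaks
    ```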

  14. Shape information from a critical point analysis of calculated electron density maps: Application to DNA-drug systems

    NASA Astrophysics Data System (ADS)

    Leherte, Laurence; Allen, Frank H.

    1994-06-01

    A computational method is described for mapping the volume within the DNA double helix accessible to the groove-binding antibiotic netropsin. Topological critical point analysis is used to locate maxima in electron density maps reconstructed from crystallographically determined atomic coordinates. The peaks obtained in this way are represented as ellipsoids with axes related to local curvature of the electron density function. Combining the ellipsoids produces a single electron density function which can be probed to estimate effective volumes of the interacting species. Close complementarity between host and ligand in this example shows the method to give a good representation of the electron density function at various resolutions. At the atomic level, the ellipsoid method gives results which are in close agreement with those from the conventional spherical van der Waals approach.

  15. Genetic variant for behavioral regulation factor of executive function and its possible brain mechanism in attention deficit hyperactivity disorder.

    PubMed

    Sun, Xiao; Wu, Zhaomin; Cao, Qingjiu; Qian, Ying; Liu, Yong; Yang, Binrang; Chang, Suhua; Yang, Li; Wang, Yufeng

    2018-05-16

    As a childhood-onset psychiatric disorder, attention deficit hyperactivity disorder (ADHD) is complicated by phenotypic and genetic heterogeneity. Lifelong executive function deficits in ADHD are described in many studies and have been proposed as endophenotypes of ADHD. However, their genetic basis is still elusive. In this study, we performed a genome-wide association study of executive function, rated with the Behavioral Rating Inventory of Executive Function (BRIEF), in ADHD children. We identified one significant variant (rs852004, P = 2.51e-08) for the overall score of BRIEF. The association analyses for each component of executive function found that this locus was more strongly associated with the inhibit and monitor components. Further principal component analysis and confirmatory factor analysis provided an ADHD-specific executive function pattern including inhibit and monitor factors. SNP rs852004 was mainly associated with the Behavioral Regulation factor. Meanwhile, we found the significant locus was associated with ADHD symptoms. The Behavioral Regulation factor mediated its effect on ADHD symptoms. Functional magnetic resonance imaging (fMRI) analyses further showed evidence that this variant affected the activity of inhibition-control-related brain regions. These findings provide new insights into the genetic basis of executive function in ADHD.

  16. PRISM: Processing routines in IDL for spectroscopic measurements (installation manual and user's guide, version 1.0)

    USGS Publications Warehouse

    Kokaly, Raymond F.

    2011-01-01

    This report describes procedures for installing and using the U.S. Geological Survey Processing Routines in IDL for Spectroscopic Measurements (PRISM) software. PRISM provides a framework to conduct spectroscopic analysis of measurements made using laboratory, field, airborne, and space-based spectrometers. Using PRISM functions, the user can compare the spectra of materials of unknown composition with reference spectra of known materials. This spectroscopic analysis allows the composition of the material to be identified and characterized. Among its other functions, PRISM contains routines for the storage of spectra in database files, import/export of ENVI spectral libraries, importation of field spectra, correction of spectra to absolute reflectance, arithmetic operations on spectra, interactive continuum removal and comparison of spectral features, correction of imaging spectrometer data to ground-calibrated reflectance, and identification and mapping of materials using spectral feature-based analysis of reflectance data. This report provides step-by-step instructions for installing the PRISM software and running its functions.

  17. Integrated multidisciplinary optimization of rotorcraft: A plan for development

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M. (Editor); Mantay, Wayne R. (Editor)

    1989-01-01

    This paper describes a joint NASA/Army initiative at the Langley Research Center to develop optimization procedures aimed at improving the rotor blade design process by integrating appropriate disciplines and accounting for important interactions among the disciplines. The paper describes the optimization formulation in terms of the objective function, design variables, and constraints. Additionally, some of the analysis aspects are discussed, validation strategies are described, and an initial attempt at defining the interdisciplinary couplings is summarized. At this writing, significant progress has been made, principally in the areas of single discipline optimization. Accomplishments are described in areas of rotor aerodynamic performance optimization for minimum hover horsepower, rotor dynamic optimization for vibration reduction, and rotor structural optimization for minimum weight.

  18. Thermal power systems small power systems applications project. Decision analysis for evaluating and ranking small solar thermal power system technologies. Volume 1: A brief introduction to multiattribute decision analysis. [explanation of multiattribute decision analysis methods used in evaluating alternatives for small powered systems

    NASA Technical Reports Server (NTRS)

    Feinberg, A.; Miles, R. F., Jr.

    1978-01-01

    The principal concepts of the Keeney and Raiffa approach to multiattribute decision analysis are described. Topics discussed include the concepts of decision alternatives, outcomes, objectives, attributes and their states, attribute utility functions, and the necessary independence properties for the attribute states to be aggregated into a numerical representation of the preferences of the decision maker for the outcomes and decision alternatives.

  19. Perception and psychological evaluation for visual and auditory environment based on the correlation mechanisms

    NASA Astrophysics Data System (ADS)

    Fujii, Kenji

    2002-06-01

    In this dissertation, the correlation mechanism is introduced for modeling processes in visual perception. It has been well described that the correlation mechanism is effective for describing subjective attributes in auditory perception. The main result is that the correlation mechanism can be applied to processes in temporal and spatial vision, as well as in audition. (1) A psychophysical experiment was performed on subjective flicker rates for complex waveforms. A remarkable result is that the missing-fundamental phenomenon is found in temporal vision, analogous to auditory pitch perception. This implies the existence of a correlation mechanism in the visual system. (2) For spatial vision, autocorrelation analysis provides useful measures for describing three primary perceptual properties of visual texture: contrast, coarseness, and regularity. Another experiment showed that the degree of regularity is a salient cue for texture preference judgment. (3) In addition, the autocorrelation function (ACF) and inter-aural cross-correlation function (IACF) were applied to the analysis of the temporal and spatial properties of environmental noise. It was confirmed that the acoustical properties of aircraft noise and traffic noise are well described. These analyses provided useful parameters extracted from the ACF and IACF for assessing the subjective annoyance of noise. Thesis advisor: Yoichi Ando. Copies of this thesis, written in English, can be obtained from Junko Atagi, 6813 Mosonou, Saijo-cho, Higashi-Hiroshima 739-0024, Japan. E-mail address: atagi@urban.ne.jp.
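
    The ACF and IACF measures referred to above reduce, in their simplest discrete form, to normalized auto- and cross-correlations. The sketch below computes them with NumPy for equal-length channels, without the running-window or envelope processing used in the actual analyses; the function names are illustrative.

    ```python
    import numpy as np

    def normalized_acf(x, max_lag):
        """Normalized autocorrelation function of a zero-meaned signal up to max_lag."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        full = np.correlate(x, x, mode="full")[len(x) - 1:]
        return full[:max_lag + 1] / full[0]

    def normalized_iacf(left, right, max_lag):
        """Normalized interaural cross-correlation over +/- max_lag samples;
        the IACC is its maximum over that lag range."""
        l = np.asarray(left, dtype=float) - np.mean(left)
        r = np.asarray(right, dtype=float) - np.mean(right)
        cc = np.correlate(l, r, mode="full")
        mid = len(l) - 1                                    # zero-lag index
        window = cc[mid - max_lag: mid + max_lag + 1] / np.sqrt(np.sum(l**2) * np.sum(r**2))
        return window, float(np.max(window))
    ```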

  20. A multi-tissue type genome-scale metabolic network for analysis of whole-body systems physiology

    PubMed Central

    2011-01-01

    Background Genome-scale metabolic reconstructions provide a biologically meaningful mechanistic basis for the genotype-phenotype relationship. The global human metabolic network, termed Recon 1, has recently been reconstructed allowing the systems analysis of human metabolic physiology and pathology. Utilizing high-throughput data, Recon 1 has recently been tailored to different cells and tissues, including the liver, kidney, brain, and alveolar macrophage. These models have shown utility in the study of systems medicine. However, no integrated analysis between human tissues has been done. Results To describe tissue-specific functions, Recon 1 was tailored to describe metabolism in three human cells: adipocytes, hepatocytes, and myocytes. These cell-specific networks were manually curated and validated based on known cellular metabolic functions. To study intercellular interactions, a novel multi-tissue type modeling approach was developed to integrate the metabolic functions for the three cell types, and subsequently used to simulate known integrated metabolic cycles. In addition, the multi-tissue model was used to study diabetes: a pathology with systemic properties. High-throughput data was integrated with the network to determine differential metabolic activity between obese and type II obese gastric bypass patients in a whole-body context. Conclusion The multi-tissue type modeling approach presented provides a platform to study integrated metabolic states. As more cell and tissue-specific models are released, it is critical to develop a framework in which to study their interdependencies. PMID:22041191

  1. A FRMD7 variant in a Japanese family causes congenital nystagmus.

    PubMed

    Kohmoto, Tomohiro; Okamoto, Nana; Satomura, Shigeko; Naruto, Takuya; Komori, Takahide; Hashimoto, Toshiaki; Imoto, Issei

    2015-01-01

    Idiopathic congenital nystagmus (ICN) is a genetically heterogeneous eye movement disorder that causes a large proportion of childhood visual impairment. Here we describe a missense variant (p.L292P) within a mutation-rich region of FRMD7 detected in three affected male siblings in a Japanese family with X-linked ICN. Combining sequence analysis and results from structural and functional predictions, we report p.L292P as a variant potentially disrupting FRMD7 function associated with X-linked ICN.

  2. A FRMD7 variant in a Japanese family causes congenital nystagmus

    PubMed Central

    Kohmoto, Tomohiro; Okamoto, Nana; Satomura, Shigeko; Naruto, Takuya; Komori, Takahide; Hashimoto, Toshiaki; Imoto, Issei

    2015-01-01

    Idiopathic congenital nystagmus (ICN) is a genetically heterogeneous eye movement disorder that causes a large proportion of childhood visual impairment. Here we describe a missense variant (p.L292P) within a mutation-rich region of FRMD7 detected in three affected male siblings in a Japanese family with X-linked ICN. Combining sequence analysis and results from structural and functional predictions, we report p.L292P as a variant potentially disrupting FRMD7 function associated with X-linked ICN. PMID:27081518

  3. Limits in the application of harmonic analysis to pulsating stars

    NASA Astrophysics Data System (ADS)

    Pascual-Granado, J.; Garrido, R.; Suárez, J. C.

    2015-09-01

    Using ultra-precise data from space instrumentation, we found that the underlying functions of stellar light curves from some AF pulsating stars are non-analytic, and consequently their Fourier expansion is not guaranteed. This result demonstrates that periodograms do not provide a mathematically consistent estimator of the frequency content for this type of variable stars. More importantly, this constitutes the first counterexample against the current paradigm, which considers that any physical process is described by a continuous (band-limited) function that is infinitely differentiable.

  4. Added Value of Assessing Adnexal Masses with Advanced MRI Techniques

    PubMed Central

    Thomassin-Naggara, I.; Balvay, D.; Rockall, A.; Carette, M. F.; Ballester, M.; Darai, E.; Bazot, M.

    2015-01-01

    This review will present the added value of perfusion and diffusion MR sequences to characterize adnexal masses. These two functional MR techniques are readily available in routine clinical practice. We will describe the acquisition parameters and a method of analysis to optimize their added value compared with conventional images. We will then propose a model of interpretation that combines the anatomical and morphological information from conventional MRI sequences with the functional information provided by perfusion and diffusion weighted sequences. PMID:26413542

  5. Studies of radar backscatter as a function of wave properties and the winds in the turbulent marine atmosphere

    NASA Technical Reports Server (NTRS)

    Pierson, Willard J., Jr.; Sylvester, Winfield B.

    1995-01-01

    The research on model functions for ADEOS and ERS-1 is summarized, and an analysis of the differences between the three kinds of models is provided in this final report. The success of the AMI on ERS-1, obtained at GSFC and NMC, is highlighted. The problem of wind stress description is reviewed, and the scatterometer model being developed for high-wind monitoring for the AMI on ERS-1 and ERS-2 is described.

  6. Adding results to a meta-analysis: Theory and example

    NASA Astrophysics Data System (ADS)

    Willson, Victor L.

    Meta-analysis has been used as a research method to describe bodies of research data. It promotes hypothesis formation and the development of science education laws. A function overlooked, however, is the role it plays in updating research. Methods to integrate new research with meta-analysis results need explication. A procedure is presented using Bayesian analysis. Research on the correlation of science education attitudes with achievement has been published since a recent meta-analysis of the topic. The results show how new findings complement the previous meta-analysis and extend its conclusions. Additional methodological questions addressed are how studies are to be weighted, which variables are to be examined, and how often meta-analyses are to be updated.
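
    One simple version of the Bayesian updating idea treats the existing meta-analytic mean effect as a normal prior and folds in newly published study effects by precision weighting; this fixed-effect, conjugate-normal sketch and its numbers are assumptions for illustration and may differ from the procedure actually presented.

    ```python
    def bayesian_update(prior_mean, prior_var, new_effects, new_variances):
        """Precision-weighted update of a meta-analytic mean effect size with new
        study results (conjugate normal model, fixed-effect assumption)."""
        precision = 1.0 / prior_var
        weighted_sum = prior_mean / prior_var
        for d, v in zip(new_effects, new_variances):
            precision += 1.0 / v
            weighted_sum += d / v
        post_var = 1.0 / precision
        return weighted_sum * post_var, post_var

    # Hypothetical numbers: prior effect 0.30 (variance 0.01) updated with two new studies.
    posterior_mean, posterior_var = bayesian_update(0.30, 0.01, [0.25, 0.40], [0.02, 0.05])
    ```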

  7. Distributed Contour Trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Weber, Gunther H.

    2014-03-31

    Topological techniques provide robust tools for data analysis. They are used, for example, for feature extraction, for data de-noising, and for comparison of data sets. This chapter concerns contour trees, a topological descriptor that records the connectivity of the isosurfaces of scalar functions. These trees are fundamental to analysis and visualization of physical phenomena modeled by real-valued measurements. We study the parallel analysis of contour trees. After describing a particular representation of a contour tree, called the local-global representation, we illustrate how different problems that rely on contour trees can be solved in parallel with minimal communication.

  8. Structural analysis of cell wall polysaccharides using PACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mortimer, Jennifer C.

    The plant cell wall is composed of many complex polysaccharides. The composition and structure of the polysaccharides affect various cell properties including cell shape, cell function and cell adhesion. Many techniques to characterize polysaccharide structure are complicated, requiring expensive equipment and specialized operators e.g. NMR, MALDI-MS. PACE (Polysaccharide Analysis using Carbohydrate gel Electrophoresis) uses a simple, rapid technique to analyze polysaccharide quantity and structure (Goubet et al. 2002). Whilst the method here describes xylan analysis, it can be applied (by use of the appropriate glycosyl hydrolase) to any cell wall polysaccharide.

  9. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results are presented of the hazards analysis which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.

  10. The Papillomavirus E2 proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McBride, Alison A., E-mail: amcbride@nih.gov

    2013-10-15

    The papillomavirus E2 proteins are pivotal to the viral life cycle and have well characterized functions in transcriptional regulation, initiation of DNA replication and partitioning the viral genome. The E2 proteins also function in vegetative DNA replication, post-transcriptional processes and possibly packaging. This review describes structural and functional aspects of the E2 proteins and their binding sites on the viral genome. It is intended to be a reference guide to this viral protein. - Highlights: • Overview of E2 protein functions. • Structural domains of the papillomavirus E2 proteins. • Analysis of E2 binding sites in different genera of papillomaviruses. • Compilation of E2 associated proteins. • Comparison of key mutations in distinct E2 functions.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helander, Sara; Montecchio, Meri; Lemak, Alexander

    Highlights: • We describe the structure of a novel fold in FKBP25 and HectD1. • The new fold is named the Basic Tilted Helix Bundle (BTHB) domain. • A conserved basic surface patch is presented, suggesting a functional role. - Abstract: In this paper, we describe the structure of an N-terminal domain motif in nuclear-localized FKBP25(1–73), a member of the FKBP family, together with the structure of a sequence-related subdomain of the E3 ubiquitin ligase HectD1 that we show belongs to the same fold. This motif adopts a compact 5-helix bundle which we name the Basic Tilted Helix Bundle (BTHB) domain. A positively charged surface patch, structurally centered around the tilted helix H4, is present in both FKBP25 and HectD1 and is conserved in both proteins, suggesting a conserved functional role. We provide detailed comparative analysis of the structures of the two proteins and their sequence similarities, and analysis of the interaction of the proposed FKBP25 binding protein YY1. We suggest that the basic motif in BTHB is involved in the observed DNA binding of FKBP25, and that the function of this domain can be affected by regulatory YY1 binding and/or interactions with adjacent domains.

  12. The science of rotator cuff tears: translating animal models to clinical recommendations using simulation analysis.

    PubMed

    Mannava, Sandeep; Plate, Johannes F; Tuohy, Christopher J; Seyler, Thorsten M; Whitlock, Patrick W; Curl, Walton W; Smith, Thomas L; Saul, Katherine R

    2013-07-01

    The purpose of this article is to review basic science studies using various animal models for rotator cuff research and to describe structural, biomechanical, and functional changes to muscle following rotator cuff tears. The use of computational simulations to translate the findings from animal models to human scale is further detailed. A comprehensive review was performed of the basic science literature describing the use of animal models and simulation analysis to examine muscle function following rotator cuff injury and repair in the ageing population. The findings from various studies of rotator cuff pathology emphasize the importance of preventing permanent muscular changes with detrimental results. In vivo muscle function, electromyography, and passive muscle-tendon unit properties were studied before and after supraspinatus tenotomy in a rodent rotator cuff injury model (acute vs chronic). Then, a series of simulation experiments were conducted using a validated computational human musculoskeletal shoulder model to assess both passive and active tension of rotator cuff repairs based on surgical positioning. Outcomes of rotator cuff repair may be improved by earlier surgical intervention, with lower surgical repair tensions and fewer electromyographic neuromuscular changes. An integrated approach of animal experiments, computer simulation analyses, and clinical studies may allow us to gain a fundamental understanding of the underlying pathology and interpret the results for clinical translation.

  13. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. The throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. The efforts described in this paper are intended to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards increasing critically needed throughput. Published by Elsevier B.V.

  14. Applying behavior analysis to clinical problems: review and analysis of habit reversal.

    PubMed Central

    Miltenberger, R G; Fuqua, R W; Woods, D W

    1998-01-01

    This article provides a review and analysis of habit reversal, a multicomponent procedure developed by Azrin and Nunn (1973, 1974) for the treatment of nervous habits, tics, and stuttering. The article starts with a discussion of the behaviors treated with habit reversal, behavioral covariation among habits, and functional analysis and assessment of habits. Research on habit reversal and simplified versions of the procedure is then described. Next the article discusses the limitations of habit reversal and the evidence for its generality. The article concludes with an analysis of the behavioral processes involved in habit reversal and suggestions for future research. PMID:9757583

  15. Annual Cycle of Surface Longwave Radiation

    NASA Technical Reports Server (NTRS)

    Mlynczak, Pamela E.; Smith, G. Louis; Wilber, Anne C.; Stackhouse, Paul W.

    2011-01-01

    The annual cycles of upward and downward longwave fluxes at the Earth's surface are investigated by use of the NASA/GEWEX Surface Radiation Budget Data Set. Because of the immense difference between the heat capacity of land and ocean, the surface of Earth is partitioned into these two categories. Principal component analysis is used to quantify the annual cycles. Over land, the first principal component describes over 95% of the variance of the annual cycle of the upward and downward longwave fluxes. Over ocean the first term describes more than 87% of these annual cycles. Empirical orthogonal functions show the corresponding geographical distributions of these cycles. Phase plane diagrams of the annual cycles of upward longwave fluxes as a function of net shortwave flux show the thermal inertia of land and ocean.
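
    The principal-component (empirical orthogonal function) decomposition of the annual cycle can be sketched with a plain SVD of a month-by-gridpoint anomaly matrix; the area weighting and the land/ocean partitioning used in the study are omitted here, and the function name is illustrative.

    ```python
    import numpy as np

    def eof_analysis(monthly_flux, n_modes=3):
        """EOF/PCA of a (12 x N) matrix of monthly-mean fluxes: leading spatial patterns,
        their monthly amplitudes, and the fraction of variance each mode explains."""
        anomalies = monthly_flux - monthly_flux.mean(axis=0, keepdims=True)
        U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
        explained = s**2 / np.sum(s**2)
        pcs = U[:, :n_modes] * s[:n_modes]     # temporal amplitudes (one column per mode)
        eofs = Vt[:n_modes]                    # spatial patterns (one row per mode)
        return eofs, pcs, explained[:n_modes]
    ```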

  16. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    NASA Technical Reports Server (NTRS)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  17. Compared effects of missense mutations in Very-Long-Chain Acyl-CoA Dehydrogenase deficiency: Combined analysis by structural, functional and pharmacological approaches.

    PubMed

    Gobin-Limballe, Stéphanie; McAndrew, Ryan P; Djouadi, Fatima; Kim, Jung-Ja; Bastin, Jean

    2010-05-01

    Very-Long-Chain Acyl-CoA Dehydrogenase deficiency (VLCADD) is an autosomal recessive disorder considered one of the more common β-oxidation defects, possibly associated with neonatal cardiomyopathy, infantile hepatic coma, or adult-onset myopathy. Numerous gene missense mutations have been described in these VLCADD phenotypes, but only a few of them have been structurally and functionally analyzed, and the molecular basis of disease variability is still poorly understood. To address this question, we first analyzed fourteen disease-causing amino acid changes using the recently described crystal structure of VLCAD. The predicted effects varied from the replacement of amino acid residues lining the substrate binding cavity, involved in holoenzyme-FAD interactions or in enzyme dimerisation, predicted to have severe functional consequences, to amino acid substitutions outside key enzyme domains or lying near the enzyme surface, with predicted milder consequences. These data were combined with functional analysis of residual fatty acid oxidation (FAO) and VLCAD protein levels in patient cells harboring these mutations, before and after pharmacological stimulation by bezafibrate. Mutations identified as detrimental to the protein structure in the 3-D model were generally associated with profound FAO and VLCAD protein deficiencies in the patient cells; however, some mutations affecting FAD binding or monomer-monomer interactions allowed a partial response to bezafibrate. On the other hand, bezafibrate restored near-normal FAO rates for some mutations predicted to have milder consequences on enzyme structure. Overall, the combination of structural, biochemical, and pharmacological analysis allowed assessment of the relative severity of individual mutations, with possible applications for disease management and therapeutic approaches. Copyright 2010 Elsevier B.V. All rights reserved.

  18. Generalized hydrodynamic correlations and fractional memory functions

    NASA Astrophysics Data System (ADS)

    Rodríguez, Rosalio F.; Fujioka, Jorge

    2015-12-01

    A fractional generalized hydrodynamic (GH) model of the longitudinal velocity fluctuations correlation, and its associated memory function, for a complex fluid is analyzed. The adiabatic elimination of fast variables introduces memory effects in the transport equations, and the dynamics of the fluctuations are described by a generalized Langevin equation with long-range noise correlations. These features motivate the introduction of Caputo time-fractional derivatives and allow us to calculate analytic expressions for the fractional longitudinal velocity correlation function and its associated memory function. Our analysis eliminates a spurious constant term that appears in the memory function of the non-fractional description. It also produces a significantly slower power-law decay of the memory function in the GH regime that reduces to the well-known exponential decay in the non-fractional Navier-Stokes limit.
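
    For reference, the Caputo time-fractional derivative of order 0 < α < 1 that underlies such fractional generalizations is conventionally defined as (standard definition, not quoted from the paper):

      {}^{C}D_t^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)} \int_0^{t} \frac{f'(\tau)}{(t-\tau)^{\alpha}} \, d\tau ,

    which reduces to the ordinary first derivative as α → 1, consistent with recovering the non-fractional Navier-Stokes limit mentioned above.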

  19. Integrated command, control, communications and computation system functional architecture

    NASA Technical Reports Server (NTRS)

    Cooley, C. G.; Gilbert, L. E.

    1981-01-01

    The functional architecture for an integrated command, control, communications, and computation system applicable to the command and control portion of the NASA End-to-End Data System is described, including the downlink data processing and analysis functions required to support the uplink processes. The functional architecture is composed of four elements: (1) the functional hierarchy, which provides the decomposition and allocation of the command and control functions to the system elements; (2) the key system features, which summarize the major system capabilities; (3) the operational activity threads, which illustrate the interrelationship between the system elements; and (4) the interfaces, which illustrate those elements that originate or generate data and those elements that use the data. The interfaces also provide a description of the data and the data utilization and access techniques.

  20. Analysis of crew functions as an aid in Space Station interior layout

    NASA Technical Reports Server (NTRS)

    Steinberg, A. L.; Tullis, Thomas S.; Bied, Barbra

    1986-01-01

    The Space Station must be designed to facilitate all of the functions that its crew will perform, both on-duty and off-duty, as efficiently and comfortably as possible. This paper examines the functions to be performed by the Space Station crew in order to make inferences about the design of an interior layout that optimizes crew productivity. Twenty-seven crew functions were defined, as well as five criteria for assessing relationships among all pairs of those functions. Hierarchical clustering and multidimensional scaling techniques were used to visually summarize the relationships. A key result was the identification of two dimensions for describing the configuration of crew functions: 'Private-Public' and 'Group-Individual'. Seven specific recommendations for Space Station interior layout were derived from the analyses.
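
    A minimal sketch of this style of analysis (illustrative only; the 27 x 27 relationship matrix below is synthetic, whereas the study derived its matrix from five assessment criteria rated over all pairs of crew functions):

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform
      from sklearn.manifold import MDS

      rng = np.random.default_rng(1)
      n = 27                                  # number of crew functions
      sim = rng.uniform(size=(n, n))
      sim = (sim + sim.T) / 2.0               # symmetric similarity matrix
      np.fill_diagonal(sim, 1.0)

      dist = 1.0 - sim                        # convert similarity to dissimilarity
      np.fill_diagonal(dist, 0.0)

      # Hierarchical clustering on the condensed distance matrix
      clusters = fcluster(linkage(squareform(dist), method="average"),
                          t=4, criterion="maxclust")

      # Two-dimensional scaling, analogous to the 'Private-Public' and
      # 'Group-Individual' dimensions reported in the paper
      coords = MDS(n_components=2, dissimilarity="precomputed",
                   random_state=0).fit_transform(dist)
      print(clusters, coords.shape)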

  1. Efficient and Flexible Computation of Many-Electron Wave Function Overlaps.

    PubMed

    Plasser, Felix; Ruckenbauer, Matthias; Mai, Sebastian; Oppel, Markus; Marquetand, Philipp; González, Leticia

    2016-03-08

    A new algorithm for the computation of the overlap between many-electron wave functions is described. This algorithm allows for the extensive use of recurring intermediates and thus provides high computational efficiency. Because of the general formalism employed, overlaps can be computed for varying wave function types, molecular orbitals, basis sets, and molecular geometries. This paves the way for efficiently computing nonadiabatic interaction terms for dynamics simulations. In addition, other application areas can be envisaged, such as the comparison of wave functions constructed at different levels of theory. Aside from explaining the algorithm and evaluating the performance, a detailed analysis of the numerical stability of wave function overlaps is carried out, and strategies for overcoming potential severe pitfalls due to displaced atoms and truncated wave functions are presented.
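
    As background (a standard result, not reproduced from the article): for two single Slater determinants built from orthonormal orbital sets {φ_i^A} and {φ_j^B}, the many-electron overlap reduces to the determinant of the one-electron orbital overlap matrix,

      \langle \Psi_A | \Psi_B \rangle = \det \mathbf{S}, \qquad S_{ij} = \langle \phi_i^{A} | \phi_j^{B} \rangle ,

    and multiconfigurational wave functions lead to sums of many such determinants, which is why reusable intermediates become important for efficiency.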

  2. Exposing the QCD Splitting Function with CMS Open Data.

    PubMed

    Larkoski, Andrew; Marzani, Simone; Thaler, Jesse; Tripathee, Aashish; Xue, Wei

    2017-09-29

    The splitting function is a universal property of quantum chromodynamics (QCD) which describes how energy is shared between partons. Despite its ubiquitous appearance in many QCD calculations, the splitting function cannot be measured directly, since it always appears multiplied by a collinear singularity factor. Recently, however, a new jet substructure observable was introduced which asymptotes to the splitting function for sufficiently high jet energies. This provides a way to expose the splitting function through jet substructure measurements at the Large Hadron Collider. In this Letter, we use public data released by the CMS experiment to study the two-prong substructure of jets and test the 1→2 splitting function of QCD. To our knowledge, this is the first ever physics analysis based on the CMS Open Data.
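
    For orientation, the lowest-order quark → quark + gluon splitting function of QCD has the standard (unregularized) form quoted below; this is textbook background rather than a result of the Letter, whose observable is a groomed momentum-sharing variable that asymptotes to such a function at high jet energy:

      P_{q\to qg}(z) = C_F \, \frac{1 + z^2}{1 - z}, \qquad C_F = \frac{4}{3},

    where z is the momentum fraction retained by the quark after the emission.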

  3. MTF Analysis of LANDSAT-4 Thematic Mapper

    NASA Technical Reports Server (NTRS)

    Schowengerdt, R.

    1984-01-01

    A research program to measure the LANDSAT 4 Thematic Mapper (TM) modulation transfer function (MTF) is described. Measurement of a satellite sensor's MTF requires the use of a calibrated ground target, i.e., the spatial radiance distribution of the target must be known to a resolution at least four to five times greater than that of the system under test. A small reflective mirror or a dark-light linear pattern, such as a line or edge, together with relatively high-resolution underflight imagery, is used to calibrate the target. A technique that utilizes an analytical model for the scene spatial frequency power spectrum will be investigated as an alternative to calibration of the scene. The test sites and analysis techniques are also described.

  4. Integrated propulsion for near-Earth space missions. Volume 2: Technical

    NASA Technical Reports Server (NTRS)

    Dailey, C. L.; Meissinger, H. F.; Lovberg, R. H.; Zafran, S.

    1981-01-01

    The calculation approach is described for parametric analysis of candidate electric propulsion systems employed in LEO to GEO missions. Occultation relations, atmospheric density effects, and natural radiation effects are presented. A solar cell cover glass tradeoff is performed to determine optimum glass thickness. Solar array and spacecraft pointing strategies are described for low altitude flight and for optimum array illumination during ascent. Mass ratio tradeoffs versus transfer time provide direction for thruster technology improvements. Integrated electric propulsion analysis is performed for orbit boosting, inclination change, attitude control, stationkeeping, repositioning, and disposal functions as well as power sharing with payload on orbit. Comparison with chemical auxiliary propulsion is made to quantify the advantages of integrated propulsion in terms of weight savings and concomitant launch cost savings.

  5. Scaling violations of the proton structure function F2 at small x

    NASA Astrophysics Data System (ADS)

    Abt, I.; Ahmed, T.; Andreev, V.; Andrieu, B.; Appuhn, R.-D.; Arpagaus, M.; Babaev, A.; Bärwolff, H.; Bán, J.; Baranov, P.; Barrelet, E.; Bartel, W.; Bassler, U.; Beck, H. P.; Behrend, H.-J.; Belousov, A.; Berger, Ch.; Bergstein, H.; Bernardi, G.; Bernet, R.; Bertrand-Coremans, G.; Besançon, M.; Biddulph, P.; Binder, E.; Bischoff, A.; Bizot, J. C.; Blobel, V.; Borras, K.; Bosetti, P. C.; Boudry, V.; Bourdarios, C.; Brasse, F.; Braun, U.; Braunschweig, W.; Bruncko, D.; Büngener, L.; Bürger, J.; Büsser, F. W.; Buniatian, A.; Burke, S.; Buschhorn, G.; Campbell, A. J.; Carli, T.; Charles, F.; Clarke, D.; Clegg, A. B.; Colombo, M.; Coughlan, J. A.; Courau, A.; Coutures, Ch.; Cozzika, G.; Criegee, L.; Cvach, J.; Dagoret, S.; Dainton, J. B.; Danilov, M.; Dann, A. W. E.; Dau, W. D.; David, M.; Deffur, E.; Delcourt, B.; Del Buono, L.; Devel, M.; De Roeck, A.; Dingus, P.; Dollfus, C.; Dowell, J. D.; Dreis, H. B.; Drescher, A.; Duboc, J.; Düllmann, D.; Dünger, O.; Duhm, H.; Ebbinghaus, R.; Eberle, M.; Ebert, J.; Ebert, T. R.; Eckerlin, G.; Efremenko, V.; Egli, S.; Eichenberger, S.; Eichler, R.; Eisele, F.; Eisenhandler, E.; Ellis, N. N.; Ellison, R. J.; Elsen, E.; Erdmann, M.; Evrard, E.; Favart, L.; Fedotov, A.; Feeken, D.; Felst, R.; Feltesse, J.; Fensome, I. F.; Ferencei, J.; Ferrarotto, F.; Flamm, K.; Flauger, W.; Fleischer, M.; Flieser, M.; Flügge, G.; Fomenko, A.; Fominykh, B.; Forbush, M.; Formánek, J.; Foster, J. M.; Franke, G.; Fretwurst, E.; Fuhrmann, P.; Gabathuler, E.; Gamerdinger, K.; Garvey, J.; Gayler, J.; Gellrich, A.; Gennis, M.; Genzel, H.; Gerhards, R.; Godfrey, L.; Goerlach, U.; Goerlich, L.; Gogitidze, N.; Goldberg, M.; Goodall, A. M.; Gorelov, I.; Goritchev, P.; Grab, C.; Grässler, H.; Grässler, R.; Greenshaw, T.; Greif, H.; Grindhammer, G.; Gruber, C.; Haack, J.; Hajduk, L.; Hamon, O.; Handschuh, D.; Hanlon, E. M.; Hapke, M.; Harjes, J.; Haydar, R.; Haynes, W. J.; Heatherington, J.; Hedberg, V.; Heinzelmann, G.; Henderson, R. C. W.; Henschel, H.; Herma, R.; Herynek, I.; Hildesheim, W.; Hill, P.; Hilton, C. D.; Hladký, J.; Hoeger, K. C.; Huet, Ph.; Hufnagel, H.; Huot, N.; Ibbotson, M.; Itterbeck, H.; Jabiol, M.-A.; Jacholkowska, A.; Jacobsson, C.; Jaffre, M.; Jansen, T.; Jönsson, L.; Johannsen, K.; Johnson, D. P.; Johnson, L.; Jung, H.; Kalmus, P. I. P.; Kasarian, S.; Kaschowitz, R.; Kasselmann, P.; Kathage, U.; Kaufmann, H. H.; Kenyon, I. R.; Kermiche, S.; Keuker, C.; Kiesling, C.; Klein, M.; Kleinwort, C.; Knies, G.; Ko, W.; Köhler, T.; Kolanoski, H.; Kole, F.; Kolya, S. D.; Korbel, V.; Korn, M.; Kostka, P.; Kotelnikov, S. K.; Krasny, M. W.; Krehbiel, H.; Krücker, D.; Krüger, U.; Kubenka, J. P.; Küster, H.; Kuhlen, M.; Kurča, T.; Kurzhöfer, J.; Kuznik, B.; Lacour, D.; Lamarche, F.; Lander, R.; Landon, M. P. J.; Lange, W.; Langkau, R.; Lanius, P.; Laporte, J. F.; Lebedev, A.; Leuschner, A.; Leverenz, C.; Levonian, S.; Lewin, D.; Ley, Ch.; Lindner, A.; Lindström, G.; Linsel, F.; Lipinski, J.; Loch, P.; Lohmander, H.; Lopez, G. C.; Lüers, D.; Magnussen, N.; Malinovski, E.; Mani, S.; Marage, P.; Marks, J.; Marshall, R.; Martens, J.; Martin, R.; Martyn, H.-U.; Martyniak, J.; Masson, S.; Mavroidis, A.; Maxfield, S. J.; McMahon, S. J.; Mehta, A.; Meier, K.; Mercer, D.; Merz, T.; Meyer, C. A.; Meyer, H.; Meyer, J.; Mikocki, S.; Milone, V.; Monnier, E.; Moreau, F.; Moreels, J.; Morris, J. V.; Müller, K.; Murín, P.; Murray, S. A.; Nagovizin, V.; Naroska, B.; Naumann, Th.; Newman, P. R.; Newton, D.; Neyret, D.; Nguyen, H. 
K.; Niebergall, F.; Niebuhr, C.; Nisius, R.; Nowak, G.; Noyes, G. W.; Nyberg, M.; Oberlack, H.; Obrock, U.; Olsson, J. E.; Orenstein, S.; Ould-Saada, F.; Pascaud, C.; Patel, G. D.; Peppel, E.; Peters, S.; Phillips, H. T.; Phillips, J. P.; Pichler, Ch.; Pilgram, W.; Pitzl, D.; Prell, S.; Prosi, R.; Rädel, G.; Raupach, F.; Rauschnabel, K.; Reimer, P.; Reinshagen, S.; Ribarics, P.; Riech, V.; Riedlberger, J.; Riess, S.; Rietz, M.; Robertson, S. M.; Robmann, P.; Roosen, R.; Rostovtsev, A.; Royon, C.; Rudowicz, M.; Ruffer, M.; Rusakov, S.; Rybicki, K.; Sahlmann, N.; Sanchez, E.; Sankey, D. P. C.; Savitsky, M.; Schacht, P.; Schleper, P.; von Schlippe, W.; Schmidt, C.; Schmidt, D.; Schmitz, W.; Schöning, A.; Schröder, V.; Schulz, M.; Schwab, B.; Schwind, A.; Scobel, W.; Seehausen, U.; Sell, R.; Semenov, A.; Shekelyan, V.; Sheviakov, I.; Shooshtari, H.; Shtarkov, L. N.; Siegmon, G.; Siewert, U.; Sirois, Y.; Skillicorn, I. O.; Smirnov, P.; Smith, J. R.; Smolik, L.; Soloviev, Y.; Spitzer, H.; Staroba, P.; Steenbock, M.; Steffen, P.; Steinberg, R.; Stella, B.; Stephens, K.; Stier, J.; Stösslein, U.; Strachota, J.; Straumann, U.; Struczinski, W.; Sutton, J. P.; Taylor, R. E.; Tchernyshov, V.; Thiebaux, C.; Thompson, G.; Tichomirov, I.; Truöl, P.; Turnau, J.; Tutas, J.; Urban, L.; Usik, A.; Valkar, S.; Valkarova, A.; Vallée, C.; Van Esch, P.; Vartapetian, A.; Vazdik, Y.; Vecko, M.; Verrecchia, P.; Vick, R.; Villet, G.; Vogel, E.; Wacker, K.; Walker, I. W.; Walther, A.; Weber, G.; Wegener, D.; Wegner, A.; Wellisch, H. P.; West, L. R.; Willard, S.; Winde, M.; Winter, G.-G.; Wolff, Th.; Womersley, L. A.; Wright, A. E.; Wulff, N.; Yiou, T. P.; Žáček, J.; Závada, P.; Zeitnitz, C.; Ziaeepour, H.; Zimmer, M.; Zimmermann, W.; Zomer, F.; H1 Collaboration

    1994-01-01

    An analysis is presented of scaling violations of the proton structure function F2(x, Q^2) measured with the H1 detector at HERA in the range of Bjorken x values between x = 3 × 10^-4 and 10^-2 for four-momentum transfers Q^2 larger than 8.7 GeV^2. The structure function F2(x, Q^2) is observed to rise linearly with ln Q^2. Under the assumption that the observed scaling violations at small x ⩽ 0.01 are described correctly by perturbative QCD, an estimate is obtained of the gluon distribution function G(x, Q_0^2) at Q_0^2 = 20 GeV^2.

  6. Resting state fMRI: A review on methods in resting state connectivity analysis and resting state networks.

    PubMed

    Smitha, K A; Akhil Raja, K; Arun, K M; Rajesh, P G; Thomas, Bejoy; Kapilamoorthy, T R; Kesavadas, Chandrasekharan

    2017-08-01

    Curiosity about what happens in the brain has existed since the beginning of humankind. Functional magnetic resonance imaging is a prominent tool that enables non-invasive examination, localisation and lateralisation of brain functions such as language and memory. In recent years, there has been an apparent shift in the focus of neuroscience research towards studies dealing with the brain at 'resting state'. Here the spotlight is on the intrinsic activity within the brain, in the absence of any sensory or cognitive stimulus. Analyses of functional brain connectivity in the resting state have revealed different resting state networks, which depict specific functions and varied spatial topology. Different statistical methods have been introduced to study resting state functional magnetic resonance imaging connectivity, yet they produce consistent results. In this article, we introduce the concept of resting state functional magnetic resonance imaging in detail, discuss the three most widely used methods for analysis, and describe a few of the resting state networks, including the brain regions involved, the associated cognitive functions, and the clinical applications of resting state functional magnetic resonance imaging. This review aims to highlight the utility and importance of studying resting state functional magnetic resonance imaging connectivity, underlining its complementary nature to task-based functional magnetic resonance imaging.

  7. Transforming User Needs into Functional Requirements for an Antibiotic Clinical Decision Support System

    PubMed Central

    Bright, T.J.

    2013-01-01

    Summary Background Many informatics studies use content analysis to generate functional requirements for system development. Explication of this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor of content analysis as applied in informatics studies. Objective To describe a user-centered approach for transforming emergent themes derived from focus group data into functional requirements for informatics solutions, and to illustrate these methods through the development of an antibiotic clinical decision support system (CDS). Methods The approach consisted of five steps: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link them with the identified informatics solutions and functional requirements. Results The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II identified five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements of an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated required data elements for each informatics solution. Conclusion This study presents one example explicating content analysis of focus group data and the process of moving from narrative data to functional requirements. The 5-step method was illustrated through the development of an antibiotic CDS system, resolving unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains. PMID:24454586

  8. Digital simulation of scalar optical diffraction: revisiting chirp function sampling criteria and consequences.

    PubMed

    Voelz, David G; Roggemann, Michael C

    2009-11-10

    Accurate simulation of scalar optical diffraction requires consideration of the sampling requirement for the phase chirp function that appears in the Fresnel diffraction expression. We describe three sampling regimes for FFT-based propagation approaches: ideally sampled, oversampled, and undersampled. Ideal sampling, where the chirp and its FFT both have values that match analytic chirp expressions, usually provides the most accurate results but can be difficult to realize in practical simulations. Under- or oversampling leads to a reduction in the available source plane support size, the available source bandwidth, or the available observation support size, depending on the approach and simulation scenario. We discuss three Fresnel propagation approaches: the impulse response/transfer function (angular spectrum) method, the single FFT (direct) method, and the two-step method. With illustrations and simulation examples we show the form of the sampled chirp functions and their discrete transforms, common relationships between the three methods under ideal sampling conditions, and define conditions and consequences to be considered when using nonideal sampling. The analysis is extended to describe the sampling limitations for the more exact Rayleigh-Sommerfeld diffraction solution.
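
    A minimal sketch of the transfer-function (angular spectrum) Fresnel propagator discussed above; the grid size, wavelength, and propagation distance are illustrative, and the "ideal" distance used here assumes the critical-sampling condition dx^2 = wavelength*z/N for an N x N grid of spacing dx.

      import numpy as np

      def fresnel_prop_tf(u_in, side_length, wavelength, z):
          """Propagate an N x N sampled field over distance z with the Fresnel
          transfer-function method (constant phase factor exp(jkz) omitted)."""
          n = u_in.shape[0]
          dx = side_length / n
          fx = np.fft.fftfreq(n, d=dx)                # spatial frequencies, FFT order
          FX, FY = np.meshgrid(fx, fx)
          H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
          return np.fft.ifft2(np.fft.fft2(u_in) * H)

      # Example: square aperture propagated to the (assumed) ideally sampled distance
      N, L, lam = 512, 0.5e-2, 0.5e-6                 # samples, side length (m), wavelength (m)
      x = (np.arange(N) - N / 2) * (L / N)
      X, Y = np.meshgrid(x, x)
      aperture = ((np.abs(X) < 1e-3) & (np.abs(Y) < 1e-3)).astype(complex)
      z_ideal = (L / N) ** 2 * N / lam                # dx**2 == lam * z / N
      u_out = fresnel_prop_tf(aperture, L, lam, z_ideal)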

  9. Quantitative topographic differentiation of the neonatal EEG.

    PubMed

    Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil

    2006-09-01

    To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain, and may improve the precision of diagnosing cerebral dysfunctions manifested by 'disorganization', 'dysrhythmia' or 'dysmaturity'. 21 healthy, full-term newborns were examined polygraphically during sleep (EEG-8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged and the number of variables was reduced by factor analysis. All factors identified by factor analysis were statistically significantly influenced by the location of derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both spectral features and features describing the shape and variability of the signal largely account for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis is capable of assessing the topographic characteristics of the neonatal EEG; it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. The discriminatory capability of the method is promising for its application in clinical practice.
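
    A minimal sketch of extracting spectral and shape/variability descriptors from one EEG segment (the synthetic trace, sampling rate, frequency bands, and feature choices are placeholders; the abstract does not list the 13 variables used):

      import numpy as np
      from scipy.signal import welch

      fs = 128                                    # assumed sampling rate (Hz)
      rng = np.random.default_rng(7)
      segment = rng.normal(size=5 * 60 * fs)      # 5-minute synthetic EEG trace

      freqs, psd = welch(segment, fs=fs, nperseg=4 * fs)
      df = freqs[1] - freqs[0]
      bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13)}
      band_power = {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
                    for name, (lo, hi) in bands.items()}

      # Simple shape/variability descriptors
      variability = segment.std()
      zero_crossings = np.count_nonzero(np.diff(np.signbit(segment)))
      print(band_power, variability, zero_crossings)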

  10. Payload Operations Control Center (POCC). [spacelab flight operations

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.; Noneman, S. R.; Terry, E. S.

    1981-01-01

    The Spacelab payload operations control center (POCC) timeline analysis program which is used to provide POCC activity and resource information as a function of mission time is described. This program is fully automated and interactive, and is equipped with tutorial displays. The tutorial displays are sufficiently detailed for use by a program analyst having no computer experience. The POCC timeline analysis program is designed to operate on the VAX/VMS version V2.1 computer system.

  11. Comments on Professor Lortie's Paper Entitled "The Cracked Cake of Educational Custom and Emerging Issues in Evaluation." Center for the Study of Evaluation of Instructional Programs Occasional Report No. 20.

    ERIC Educational Resources Information Center

    Gordon, C. Wayne

    This paper suggests that the variety of decision making proposed by Professor Lortie will not afford the luxury of evaluative systems of the kind he describes. Professor Gordon feels that, had Professor Lortie pursued a line of functional analysis of many outcomes, he would have arrived at an entirely new analysis of the justification for…

  12. An Examination of Expressive Functions in a Constructivist Model of Written Composing.

    ERIC Educational Resources Information Center

    West, Martha Meyer

    Based on constructivist theory (emphasizing meaning generation rather than communication), the study described in this report clarifies and elaborates on the definition of the term "expressive writing." First, the paper offers a rationale for the study and a discussion of the research methodology, which combined analysis of texts and…

  13. Assembly and analysis of changes in transcriptomes of dairy cattle rumen epithelial during lactation and dry periods

    USDA-ARS?s Scientific Manuscript database

    Lactation in dairy cattle is coupled with increased nutrient requirements for milk synthesis. Therefore, dairy cattle metabolism has to adapt to meet lactation-associated challenges and requires major functional adjustments of the rumen and whole digestive system. This report describes the use of ne...

  14. Adult Training Centres--The Trainees and Their Instructors

    ERIC Educational Resources Information Center

    Howard, Mary

    1975-01-01

    The author outlines the general functions, aims, teaching services, and training requirements of adult training centers for the mentally handicapped. She then describes in detail the preparation and use of a skills analysis program, including the provision of additional back-up materials. A lesson in ice cream making is the illustration.…

  15. A Self-Instructional Approach To the Teaching of Enzymology Involving Computer-Based Sequence Analysis and Molecular Modelling.

    ERIC Educational Resources Information Center

    Attwood, Paul V.

    1997-01-01

    Describes a self-instructional assignment approach to the teaching of advanced enzymology. Presents an assignment that offers a means of teaching enzymology to students that exposes them to modern computer-based techniques of analyzing protein structure and relates structure to enzyme function. (JRH)

  16. Objective Observation: A Socially Just Approach to Student Assessment

    ERIC Educational Resources Information Center

    Moineau, Suzanne; Heisler, Lori

    2013-01-01

    The authors describe an activity they developed for teacher candidates that: (1) demonstrated the natural tendency of the brain to engage in subjective analysis of human behavior; (2) instructed them on the difference between subjective and objective processing and the basic neurology underlying these cognitive functions; (3) engaged them in a…

  17. Adam Smith's Pins, Sausage Making and the Funding of College Education

    ERIC Educational Resources Information Center

    Barrett, Ralph V.

    2005-01-01

    Using the language and concepts of economic markets for the purpose of describing and evaluating the function and performance of educational institutions has been a common and growing practice throughout Western industrial societies for many years. The critique of such market analysis also has a long history. Critical assessments of market theory…

  18. Physical Analysis of an Electric Resistor Heating

    ERIC Educational Resources Information Center

    Perea Martins, J. E. M.

    2018-01-01

    This work describes a simple experiment to measure the resistor temperature as a function of the applied power and proves that it is an efficient way to introduce some important physical concepts in classroom, including the Joule's first law, hot-spot temperature, thermal resistance, thermal dissipation constant, time constant and the Newton's law…

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    Finding and identifying cryptography is a growing concern in the malware analysis community. In this paper, artificial neural networks are used to classify functional blocks from a disassembled program as either cryptography-related or not. The resulting system, referred to as NNLC (Neural Net for Locating Cryptography), is presented, and results of applying this system to various libraries are described.
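
    A minimal sketch of the underlying idea, classifying feature vectors of functional blocks with a small neural network; the opcode-histogram features and synthetic labels are placeholders, not the representation used by NNLC:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(42)
      n_blocks, n_opcodes = 2000, 64
      X = rng.poisson(lam=3.0, size=(n_blocks, n_opcodes)).astype(float)
      y = (X[:, :8].sum(axis=1) > 26).astype(int)   # toy labeling rule standing in for real labels

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
      clf.fit(X_train, y_train)
      print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")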

  20. An Analysis of Educational Policy: Implications for Minority Community Concerns.

    ERIC Educational Resources Information Center

    Harris, J. John, III; Ogle, Terry

    The paper presents a detailed overview of educational policymaking and discusses the need for minority groups to be involved in policy formation. The first section describes the distinguishing characteristics of the main elements of the functions of administration and policymaking process. The second section examines the following three models of…

  1. Media Consumption and Girls Who Want to Have Fun.

    ERIC Educational Resources Information Center

    Peterson, Eric E.

    1987-01-01

    Explores the nature and function of listening to music from the perspective of semiotic phenomenology. Claims that listening to music is a form of media consumption and a habitual practice that inscribes social meanings and organizes pleasure. Describes three stages through which the analysis of listening to music occurs: description, definition,…

  2. Treatment of Challenging Behavior Exhibited by Children with Prenatal Drug Exposure

    ERIC Educational Resources Information Center

    Kurtz, Patricia F.; Chin, Michelle D.; Rush, Karena S.; Dixon, Dennis R.

    2008-01-01

    A large body of literature exists describing the harmful effects of prenatal drug exposure on infant and child development. However, there is a paucity of research examining strategies to ameliorate sequelae such as externalizing behavior problems. In the present study, functional analysis procedures were used to assess challenging behavior…

  3. Foundation, Organization, and Purpose of the National Consortium for Computer-Based Musical Instruction

    ERIC Educational Resources Information Center

    Hofstetter, Fred T.

    1976-01-01

    This paper begins with a look at the present state of computer applications to music education. Instructional systems for instrumental music, music fundamentals, ear-training, set theory, composition, analysis, information retrieval, automated music printing and computer-managed instruction are discussed. The functions of the NCCBMI are described.…

  4. Finalizing the Consultant Effectiveness Scale: An Analysis and Validation of the Characteristics of Effective Consultants.

    ERIC Educational Resources Information Center

    Knoff, Howard M.; Hines, Constance V.; Kromrey, Jeffrey D.

    1995-01-01

    Proposes that as consultation becomes a larger part of the school psychologist's role and function, the need to empirically identify characteristics of effective consultants is increasingly important. Describes the Consultant Effectiveness Scale (CES) and reexamines it with a national sample of school psychologists. Evaluates discriminate validity…

  5. Logit Models for the Analysis of Two-Way Categorical Data

    ERIC Educational Resources Information Center

    Draxler, Clemens

    2011-01-01

    This article discusses the application of logit models for the analyses of 2-way categorical observations. The models described are generalized linear models using the logit link function. One of the models is the Rasch model (Rasch, 1960). The objective is to test hypotheses of marginal and conditional independence between explanatory quantities…

  6. San Diego's Capital Planning Process

    ERIC Educational Resources Information Center

    Lytton, Michael

    2009-01-01

    This article describes San Diego's capital planning process. As part of its capital planning process, the San Diego Unified School District has developed a systematic analysis of functional quality at each of its school sites. The advantage of this approach is that it seeks to develop and apply quantifiable metrics and standards for the more…

  7. Validated Test Method 1314: Liquid-Solid Partitioning as a Function of Liquid-Solid Ratio for Constituents in Solid Materials Using An Up-Flow Percolation Column Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  8. Use of an Academic Library Web Site Search Engine.

    ERIC Educational Resources Information Center

    Fagan, Jody Condit

    2002-01-01

    Describes an analysis of the search engine logs of Southern Illinois University, Carbondale's library to determine how patrons used the site search. Discusses results that showed patrons did not understand the function of the search and explains improvements that were made in the Web site and in online reference services. (Author/LRW)

  9. Micro-computed tomography of pupal metamorphosis in the solitary bee Megachile rotundata

    USDA-ARS?s Scientific Manuscript database

    Insect metamorphosis involves a complex change in form and function, but most of these changes are internal and treated as a black box. In this study, we examined development of the solitary bee, Megachile rotundata, using micro-computed tomography (µCT) and digital volume analysis. We describe deve...

  10. The Memory Mosaic Project and Presentation

    ERIC Educational Resources Information Center

    Smith, Cynthia Duquette

    2015-01-01

    This article describes a unit-length project involving students in the analysis of how public memory is shaped by multiple factors and functions persuasively to influence one's understanding of historical events. This project was designed for an upper-division undergraduate course in Rhetoric and Public Memory, but could be adapted for use in…

  11. Designing User Manuals for the Online Public Access Catalog.

    ERIC Educational Resources Information Center

    Seiden, Peggy; Sullivan, Patricia

    1986-01-01

    Describes the process of developing and revising a brochure to guide library patrons in conducting an author search on an online public access catalog in order to demonstrate the application of four steps in production of a functional document--analysis; planning; development; evaluation, testing, and revision. Three sources are given. (EJS)

  12. Scientific Benchmarks for Guiding Macromolecular Energy Function Improvement

    PubMed Central

    Leaver-Fay, Andrew; O’Meara, Matthew J.; Tyka, Mike; Jacak, Ron; Song, Yifan; Kellogg, Elizabeth H.; Thompson, James; Davis, Ian W.; Pache, Roland A.; Lyskov, Sergey; Gray, Jeffrey J.; Kortemme, Tanja; Richardson, Jane S.; Havranek, James J.; Snoeyink, Jack; Baker, David; Kuhlman, Brian

    2013-01-01

    Accurate energy functions are critical to macromolecular modeling and design. We describe new tools for identifying inaccuracies in energy functions and guiding their improvement, and illustrate the application of these tools to improvement of the Rosetta energy function. The feature analysis tool identifies discrepancies between structures deposited in the PDB and low energy structures generated by Rosetta; these likely arise from inaccuracies in the energy function. The optE tool optimizes the weights on the different components of the energy function by maximizing the recapitulation of a wide range of experimental observations. We use the tools to examine three proposed modifications to the Rosetta energy function: improving the unfolded state energy model (reference energies), using bicubic spline interpolation to generate knowledge-based torsional potentials, and incorporating the recently developed Dunbrack 2010 rotamer library (Shapovalov and Dunbrack, 2011). PMID:23422428
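
    Schematically (general form only, not the specific Rosetta terms), such an energy function is a weighted linear combination of component score terms, and weight fitting adjusts the w_i against experimental benchmarks:

      E_{\text{total}}(\mathbf{x}) = \sum_i w_i \, E_i(\mathbf{x}) .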

  13. The Fermi Large Area Telescope on Orbit: Event Classification, Instrument Response Functions, and Calibration

    NASA Technical Reports Server (NTRS)

    Ackermann, M.; Ajello, M.; Albert, A.; Allafort, A.; Atwood, W. B.; Axelsson, M.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.; hide

    2012-01-01

    The Fermi Large Area Telescope (Fermi-LAT, hereafter LAT), the primary instrument on the Fermi Gamma-ray Space Telescope (Fermi) mission, is an imaging, wide field-of-view, high-energy gamma-ray telescope, covering the energy range from 20 MeV to more than 300 GeV. During the first years of the mission the LAT team has gained considerable insight into the in-flight performance of the instrument. Accordingly, we have updated the analysis used to reduce LAT data for public release as well as the Instrument Response Functions (IRFs), the description of the instrument performance provided for data analysis. In this paper we describe the effects that motivated these updates. Furthermore, we discuss how we originally derived IRFs from Monte Carlo simulations and later corrected those IRFs for discrepancies observed between flight and simulated data. We also give details of the validations performed using flight data and quantify the residual uncertainties in the IRFs. Finally, we describe techniques the LAT team has developed to propagate those uncertainties into estimates of the systematic errors on common measurements such as fluxes and spectra of astrophysical sources.

  14. Special Features of Using Secondary Materials in the Interior Design of Public Dining Establishments

    NASA Astrophysics Data System (ADS)

    Kuznetsova, Irina; Hapchuk, Olena; Lukinov, Vitaly

    2017-10-01

    This article analyses the latest publications on the use and practical application of secondary resources as raw materials in design. The analysis is based on a list of secondary resources and their applications in interior decoration. In particular, the interiors of public catering enterprises were analysed. Restaurants with different functional purposes were identified and classified into several categories with specific peculiarities of interior design. This article presents and describes different types of public catering enterprises based on those categories. The interior design of a public catering enterprise is regarded as a considerably complex system. Different types of secondary materials were reviewed to identify the materials most frequently used for interior space design. This article describes the main peculiarities of the use of secondary materials and presents examples of their practical application. The functions of secondary materials in the interior design of public catering enterprises were identified and reviewed. On the basis of the analysis, several directions for the practical application of our results in the field of public catering enterprise design are suggested.

  15. An optogenetics- and imaging-assisted simultaneous multiple patch-clamp recording system for decoding complex neural circuits

    PubMed Central

    Wang, Guangfu; Wyskiel, Daniel R; Yang, Weiguo; Wang, Yiqing; Milbern, Lana C; Lalanne, Txomin; Jiang, Xiaolong; Shen, Ying; Sun, Qian-Quan; Zhu, J Julius

    2015-01-01

    Deciphering neuronal circuitry is central to understanding brain function and dysfunction, yet it remains a daunting task. To facilitate the dissection of neuronal circuits, a process requiring functional analysis of synaptic connections and morphological identification of interconnected neurons, we present here a method for stable simultaneous octuple patch-clamp recordings. This method allows physiological analysis of synaptic interconnections among 4–8 simultaneously recorded neurons and/or 10–30 sequentially recorded neurons, and it allows anatomical identification of >85% of recorded interneurons and >99% of recorded principal neurons. We describe how to apply the method to rodent tissue slices; however, it can be used on other model organisms. We also describe the latest refinements and optimizations of mechanics, electronics, optics and software programs that are central to the realization of a combined single- and two-photon microscopy–based, optogenetics- and imaging-assisted, stable, simultaneous quadruple–viguple patch-clamp recording system. Setting up the system, from the beginning of instrument assembly and software installation to full operation, can be completed in 3–4 d. PMID:25654757

  16. Integrative Analysis of Genetic, Genomic, and Phenotypic Data for Ethanol Behaviors: A Network-Based Pipeline for Identifying Mechanisms and Potential Drug Targets.

    PubMed

    Bogenpohl, James W; Mignogna, Kristin M; Smith, Maren L; Miles, Michael F

    2017-01-01

    Complex behavioral traits, such as alcohol abuse, are caused by an interplay of genetic and environmental factors, producing deleterious functional adaptations in the central nervous system. The long-term behavioral consequences of such changes are of substantial cost to both the individual and society. Substantial progress has been made in the last two decades in understanding elements of brain mechanisms underlying responses to ethanol in animal models and risk factors for alcohol use disorder (AUD) in humans. However, treatments for AUD remain largely ineffective and few medications for this disease state have been licensed. Genome-wide genetic polymorphism analysis (GWAS) in humans, behavioral genetic studies in animal models and brain gene expression studies produced by microarrays or RNA-seq have the potential to produce nonbiased and novel insight into the underlying neurobiology of AUD. However, the complexity of such information, both statistical and informational, has slowed progress toward identifying new targets for intervention in AUD. This chapter describes one approach for integrating behavioral, genetic, and genomic information across animal model and human studies. The goal of this approach is to identify networks of genes functioning in the brain that are most relevant to the underlying mechanisms of a complex disease such as AUD. We illustrate an example of how genomic studies in animal models can be used to produce robust gene networks that have functional implications, and to integrate such animal model genomic data with human genetic studies such as GWAS for AUD. We describe several useful analysis tools for such studies: ComBAT, WGCNA, and EW_dmGWAS. The end result of this analysis is a ranking of gene networks and identification of their cognate hub genes, which might provide eventual targets for future therapeutic development. Furthermore, this combined approach may also improve our understanding of basic mechanisms underlying gene x environmental interactions affecting brain functioning in health and disease.
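
    A minimal sketch of the co-expression network idea behind tools such as WGCNA (soft-thresholded correlation adjacency and hub-gene ranking); the expression matrix is synthetic, and this is not the actual ComBAT/WGCNA/EW_dmGWAS pipeline:

      import numpy as np

      rng = np.random.default_rng(3)
      n_samples, n_genes = 40, 200
      expression = rng.normal(size=(n_samples, n_genes))   # samples x genes

      corr = np.corrcoef(expression, rowvar=False)         # gene x gene correlation
      beta = 6                                             # assumed soft-threshold power
      adjacency = np.abs(corr) ** beta
      np.fill_diagonal(adjacency, 0.0)

      connectivity = adjacency.sum(axis=0)                 # per-gene network connectivity
      hub_genes = np.argsort(connectivity)[::-1][:10]      # candidate hub genes
      print(hub_genes)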

  17. INTEGRATIVE ANALYSIS OF GENETIC, GENOMIC AND PHENOTYPIC DATA FOR ETHANOL BEHAVIORS: A NETWORK-BASED PIPELINE FOR IDENTIFYING MECHANISMS AND POTENTIAL DRUG TARGETS

    PubMed Central

    Bogenpohl, James W.; Mignogna, Kristin M.; Smith, Maren L.; Miles, Michael F.

    2016-01-01

    Complex behavioral traits, such as alcohol abuse, are caused by an interplay of genetic and environmental factors, producing deleterious functional adaptations in the central nervous system. The long-term behavioral consequences of such changes are of substantial cost to both the individual and society. Substantial progress has been made in the last two decades in understanding elements of brain mechanisms underlying responses to ethanol in animal models and risk factors for alcohol use disorder (AUD) in humans. However, treatments for AUD remain largely ineffective and few medications for this disease state have been licensed. Genome-wide genetic polymorphism analysis (GWAS) in humans, behavioral genetic studies in animal models and brain gene expression studies produced by microarrays or RNA-seq have the potential to produce non-biased and novel insight into the underlying neurobiology of AUD. However, the complexity of such information, both statistical and informational, has slowed progress toward identifying new targets for intervention in AUD. This chapter describes one approach for integrating behavioral, genetic, and genomic information across animal model and human studies. The goal of this approach is to identify networks of genes functioning in the brain that are most relevant to the underlying mechanisms of a complex disease such as AUD. We illustrate an example of how genomic studies in animal models can be used to produce robust gene networks that have functional implications, and to integrate such animal model genomic data with human genetic studies such as GWAS for AUD. We describe several useful analysis tools for such studies: ComBAT, WGCNA and EW_dmGWAS. The end result of this analysis is a ranking of gene networks and identification of their cognate hub genes, which might provide eventual targets for future therapeutic development. Furthermore, this combined approach may also improve our understanding of basic mechanisms underlying gene x environmental interactions affecting brain functioning in health and disease. PMID:27933543

  18. Proposed method for determining the thickness of glass in solar collector panels

    NASA Technical Reports Server (NTRS)

    Moore, D. M.

    1980-01-01

    An analytical method was developed for determining the minimum thickness for simply supported, rectangular glass plates subjected to uniform normal pressure environmental loads such as wind, earthquake, snow, and deadweight. The method consists of comparing an analytical prediction of the stress in the glass panel to a glass breakage stress determined from fracture mechanics considerations. Based on extensive analysis using the nonlinear finite element structural analysis program ARGUS, design curves for the structural analysis of simply supported rectangular plates were developed. These curves yield the center deflection, center stress and corner stress as a function of a dimensionless parameter describing the load intensity. A method of estimating the glass breakage stress as a function of a specified failure rate, degree of glass temper, design life, load duration time, and panel size is also presented.

  19. GenoCAD Plant Grammar to Design Plant Expression Vectors for Promoter Analysis.

    PubMed

    Coll, Anna; Wilson, Mandy L; Gruden, Kristina; Peccoud, Jean

    2016-01-01

    With the rapid advances in prediction tools for the discovery of new promoters and their cis-elements, there is a need to improve plant expression methodologies in order to facilitate high-throughput functional validation of these promoters in planta. Promoter-reporter analysis is an indispensable approach for characterization of plant promoters. It requires the design of complex plant expression vectors, which can be challenging. Here, we describe the use of a plant grammar implemented in GenoCAD that will allow users to quickly design constructs for promoter analysis experiments but also for other in planta functional studies. The GenoCAD plant grammar includes a library of plant biological parts organized in structural categories to facilitate their use and management, and a set of rules that guides the process of assembling these biological parts into large constructs.

  20. On 3-D inelastic analysis methods for hot section components. Volume 1: Special finite element models

    NASA Technical Reports Server (NTRS)

    Nakazawa, S.

    1987-01-01

    This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes that permit more accurate and efficient three-dimensional analysis of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. This report is presented in two volumes. Volume 1 describes effort performed under Task 4B, Special Finite Element Special Function Models, while Volume 2 concentrates on Task 4C, Advanced Special Functions Models.
