ERIC Educational Resources Information Center
Volkan, Kevin; Simon, Steven R.; Baker, Harley; Todres, I. David
2004-01-01
Problem Statement and Background: While the psychometric properties of Objective Structured Clinical Examinations (OSCEs) have been studied, their latent structures have not been well characterized. This study examines a factor analytic model of a comprehensive OSCE and addresses implications for measurement of clinical performance. Methods: An…
A comprehensive analytical model of rotorcraft aerodynamics and dynamics. Part 2: User's manual
NASA Technical Reports Server (NTRS)
Johnson, W.
1980-01-01
The use of a computer program for a comprehensive analytical model of rotorcraft aerodynamics and dynamics is described. The program calculates the loads and motion of helicopter rotors and airframe. First the trim solution is obtained, then the flutter, flight dynamics, and/or transient behavior can be calculated. Either a new job can be initiated or further calculations can be performed for an old job.
A comprehensive analytical model of rotorcraft aerodynamics and dynamics. Part 3: Program manual
NASA Technical Reports Server (NTRS)
Johnson, W.
1980-01-01
The computer program for a comprehensive analytical model of rotorcraft aerodynamics and dynamics is described. This analysis is designed to calculate rotor performance, loads, and noise; the helicopter vibration and gust response; the flight dynamics and handling qualities; and the system aeroelastic stability. The analysis is a combination of structural, inertial, and aerodynamic models that is applicable to a wide range of problems and a wide class of vehicles. The analysis is intended for use in the design, testing, and evaluation of rotors and rotorcraft and to be a basis for further development of rotary wing theories.
Stakeholder perspectives on decision-analytic modeling frameworks to assess genetic services policy.
Guzauskas, Gregory F; Garrison, Louis P; Stock, Jacquie; Au, Sylvia; Doyle, Debra Lochner; Veenstra, David L
2013-01-01
Genetic services policymakers and insurers often make coverage decisions in the absence of complete evidence of clinical utility and under budget constraints. We evaluated genetic services stakeholder opinions on the potential usefulness of decision-analytic modeling to inform coverage decisions, and asked them to identify genetic tests for decision-analytic modeling studies. We presented an overview of decision-analytic modeling to members of the Western States Genetic Services Collaborative Reimbursement Work Group and state Medicaid representatives and conducted directed content analysis and an anonymous survey to gauge their attitudes toward decision-analytic modeling. Participants also identified and prioritized genetic services for prospective decision-analytic evaluation. Participants expressed dissatisfaction with current processes for evaluating insurance coverage of genetic services. Some participants expressed uncertainty about their comprehension of decision-analytic modeling techniques. All stakeholders reported openness to using decision-analytic modeling for genetic services assessments. Participants were most interested in application of decision-analytic concepts to multiple-disorder testing platforms, such as next-generation sequencing and chromosomal microarray. Decision-analytic modeling approaches may provide a useful decision tool to genetic services stakeholders and Medicaid decision-makers.
Comprehensive benefit analysis of regional water resources based on multi-objective evaluation
NASA Astrophysics Data System (ADS)
Chi, Yixia; Xue, Lianqing; Zhang, Hui
2018-01-01
The purpose of comprehensive benefit analysis of water resources is to maximize the combined benefits across the social, economic, and ecological-environmental dimensions. Aiming at the defects of the traditional analytic hierarchy process in the evaluation of water resources, this study proposed a comprehensive benefit evaluation index covering social, economic, and environmental benefits from the perspective of the social, economic, and environmental systems; determined the index weights with an improved fuzzy analytic hierarchy process (AHP); calculated the relative index of comprehensive water resources benefit; and analyzed the comprehensive benefit of water resources in Xiangshui County with a multi-objective evaluation model. Based on water resources data for Xiangshui County, 20 main comprehensive benefit assessment factors across its 5 districts were evaluated. The results showed that the comprehensive benefit of Xiangshui County was 0.7317, and that the social economy has further room for development under the current water resources situation.
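As a hedged illustration of the weighting step described above (not the authors' code), the classical AHP eigenvector method can be sketched in Python; the pairwise comparison matrix and the three criteria here are invented for demonstration.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three benefit criteria
# (social, economic, environmental); the judgments are illustrative only.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Principal-eigenvector method: the normalized eigenvector of the
# largest eigenvalue gives the criterion weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()

# Consistency check (RI = 0.58 for n = 3, from Saaty's random index table).
lam_max = eigvals.real[k]
ci = (lam_max - 3) / (3 - 1)
cr = ci / 0.58

print(weights)   # criterion weights, summing to 1
print(cr < 0.1)  # True -> judgments acceptably consistent
```

A fuzzy AHP, as used in the study, would replace the crisp judgments with fuzzy numbers before deriving weights; the crisp version above shows only the core mechanics.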
CH-47D Rotating System Fault Sensing for Condition Based Maintenance
2011-03-01
replacement. This research seeks to create an analytical model in the Rotorcraft Comprehensive Analysis System which will enable the identification of…
Analytically tractable climate-carbon cycle feedbacks under 21st century anthropogenic forcing
NASA Astrophysics Data System (ADS)
Lade, Steven J.; Donges, Jonathan F.; Fetzer, Ingo; Anderies, John M.; Beer, Christian; Cornell, Sarah E.; Gasser, Thomas; Norberg, Jon; Richardson, Katherine; Rockström, Johan; Steffen, Will
2018-05-01
Changes to climate-carbon cycle feedbacks may significantly affect the Earth system's response to greenhouse gas emissions. These feedbacks are usually analysed from numerical output of complex and arguably opaque Earth system models. Here, we construct a stylised global climate-carbon cycle model, test its output against comprehensive Earth system models, and investigate the strengths of its climate-carbon cycle feedbacks analytically. The analytical expressions we obtain aid understanding of carbon cycle feedbacks and the operation of the carbon cycle. Specific results include that different feedback formalisms measure fundamentally the same climate-carbon cycle processes; temperature dependence of the solubility pump, biological pump, and CO2 solubility all contribute approximately equally to the ocean climate-carbon feedback; and concentration-carbon feedbacks may be more sensitive to future climate change than climate-carbon feedbacks. Simple models such as that developed here also provide workbenches
for simple but mechanistically based explorations of Earth system processes, such as interactions and feedbacks between the planetary boundaries, that are currently too uncertain to be included in comprehensive Earth system models.
Analysis of Advanced Rotorcraft Configurations
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2000-01-01
Advanced rotorcraft configurations are being investigated with the objectives of identifying vehicles that are larger, quieter, and faster than current-generation rotorcraft. A large rotorcraft, carrying perhaps 150 passengers, could do much to alleviate airport capacity limitations, and a quiet rotorcraft is essential for community acceptance of the benefits of VTOL operations. A fast, long-range, long-endurance rotorcraft, notably the tilt-rotor configuration, will improve rotorcraft economics through productivity increases. A major part of the investigation of advanced rotorcraft configurations consists of conducting comprehensive analyses of vehicle behavior for the purpose of assessing vehicle potential and feasibility, as well as to establish the analytical models required to support the vehicle development. The analytical work of FY99 included applications to tilt-rotor aircraft. Tilt Rotor Aeroacoustic Model (TRAM) wind tunnel measurements are being compared with calculations performed by using the comprehensive analysis tool (Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD II)). The objective is to establish the wing and wake aerodynamic models that are required for tilt-rotor analysis and design. The TRAM test in the German-Dutch Wind Tunnel (DNW) produced extensive measurements. This is the first test to encompass air loads, performance, and structural load measurements on tilt rotors, as well as acoustic and flow visualization data. The correlation of measurements and calculations includes helicopter-mode operation (performance, air loads, and blade structural loads), hover (performance and air loads), and airplane-mode operation (performance).
Variations on Debris Disks. IV. An Improved Analytical Model for Collisional Cascades
NASA Astrophysics Data System (ADS)
Kenyon, Scott J.; Bromley, Benjamin C.
2017-04-01
We derive a new analytical model for the evolution of a collisional cascade in a thin annulus around a single central star. In this model, the size of the largest object, r_max, changes with time as r_max ∝ t^(−γ), with γ ≈ 0.1-0.2. Compared to standard models where r_max is constant in time, this evolution results in a more rapid decline of M_d, the total mass of solids in the annulus, and L_d, the luminosity of small particles in the annulus: M_d ∝ t^(−(γ+1)) and L_d ∝ t^(−(γ/2+1)). We demonstrate that the analytical model provides an excellent match to a comprehensive suite of numerical coagulation simulations for annuli at 1 au and at 25 au. If the evolution of real debris disks follows the predictions of the analytical or numerical models, the observed luminosities for evolved stars require up to a factor of two more mass than predicted by previous analytical models.
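The scalings quoted in the abstract can be evaluated directly; the sketch below implements them with arbitrary placeholder normalizations (r0, m0, l0 are not the paper's fitted values).

```python
# Evaluate the analytical scalings r_max ∝ t^(-gamma),
# M_d ∝ t^(-(gamma+1)), L_d ∝ t^(-(gamma/2+1)) at illustrative times.
gamma = 0.15  # mid-range of the quoted 0.1-0.2

def r_max(t, r0=1.0):
    return r0 * t**(-gamma)

def disk_mass(t, m0=1.0):
    return m0 * t**(-(gamma + 1.0))

def disk_luminosity(t, l0=1.0):
    return l0 * t**(-(gamma / 2.0 + 1.0))

# Doubling the time lowers the luminosity by a factor of 2**(gamma/2 + 1).
ratio = disk_luminosity(1.0) / disk_luminosity(2.0)
print(ratio)  # equals 2**(gamma/2 + 1) ≈ 2.11 for gamma = 0.15
```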
Optimizing an Immersion ESL Curriculum Using Analytic Hierarchy Process
ERIC Educational Resources Information Center
Tang, Hui-Wen Vivian
2011-01-01
The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative…
How health leaders can benefit from predictive analytics.
Giga, Aliyah
2017-11-01
Predictive analytics can support a better-integrated health system providing continuous, coordinated, and comprehensive person-centred care to those who could benefit most. Beyond dollars saved, using a predictive model in healthcare can generate meaningful improvements in efficiency, productivity, and costs, and better population health through targeted interventions toward patients at risk.
ERIC Educational Resources Information Center
Tighe, Elizabeth L.; Schatschneider, Christopher
2016-01-01
The current study employed a meta-analytic approach to investigate the relative importance of component reading skills to reading comprehension in struggling adult readers. A total of 10 component skills were consistently identified across 16 independent studies and 2,707 participants. Random effects models generated 76 predictor-reading…
Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.
Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U
2015-05-01
The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English and assessed relevant clinical endpoints, and we summarized their methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making.
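As a hedged, generic illustration of one approach named above (Markov state-transition modeling), the sketch below propagates a cohort through invented health states with made-up transition probabilities and utilities; it is not drawn from any of the reviewed studies.

```python
import numpy as np

# Illustrative 3-state Markov cohort model: stable disease,
# progression, death (absorbing). All probabilities are invented.
P = np.array([
    [0.85, 0.10, 0.05],  # from stable
    [0.00, 0.80, 0.20],  # from progression
    [0.00, 0.00, 1.00],  # death is absorbing
])

state = np.array([1.0, 0.0, 0.0])       # whole cohort starts stable
utilities = np.array([0.8, 0.5, 0.0])   # per-cycle QALY weights (invented)

qalys = 0.0
for cycle in range(20):                  # 20 annual cycles
    qalys += float(state @ utilities)    # accumulate cohort-average utility
    state = state @ P                    # advance one cycle

print(round(qalys, 2))  # cumulative undiscounted QALYs per patient
print(state)            # cohort distribution after 20 cycles
```

Real applications layer costs, discounting, and treatment-specific transition matrices onto this same propagation loop.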
Thermal Effects Modeling Developed for Smart Structures
NASA Technical Reports Server (NTRS)
Lee, Ho-Jun
1998-01-01
Applying smart materials in aeropropulsion systems may improve the performance of aircraft engines through a variety of vibration, noise, and shape-control applications. To facilitate the experimental characterization of these smart structures, researchers have been focusing on developing analytical models to account for the coupled mechanical, electrical, and thermal response of these materials. One focus of current research efforts has been directed toward incorporating a comprehensive thermal analysis modeling capability. Typically, temperature affects the behavior of smart materials by three distinct mechanisms: 1. Induction of thermal strains because of coefficient of thermal expansion mismatch; 2. Pyroelectric effects on the piezoelectric elements; and 3. Temperature-dependent changes in material properties. Previous analytical models only investigated the first two thermal effects mechanisms. However, since the material properties of piezoelectric materials generally vary greatly with temperature (see the graph), incorporating temperature-dependent material properties will significantly affect the structural deflections, sensory voltages, and stresses. Thus, the current analytical model captures thermal effects arising from all three mechanisms through thermopiezoelectric constitutive equations. These constitutive equations were incorporated into a layerwise laminate theory with the inherent capability to model both the active and sensory response of smart structures in thermal environments. Corresponding finite element equations were formulated and implemented for both beam and plate elements to provide a comprehensive thermal effects modeling capability.
Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.
Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R
2016-11-01
Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high-performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, this strategy raises data analysis challenges concerning agreement of results between the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
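A minimal sketch of the idea, with synthetic data: a genetic algorithm evolves a binary mask over candidate fragment transitions, scoring each mask by how well the summed "DIA" intensities of the selected transitions correlate with the "targeted" values. All data, parameters, and the fitness choice (Pearson correlation) are invented for illustration; this is not the authors' implementation.

```python
import random

random.seed(0)

N_TRANS, N_SAMPLES = 12, 30

# Synthetic "targeted" quantities per sample.
targeted = [random.uniform(1, 10) for _ in range(N_SAMPLES)]

# Synthetic per-transition DIA intensities: the first 6 transitions
# track the targeted value; the rest are mostly interference.
dia = [[targeted[s] * random.uniform(0.9, 1.1) if t < 6
        else random.uniform(0, 10)
        for s in range(N_SAMPLES)] for t in range(N_TRANS)]

def fitness(mask):
    """Pearson correlation between targeted values and the summed
    DIA intensity over the selected transitions."""
    if not any(mask):
        return -1.0
    y = [sum(dia[t][s] for t in range(N_TRANS) if mask[t])
         for s in range(N_SAMPLES)]
    n = N_SAMPLES
    mx, my = sum(targeted) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(targeted, y))
    vx = sum((a - mx) ** 2 for a in targeted)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def evolve(pop_size=40, gens=30, p_mut=0.05):
    pop = [[random.random() < 0.5 for _ in range(N_TRANS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_TRANS)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [not g if random.random() < p_mut else g
                     for g in child]            # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # correlation achieved by the selected transition subset
```

The published work scores masks against linear-model parameters from real targeted data rather than a toy correlation, but the select-score-recombine loop has this shape.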
Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.
2016-11-01
Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
Useful measures and models for analytical quality management in medical laboratories.
Westgard, James O
2016-02-01
The 2014 Milan Conference "Defining analytical performance goals 15 years after the Stockholm Conference" initiated a new discussion of issues concerning goals for precision, trueness or bias, total analytical error (TAE), and measurement uncertainty (MU). Goal-setting models are critical for analytical quality management, along with error models, quality-assessment models, quality-planning models, as well as comprehensive models for quality management systems. There are also critical underlying issues, such as an emphasis on MU to the possible exclusion of TAE and a corresponding preference for separate precision and bias goals instead of a combined total error goal. This opinion recommends careful consideration of the differences in the concepts of accuracy and traceability and the appropriateness of different measures, particularly TAE as a measure of accuracy and MU as a measure of traceability. TAE is essential to manage quality within a medical laboratory and MU and trueness are essential to achieve comparability of results across laboratories. With this perspective, laboratory scientists can better understand the many measures and models needed for analytical quality management and assess their usefulness for practical applications in medical laboratories.
Equivalence and Differences between Structural Equation Modeling and State-Space Modeling Techniques
ERIC Educational Resources Information Center
Chow, Sy-Miin; Ho, Moon-ho R.; Hamaker, Ellen L.; Dolan, Conor V.
2010-01-01
State-space modeling techniques have been compared to structural equation modeling (SEM) techniques in various contexts but their unique strengths have often been overshadowed by their similarities to SEM. In this article, we provide a comprehensive discussion of these 2 approaches' similarities and differences through analytic comparisons and…
Executive Function and Reading Comprehension: A Meta-Analytic Review
ERIC Educational Resources Information Center
Follmer, D. Jake
2018-01-01
This article presents a meta-analytic review of the relation between executive function and reading comprehension. Results (N = 6,673) supported a moderate positive association between executive function and reading comprehension (r = 0.36). Moderator analyses suggested that correlations between executive function and reading comprehension did not…
Kim, Seongho; Jang, Hyejeong; Koo, Imhoi; Lee, Joohyoung; Zhang, Xiang
2017-01-01
Compared to other analytical platforms, comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS) has much greater separation power for analysis of complex samples and thus is increasingly used in metabolomics for biomarker discovery. However, accurate peak detection remains a bottleneck for wide application of GC×GC-MS. Therefore, the normal-exponential-Bernoulli (NEB) model is generalized using the gamma distribution, and a new peak detection algorithm based on the resulting normal-gamma-Bernoulli (NGB) model is developed. Unlike the NEB model, the NGB model has no closed-form analytical solution, hampering its practical use in peak detection. To circumvent this difficulty, three numerical approaches are introduced: fast Fourier transform (FFT) and the first-order and second-order delta methods (D1 and D2). Applications to simulated data and two real GC×GC-MS data sets show that the NGB-D1 method performs best in terms of both computational expense and peak detection performance.
Combustion of Nitramine Propellants
1983-03-01
through development of a comprehensive analytical model. The ultimate goals are to enable prediction of deflagration rate over a wide pressure range…superior in burn rate prediction, both simple models fail in correlating existing temperature-sensitivity data. (2) In the second part, a…auxiliary condition to enable independent burn rate prediction; improved melt phase model including decomposition-gas bubbles; model for far-field
ERIC Educational Resources Information Center
Elleman, Amy M.
2017-01-01
Inference ability is considered central to discourse processing and has been shown to be important across models of reading comprehension. To evaluate the impact of inference instruction, a meta-analysis of 25 inference studies in Grades K-12 was conducted. Results showed that inference instruction was effective for increasing students' general…
ERIC Educational Resources Information Center
Lee, Hyeon Woo
2011-01-01
As the technology-enriched learning environments and theoretical constructs involved in instructional design become more sophisticated and complex, a need arises for equally sophisticated analytic methods to research these environments, theories, and models. Thus, this paper illustrates a comprehensive approach for analyzing data arising from…
Interactive Management and Updating of Spatial Data Bases
NASA Technical Reports Server (NTRS)
French, P.; Taylor, M.
1982-01-01
The decision-making process, whether for power plant siting, load forecasting, or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by implementing techniques that permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer-aided planning (CAP) programs and the selection of a predominant data structure can improve the decision-making process is discussed.
Pelaccia, Thierry; Tardif, Jacques; Triby, Emmanuel; Charlin, Bernard
2011-03-14
Clinical reasoning plays a major role in the ability of doctors to make diagnoses and decisions. It is considered the physician's most critical competence, and has been widely studied by physicians, educationalists, psychologists and sociologists. Since the 1970s, many theories about clinical reasoning in medicine have been put forward. This paper aims at exploring a comprehensive approach: the "dual-process theory", a model developed by cognitive psychologists over the last few years. After 40 years of sometimes contradictory studies on clinical reasoning, the dual-process theory gives us many answers on how doctors think while making diagnoses and decisions. It highlights the importance of physicians' intuition and the high level of interaction between analytical and non-analytical processes. However, it has not received much attention in the medical education literature. The implications of dual-process models of reasoning in terms of medical education will be discussed.
Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon
2015-01-01
Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there is a lack of data resources and analytical tools that can be used to examine the disease associations of ARS/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploration of disease association of ARS/AIMPs, identification of disease-associated ARS/AIMP interactors and reconstruction of ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651
The Effects of Measurement Error on Statistical Models for Analyzing Change. Final Report.
ERIC Educational Resources Information Center
Dunivant, Noel
The results of six major projects are discussed including a comprehensive mathematical and statistical analysis of the problems caused by errors of measurement in linear models for assessing change. In a general matrix representation of the problem, several new analytic results are proved concerning the parameters which affect bias in…
A Study of the Feasibility of Implementing the "CAMPUS" Planning Model.
ERIC Educational Resources Information Center
Keene, T. Wayne
A study was conducted to determine the feasibility of implementing the CAMPUS (Comprehensive Analytical Methods for Planning in University/College Systems) PMS model for planning and resource allocation purposes in the University of South Florida College of Education. A description of CAMPUS PMS was developed, including the nature, output…
ERIC Educational Resources Information Center
Kanter, Jonathan W.; Cautilli, Joseph D.; Busch, Andrew M.; Baruch, David E.
2011-01-01
With recent advances in the behavioral treatment of depression and growing dissatisfaction with medical and cognitive interventions, a resurgence of interest in behavior analytic treatment of depression has occurred. Currently, several behavioral and cognitive behavioral models of depression exist. In reviewing these models, certain agreed upon…
Modeling of vortex generated sound in solid propellant rocket motors
NASA Technical Reports Server (NTRS)
Flandro, G. A.
1980-01-01
There is considerable evidence based on both full scale firings and cold flow simulations that hydrodynamically unstable shear flows in solid propellant rocket motors can lead to acoustic pressure fluctuations of significant amplitude. Although a comprehensive theoretical understanding of this problem does not yet exist, procedures were explored for generating useful analytical models describing the vortex shedding phenomenon and the mechanisms of coupling to the acoustic field in a rocket combustion chamber. Since combustion stability prediction procedures cannot be successful without incorporation of all acoustic gains and losses, it is clear that a vortex driving model comparable in quality to the analytical models currently employed to represent linear combustion instability must be formulated.
Analytic Strategies of Streaming Data for eHealth.
Yoon, Sunmoo
2016-01-01
New analytic strategies for streaming big data from wearable devices and social media are emerging in eHealth. We face challenges in finding meaningful patterns in big data because researchers have difficulty processing large volumes of streaming data with traditional processing applications. This introductory 180-minute tutorial offers hands-on instruction on analytics (e.g., topic modeling, social network analysis) of streaming data. The tutorial aims to provide practical strategies and information on reducing dimensionality, using examples of big data, and will highlight strategies for incorporating domain experts and a comprehensive approach to streaming social media data.
NASA Astrophysics Data System (ADS)
Jiang, Yingni
2018-03-01
Due to the high energy consumption of communication, energy saving in data centers must be enforced. But the lack of evaluation mechanisms has restrained progress on energy-saving construction of data centers. In this paper, an energy-saving evaluation index system for data centers was constructed on the basis of clarifying the influence factors. Based on the evaluation index system, the analytic hierarchy process was used to determine the weights of the evaluation indexes. Subsequently, a three-grade fuzzy comprehensive evaluation model was constructed to evaluate the energy-saving system of data centers.
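As a hedged sketch of the evaluation step described above (not the paper's model), a single-grade fuzzy comprehensive evaluation combines index weights with a fuzzy membership matrix; every number below is invented for illustration.

```python
import numpy as np

# Invented weights for four energy-saving indexes (e.g., from an AHP step).
w = np.array([0.4, 0.3, 0.2, 0.1])

# Membership matrix R: each row gives one index's membership degrees
# in the evaluation grades (good, fair, poor); values are illustrative.
R = np.array([
    [0.6, 0.3, 0.1],
    [0.5, 0.4, 0.1],
    [0.2, 0.5, 0.3],
    [0.3, 0.3, 0.4],
])

# Weighted-average fuzzy operator: B = w . R, then normalize over grades.
B = w @ R
B = B / B.sum()

grades = ["good", "fair", "poor"]
verdict = grades[int(np.argmax(B))]  # maximum-membership principle
print(B, verdict)
```

A three-grade model, as in the paper, applies this same operator hierarchically: sub-index results at one level become the membership rows of the level above.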
Bellows flow-induced vibrations
NASA Technical Reports Server (NTRS)
Tygielski, P. J.; Smyly, H. M.; Gerlach, C. R.
1983-01-01
The bellows flow excitation mechanism and the results of a comprehensive test program are summarized. The analytical model for predicting bellows flow-induced stress is refined. The model includes the effects of an upstream elbow, arbitrary geometry, and multiple plies. A refined computer code for predicting flow-induced stress is described which allows life prediction if a material S-N diagram is available.
Orientation Examples Showing Application of the C.A.M.P.U.S. Simulation Model.
ERIC Educational Resources Information Center
Hansen, B. L.; Barron, J. G.
This pamphlet contains information and examples intended to show how the University of Toronto C.A.M.P.U.S. model operates. C.A.M.P.U.S. (Comprehensive Analytical Method for Planning in the University Sphere) is a computer model which processes projected enrollment statistics and other necessary information in such a way as to yield time-based…
Dynamic testing for shuttle design verification
NASA Technical Reports Server (NTRS)
Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.
1972-01-01
Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.
Source-term development for a contaminant plume for use by multimedia risk assessment models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.
1999-12-01
Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.
Pelaccia, Thierry; Tardif, Jacques; Triby, Emmanuel; Charlin, Bernard
2011-01-01
Context: Clinical reasoning plays a major role in doctors' ability to make diagnoses and decisions. It is considered the physician's most critical competence, and it has been widely studied by physicians, educationalists, psychologists and sociologists. Since the 1970s, many theories about clinical reasoning in medicine have been put forward. Purpose: This paper aims to explore a comprehensive approach: the "dual-process theory", a model developed by cognitive psychologists over the last few years. Discussion: After 40 years of sometimes contradictory studies on clinical reasoning, the dual-process theory offers many answers about how doctors think while making diagnoses and decisions. It highlights the importance of physicians' intuition and the high level of interaction between analytical and non-analytical processes. However, it has not received much attention in the medical education literature. The implications of dual-process models of reasoning for medical education are discussed. PMID:21430797
Back-support large laser mirror unit: mounting modeling and analysis
NASA Astrophysics Data System (ADS)
Wang, Hui; Zhang, Zheng; Long, Kai; Liu, Tianye; Li, Jun; Liu, Changchun; Xiong, Zhao; Yuan, Xiaodong
2018-01-01
In a high-power laser system, the surface wavefront of large optics is closely linked to the structural design and mounting method. The back-support transport mirror design is currently being investigated in China's high-power laser system as a means to hold the optical component firmly while minimizing the distortion of its reflecting surface. We have proposed a comprehensive analytical framework integrating numerical modeling and precise metrology for evaluating the mirror's mounting performance, treating the surface distortion as a key decision variable. The combination of numerical simulation and field tests demonstrates that the comprehensive analytical framework provides a detailed and accurate approach to evaluating the performance of the transport mirror. It is also verified that the back-support transport mirror is effectively compatible with state-of-the-art optical quality specifications. This study will pave the way for future research to solidify the design of back-support large laser optics in China's next-generation inertial confinement fusion facility.
[Environmental quality assessment of regional agro-ecosystem in Loess Plateau].
Wang, Limei; Meng, Fanping; Zheng, Jiyong; Wang, Zhonglin
2004-03-01
Based on detection and analysis of the contamination status of an agro-ecosystem with apple-crop intercropping as the dominant cropping model in the Loess Plateau, the individual-factor and comprehensive environmental quality were assessed by a multilevel fuzzy synthetic evaluation model, the analytic hierarchy process (AHP), and an improved standard weight-deciding method. The results showed that the quality of soil, water and agricultural products was grade I, the socioeconomic environmental quality was grade II, the ecological environmental quality was grade III, and the comprehensive environmental quality was grade I. The regional agro-ecosystem dominated by apple-crop intercropping was not the best model for ecological benefits, but it had better socioeconomic benefits.
Modern data science for analytical chemical data - A comprehensive review.
Szymańska, Ewa
2018-10-22
Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed, not only by chemometrics but also by other data science communities, to extract relevant information from big datasets and deliver their value to different applications. Beyond the common goal of big data analysis, different perspectives on and terms for big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data science fields, together with their data-type-specific and generic challenges. Firstly, common data science terms used in different data science fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, analysis aspects such as assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets is briefly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
Intrinsic ethics regarding integrated assessment models for climate management.
Schienke, Erich W; Baum, Seth D; Tuana, Nancy; Davis, Kenneth J; Keller, Klaus
2011-09-01
In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research compared to the currently typically adopted approach in RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely, evaluation of climate change integrated assessment models, this paper develops a method and case for including intrinsic ethics within research ethics training to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes.
Swarm intelligence metaheuristics for enhanced data analysis and optimization.
Hanrahan, Grady
2011-09-21
The swarm intelligence (SI) computing paradigm has proven itself as a comprehensive means of solving complicated analytical chemistry problems by emulating biologically-inspired processes. As global optimum search metaheuristics, associated algorithms have been widely used in training neural networks, function optimization, prediction and classification, and in a variety of process-based analytical applications. The goal of this review is to provide readers with critical insight into the utility of swarm intelligence tools as methods for solving complex chemical problems. Consideration will be given to algorithm development, ease of implementation and model performance, detailing subsequent influences on a number of application areas in the analytical, bioanalytical and detection sciences.
Evaluation and Prediction of Water Resources Based on AHP
NASA Astrophysics Data System (ADS)
Li, Shuai; Sun, Anqi
2017-01-01
Nowadays, the shortage of water resources is a threat to us all. To address the problem of water resources being constrained by many factors, this paper establishes a water resources evaluation index (WREI) model, which adopts fuzzy comprehensive evaluation (FCE) based on the analytic hierarchy process (AHP). After considering the factors influencing water resources, we ignore secondary factors, group the main factors hierarchically by class, and set up a three-layer structure whose top layer is the WREI. AHP is used first to determine the weights, and fuzzy judgment is then used to evaluate the target; the combined use of the two algorithms reduces the subjective influence of AHP and overcomes the disadvantages of multi-level evaluation. To test the model, we chose India as a target region. On the basis of the WREI model, we used Matlab and combined grey prediction with linear prediction to discuss India's ability to provide clean water and the trend of India's water resources over the next 15 years. The model, with both theoretical support and practical significance, can provide reliable data and a reference for plans to improve water quality.
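The grey-prediction step mentioned in this abstract is commonly the GM(1,1) model: accumulate the series, fit the development coefficient by least squares, and extrapolate. The sketch below is a minimal illustration with made-up demand figures, not the paper's data:

```python
import numpy as np

def gm11(x0, n_ahead):
    """GM(1,1) grey model: fit a short, positive time series and
    extrapolate n_ahead steps ahead. Illustrative sketch only."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                          # accumulated generating sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])               # adjacent means of x1
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey parameters
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
    return np.r_[x1_hat[0], np.diff(x1_hat)]    # restore x0 by differencing

# Hypothetical annual water-demand series (arbitrary units).
series = [2.87, 3.28, 3.34, 3.71, 3.92]
forecast = gm11(series, n_ahead=3)
print(forecast)
```

For a growing input series the fitted development coefficient is negative, so the extrapolated values keep rising; blending this with a linear fit, as the abstract describes, damps the purely exponential trend.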
Disturbance characteristics of half-selected cells in a cross-point resistive switching memory array
NASA Astrophysics Data System (ADS)
Chen, Zhe; Li, Haitong; Chen, Hong-Yu; Chen, Bing; Liu, Rui; Huang, Peng; Zhang, Feifei; Jiang, Zizhen; Ye, Hongfei; Gao, Bin; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng; Wong, H.-S. Philip; Yu, Shimeng
2016-05-01
Disturbance characteristics of cross-point resistive random access memory (RRAM) arrays are comprehensively studied in this paper. An analytical model grounded in physical understanding is developed to quantify the number of pulses (#Pulse) a cell can bear before disturbance occurs under various sub-switching voltage stresses. An evaluation methodology combining the analytical model with SPICE simulation is proposed to assess the disturbance behavior of half-selected (HS) cells in cross-point RRAM arrays. Characteristics of cross-point RRAM arrays such as energy consumption, reliable operating cycles and total error bits are evaluated with this methodology, and a possible solution to mitigate disturbance is proposed.
Analytical modeling and numerical simulation of the short-wave infrared electron-injection detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Movassaghi, Yashar; Fathipour, Morteza; Fathipour, Vala
2016-03-21
This paper describes comprehensive analytical and simulation models for the design and optimization of electron-injection based detectors. The electron-injection detectors evaluated here operate in the short-wave infrared range and utilize a type-II band alignment in the InP/GaAsSb/InGaAs material system. The unique geometry of the detectors, along with an inherent negative-feedback mechanism in the device, allows high internal avalanche-free amplification to be achieved without any excess noise. Physics-based closed-form analytical models are derived for the detector rise time and dark current. Our optical gain model takes into account the drop in the optical gain at high optical power levels. Furthermore, numerical simulation studies of the electrical characteristics of the device show good agreement with our analytical models as well as with experimental data. Performance comparison between devices with different injector sizes shows that enhancement in gain and speed is anticipated by reducing the injector size. Sensitivity analysis for the key detector parameters shows the relative importance of each parameter. The results of this study may provide useful information and guidelines for the development of future electron-injection based detectors as well as other heterojunction photodetectors.
Aeroelastic loads and stability investigation of a full-scale hingeless rotor
NASA Technical Reports Server (NTRS)
Peterson, Randall L.; Johnson, Wayne
1991-01-01
An analytical investigation was conducted to study the influence of various parameters on predicting the aeroelastic loads and stability of a full-scale hingeless rotor in hover and forward flight. The CAMRAD/JA (Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics, Johnson Aeronautics) analysis code is used to obtain the analytical predictions. Data are presented for rotor blade bending and torsional moments, as well as inplane damping, for rotor operation in hover at a constant rotor rotational speed of 425 rpm and thrust coefficients between 0.0 and 0.12. Experimental data are presented from a wind tunnel test. Validation of the rotor system structural model against experimental rotor blade loads data shows excellent correlation with analytical results. Using this analysis, the influence of different aerodynamic inflow models, the number of generalized blade and body degrees of freedom, and the control-system stiffness on predicted stability levels is shown. Forward flight predictions of the BO-105 rotor system for 1-g thrust conditions at advance ratios of 0.0 to 0.35 are presented. The influence of different aerodynamic inflow models, dynamic inflow models and shaft-angle variations on predicted stability levels is shown as a function of advance ratio.
Analytical mesoscale modeling of aeolian sand transport
NASA Astrophysics Data System (ADS)
Lämmel, Marc; Kroy, Klaus
2017-11-01
The mesoscale structure of aeolian sand transport determines a variety of natural phenomena studied in planetary and Earth science. We analyze it theoretically beyond the mean-field level, based on the grain-scale transport kinetics and splash statistics. A coarse-grained analytical model is proposed and verified by numerical simulations resolving individual grain trajectories. The predicted height-resolved sand flux and other important characteristics of the aeolian transport layer agree remarkably well with a comprehensive compilation of field and wind-tunnel data, suggesting that the model robustly captures the essential mesoscale physics. By comparing the predicted saturation length with field data for the minimum sand-dune size, we elucidate the importance of intermittent turbulent wind fluctuations for field measurements and reconcile conflicting previous models for this most enigmatic emergent aeolian scale.
NASA Astrophysics Data System (ADS)
Chen, Jui-Sheng; Li, Loretta Y.; Lai, Keng-Hsin; Liang, Ching-Ping
2017-11-01
A novel solution method is presented which leads to an analytical model for advective-dispersive transport in a semi-infinite domain involving a wide spectrum of boundary inputs, initial distributions, and zero-order productions. The method applies the Laplace transform in combination with the generalized integral transform technique (GITT) to obtain the generalized analytical solution. Based on this generalized analytical expression, we derive a comprehensive set of special-case solutions for time-dependent boundary distributions and zero-order productions described by the Dirac delta, constant, Heaviside, exponentially-decaying, or periodically sinusoidal functions, as well as for position-dependent initial conditions and zero-order productions specified by the Dirac delta, constant, Heaviside, or exponentially-decaying functions. The developed solutions are tested against an analytical solution from the literature; the excellent agreement between the analytical solutions confirms that the new model can serve as an effective tool for investigating transport behaviors under different scenarios. Several application examples are given to explore transport behaviors rarely noted in the literature. The results show that the concentration waves resulting from the periodically sinusoidal input are sensitive to the dispersion coefficient; the implication of this finding is that a tracer test with a periodic input may provide additional information for identifying the dispersion coefficient. Moreover, the solution strategy presented in this study can be extended to derive analytical models handling more complicated problems of solute transport in multi-dimensional media subject to sequential decay chain reactions, for which analytical solutions are not currently available.
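The paper's GITT-based solutions are not reproduced here, but the classic Ogata-Banks special case (constant-concentration inlet, zero initial concentration) gives a feel for the closed-form advection-dispersion solutions this family of models generalizes; parameter values below are illustrative only:

```python
import math

def ogata_banks(x, t, v, D, C0=1.0):
    """Ogata-Banks solution of the 1-D advection-dispersion equation in a
    semi-infinite domain with a constant-concentration boundary C0 at x=0
    and zero initial concentration. A standard benchmark special case,
    not the GITT solution derived in the paper."""
    a = (x - v * t) / (2.0 * math.sqrt(D * t))
    b = (x + v * t) / (2.0 * math.sqrt(D * t))
    return 0.5 * C0 * (math.erfc(a) + math.exp(v * x / D) * math.erfc(b))

# Illustrative parameters: velocity v = 0.5 m/d, dispersion D = 0.1 m^2/d.
c_near = ogata_banks(x=1.0, t=10.0, v=0.5, D=0.1)   # behind the front: C ~ C0
c_far = ogata_banks(x=10.0, t=10.0, v=0.5, D=0.1)   # ahead of the front: C ~ 0
print(c_near, c_far)
```

At t = 10 d the advective front sits near x = v t = 5 m, so concentration is close to the inlet value upstream of the front and near zero well downstream, with the dispersion coefficient controlling the width of the transition.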
Comprehensive evaluation of impacts of distributed generation integration in distribution network
NASA Astrophysics Data System (ADS)
Peng, Sujiang; Zhou, Erbiao; Ji, Fengkun; Cao, Xinhui; Liu, Lingshuang; Liu, Zifa; Wang, Xuyang; Cai, Xiaoyu
2018-04-01
Distributed generation (DG), as a supplement to centralized renewable energy utilization, is becoming the focus of renewable energy development. With the increasing proportion of DG in the distribution network, the network's power structure, power flow distribution, operation plans and protection are affected to some extent. According to the main impacts of DG, a comprehensive evaluation model of a distribution network with DG is proposed in this paper. A comprehensive evaluation index system covering 7 aspects, along with the corresponding index calculation methods, is established for quantitative analysis. The indices under different DG access capacities in the distribution network are calculated on the IEEE RBTS-Bus 6 system, and the evaluation result is obtained by the analytic hierarchy process (AHP). A case study verifies that the proposed model and method are effective and valid.
In-orbit evaluation of the control system/structural mode interactions of the OSO-8 spacecraft
NASA Technical Reports Server (NTRS)
Slafer, L. I.
1979-01-01
The Orbiting Solar Observatory-8 experienced severe structural mode/control loop interaction problems during the spacecraft development. Extensive analytical studies, using the hybrid coordinate modeling approach, and comprehensive ground testing were carried out in order to achieve the system's precision pointing performance requirements. A recent series of flight tests were conducted with the spacecraft in which a wide bandwidth, high resolution telemetry system was utilized to evaluate the on-orbit flexible dynamics characteristics of the vehicle along with the control system performance. The paper describes the results of these tests, reviewing the basic design problem, analytical approach taken, ground test philosophy, and on-orbit testing. Data from the tests was used to determine the primary mode frequency, damping, and servo coupling dynamics for the on-orbit condition. Additionally, the test results have verified analytically predicted differences between the on-orbit and ground test environments, and have led to a validation of both the analytical modeling and servo design techniques used during the development of the control system.
Analytical aspects of plant metabolite profiling platforms: current standings and future aims.
Seger, Christoph; Sturm, Sonja
2007-02-01
Over the past years, metabolic profiling has been established as a comprehensive systems biology tool. Mass spectrometry or NMR spectroscopy-based technology platforms combined with unsupervised or supervised multivariate statistical methodologies allow a deep insight into the complex metabolite patterns of plant-derived samples. Within this review, we provide a thorough introduction to the analytical hard- and software requirements of metabolic profiling platforms. Methodological limitations are addressed, and the metabolic profiling workflow is exemplified by summarizing recent applications ranging from model systems to more applied topics.
Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo
2015-01-01
Modern omics disciplines dealing with food flavor focus the analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.) by means of a comprehensive, integrated treatment of sample constituents, such as physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit fully this potential. This review focuses on advances in comprehensive two-dimensional gas chromatography and analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.
Dynamic imaging model and parameter optimization for a star tracker.
Yan, Jinyun; Jiang, Jie; Zhang, Guangjun
2016-03-21
Under dynamic conditions, star spots move across the image plane of a star tracker and form a smeared star image. This smearing effect increases errors in star position estimation and degrades attitude accuracy. First, an analytical energy distribution model of a smeared star spot is established based on a line segment spread function because the dynamic imaging process of a star tracker is equivalent to the static imaging process of linear light sources. The proposed model, which has a clear physical meaning, explicitly reflects the key parameters of the imaging process, including incident flux, exposure time, velocity of a star spot in an image plane, and Gaussian radius. Furthermore, an analytical expression of the centroiding error of the smeared star spot is derived using the proposed model. An accurate and comprehensive evaluation of centroiding accuracy is obtained based on the expression. Moreover, analytical solutions of the optimal parameters are derived to achieve the best performance in centroid estimation. Finally, we perform numerical simulations and a night sky experiment to validate the correctness of the dynamic imaging model, the centroiding error expression, and the optimal parameters.
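As an illustrative sketch of why smear shifts the centroid (assumed grid size, motion, and Gaussian radius, not the paper's line-segment-spread-function model), the following superposes Gaussian point-spread functions uniformly along the motion segment and locates the intensity-weighted centroid:

```python
import numpy as np

def smeared_spot(size=32, x0=12.0, y0=16.0, vx=6.0, sigma=1.2, steps=200):
    """Approximate a smeared star spot by superposing Gaussian PSFs
    uniformly along the motion segment from (x0, y0) to (x0+vx, y0)."""
    y, x = np.mgrid[0:size, 0:size].astype(float)
    img = np.zeros((size, size))
    for s in np.linspace(0.0, 1.0, steps):
        cx = x0 + s * vx                        # spot position during exposure
        img += np.exp(-((x - cx) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
    return img / steps

def centroid(img):
    """Simple intensity-weighted centroid (x, y) of an image."""
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]].astype(float)
    total = img.sum()
    return (x * img).sum() / total, (y * img).sum() / total

img = smeared_spot()
cx, cy = centroid(img)
# For uniform motion the centroid lands near the segment midpoint (15, 16),
# offset from the exposure-start position (12, 16): the dynamic centroiding error.
print(cx, cy)
```

A model-based estimator, such as the analytical energy distribution model in this paper, corrects for exactly this kind of motion-induced offset.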
Comprehensive rotorcraft analysis methods
NASA Technical Reports Server (NTRS)
Stephens, Wendell B.; Austin, Edward E.
1988-01-01
The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustics, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).
ERIC Educational Resources Information Center
Murphy, P. Karen; Firetto, Carla M.; Wei, Liwei; Li, Mengyi; Croninger, Rachel M. V.
2016-01-01
Many American students struggle to perform even basic comprehension of text, such as locating information, determining the main idea, or supporting details of a story. Even more students are inadequately prepared to complete more complex tasks, such as critically or analytically interpreting information in text or making reasoned decisions from…
NASA Technical Reports Server (NTRS)
Yeager, W. T., Jr.; Hamouda, M. N. H.; Mantay, W. R.
1983-01-01
A research effort of analysis and testing was conducted to investigate the ground resonance phenomenon of a soft in-plane hingeless rotor. Experimental data were obtained using a 9 ft. (2.74 m) diameter model rotor in hover and forward flight. Eight model rotor configurations were investigated. Configuration parameters included pitch-flap coupling, blade sweep and droop, and precone of the blade feathering axis. An analysis based on a comprehensive analytical model of rotorcraft aerodynamics and dynamics was used. The moving-block method was used to determine the regressing lead-lag mode damping experimentally. Good agreement was obtained between the analysis and test. Both analysis and experiment indicated ground resonance instability in hover. An outline of the analysis, a description of the experimental model and procedures, and a comparison of the analytical and experimental data are presented.
Comprehensive modeling of a liquid rocket combustion chamber
NASA Technical Reports Server (NTRS)
Liang, P.-Y.; Fisher, S.; Chang, Y. M.
1985-01-01
An analytical model for the simulation of detailed three-phase combustion flows inside a liquid rocket combustion chamber is presented. The three phases involved are: a multispecies gaseous phase, an incompressible liquid phase, and a particulate droplet phase. The gas and liquid phases are continua described in an Eulerian fashion. A two-phase solution capability for these continuum media is obtained through a marriage of the Implicit Continuous Eulerian (ICE) technique and the fractional Volume of Fluid (VOF) free surface description method. The particulate phase, on the other hand, is given a discrete treatment and described in a Lagrangian fashion. All three phases are hence treated rigorously. Semi-empirical physical models are used to describe all interphase coupling terms as well as the chemistry among gaseous components. Sample calculations using the model are given. The results show promising application to truly comprehensive modeling of complex liquid-fueled engine systems.
ERIC Educational Resources Information Center
Piunno, Paul A. E.; Zetina, Adrian; Chu, Norman; Tavares, Anthony J.; Noor, M. Omair; Petryayeva, Eleonora; Uddayasankar, Uvaraj; Veglio, Andrew
2014-01-01
An advanced analytical chemistry undergraduate laboratory module on microfluidics that spans 4 weeks (4 h per week) is presented. The laboratory module focuses on comprehensive experiential learning of microfluidic device fabrication and the core characteristics of microfluidic devices as they pertain to fluid flow and the manipulation of samples.…
Features and characterization needs of rubber composite structures
NASA Technical Reports Server (NTRS)
Tabaddor, Farhad
1989-01-01
Some of the major unique features of rubber composite structures are outlined. The features covered are those related to the material properties, but the analytical features are also briefly discussed. It is essential to recognize these features at the planning stage of any long-range analytical, experimental, or application program. The development of a general and comprehensive program which fully accounts for all the important characteristics of tires, under all the relevant modes of operation, may be a prohibitively expensive and impractical task in the near future. There is therefore a need to develop application methodologies which can use the less general models beyond their theoretical limitations, yet with reasonable reliability, through a proper mix of analytical, experimental, and testing activities.
ERIC Educational Resources Information Center
Bernard, Robert M.; Abrami, Philip C.; Wade, Anne; Borokhovski, Evgueni; Lou, Yiping
2004-01-01
Simonson, Schlosser and Hanson (1999) argue that a new theory called "equivalency theory" is needed to account for the unique features of the "teleconferencing" (synchronous) model of DE that is prevalent in many North American universities. Based on a comprehensive meta-analysis of the comparative literature of DE (Bernard,…
ERIC Educational Resources Information Center
McKinley, Jim
2015-01-01
This article makes the argument that we need to situate student's academic writing as socially constructed pieces of writing that embody a writer's cultural identity and critical argument. In support, I present and describe a comprehensive model of an original English as a Foreign Language (EFL) writing analytical framework. This article explains…
An Introduction to Project PRIME and CAMPUS MINNESOTA. Project PRIME Report, Number 2.
ERIC Educational Resources Information Center
Cordes, David C.
PRIME is an acronym for Planning Resources in Minnesota Education. The project's primary objective is to test the implementation of CAMPUS (Comprehensive Analytical Methods for Planning University Systems) in one State College, one Junior College, and in one school at the University of Minnesota. The CAMPUS model was developed by the Institute for…
Comprehensive silicon solar cell computer modeling
NASA Technical Reports Server (NTRS)
Lamorte, M. F.
1984-01-01
The development of an efficient, comprehensive Si solar cell modeling program that can achieve simulation accuracy of 5 percent or less is examined. A general investigation of computerized simulation is provided. Computer simulation programs are subdivided into a number of major tasks: (1) the analytical method used to represent the physical system; (2) the phenomena submodels that comprise the simulation of the system; (3) coding of the analysis and the phenomena submodels; (4) a coding scheme that uses the CPU efficiently so that CPU costs are low; and (5) a simulation program modularized with respect to the structures that may be analyzed, the addition and/or modification of phenomena submodels as new experimental data become available, and the addition of other photovoltaic materials.
On-orbit evaluation of the control system/structural mode interactions on OSO-8
NASA Technical Reports Server (NTRS)
Slafer, L. I.
1980-01-01
The Orbiting Solar Observatory-8 experienced severe structural mode/control loop interaction problems during the spacecraft development. Extensive analytical studies, using the hybrid coordinate modeling approach, and comprehensive ground testing were carried out in order to achieve the system's precision pointing performance requirements. A recent series of flight tests was conducted with the spacecraft in which a wide-bandwidth, high-resolution telemetry system was utilized to evaluate the on-orbit flexible dynamics characteristics of the vehicle along with the control system performance. This paper describes the results of these tests, reviewing the basic design problem, analytical approach taken, ground test philosophy, and on-orbit testing. Data from the tests were used to determine the primary mode frequency, damping, and servo coupling dynamics for the on-orbit condition. Additionally, the test results have verified analytically predicted differences between the on-orbit and ground test environments. The test results have led to a validation of both the analytical modeling and servo design techniques used during the development of the control system, and also verified the approach taken to vehicle and servo ground testing.
Source-term development for a contaminant plume for use by multimedia risk assessment models
NASA Astrophysics Data System (ADS)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.
2000-02-01
Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, ⁹⁰Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Yu; Sengupta, Manajit; Dooraghi, Mike
2018-03-20
Development of accurate transposition models to simulate plane-of-array (POA) irradiance from horizontal measurements or simulations is a complex process mainly because of the anisotropic distribution of diffuse solar radiation in the atmosphere. The limited availability of reliable POA measurements at large temporal and spatial scales leads to difficulties in the comprehensive evaluation of transposition models. This paper proposes new algorithms to assess the uncertainty of transposition models using both surface-based observations and modeling tools. We reviewed the analytical derivation of POA irradiance and the approximation of isotropic diffuse radiation that simplifies the computation. Two transposition models are evaluated against the computation by the rigorous analytical solution. We proposed a new algorithm to evaluate transposition models using the clear-sky measurements at the National Renewable Energy Laboratory's (NREL's) Solar Radiation Research Laboratory (SRRL) and a radiative transfer model that integrates diffuse radiances of various sky-viewing angles. We found that the radiative transfer model and a transposition model based on empirical regressions are superior to the isotropic models when compared to measurements. We further compared the radiative transfer model to the transposition models under an extensive range of idealized conditions. Our results suggest that the empirical transposition model has slightly higher cloudy-sky POA irradiance than the radiative transfer model, but performs better than the isotropic models under clear-sky conditions. Significantly smaller POA irradiances computed by the transposition models are observed when the photovoltaics (PV) panel deviates from the azimuthal direction of the sun. The new algorithms developed in the current study have opened the door to a more comprehensive evaluation of transposition models for various atmospheric conditions and solar and PV orientations.
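The isotropic transposition model that the abstract uses as a baseline can be sketched as follows. This is a minimal illustration, not the authors' code; the function name, the default albedo, and the input conventions are assumptions.

```python
import math

def poa_isotropic(dni, dhi, ghi, sun_zenith_deg, tilt_deg,
                  surf_azimuth_deg, sun_azimuth_deg, albedo=0.2):
    """Plane-of-array irradiance under the isotropic-sky assumption.

    Beam is projected through the angle of incidence, sky diffuse uses
    the isotropic view factor (1 + cos(tilt))/2, and ground-reflected
    irradiance uses (1 - cos(tilt))/2.
    """
    z = math.radians(sun_zenith_deg)
    t = math.radians(tilt_deg)
    # cosine of the angle of incidence between the sun and the panel normal
    cos_aoi = (math.cos(z) * math.cos(t)
               + math.sin(z) * math.sin(t)
               * math.cos(math.radians(sun_azimuth_deg - surf_azimuth_deg)))
    beam = dni * max(cos_aoi, 0.0)
    sky_diffuse = dhi * (1 + math.cos(t)) / 2
    ground = ghi * albedo * (1 - math.cos(t)) / 2
    return beam + sky_diffuse + ground
```

For a horizontal panel (tilt = 0) the expression collapses to DNI·cos(z) + DHI, i.e. the POA equals the global horizontal irradiance, which is a quick sanity check on the view factors. The anisotropic and empirical models discussed in the paper replace the sky-diffuse term with more elaborate forms.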
Analytical and experimental vibration analysis of a faulty gear system
NASA Astrophysics Data System (ADS)
Choy, F. K.; Braun, M. J.; Polyshchuk, V.; Zakrajsek, J. J.; Townsend, D. P.; Handschuh, R. F.
1994-10-01
A comprehensive analytical procedure was developed for predicting faults in gear transmission systems under normal operating conditions. A gear tooth fault model is developed to simulate the effects of pitting and wear on the vibration signal under normal operating conditions. The model uses changes in the gear mesh stiffness to simulate the effects of gear tooth faults. The overall dynamics of the gear transmission system is evaluated by coupling the dynamics of each individual gear-rotor system through gear mesh forces generated between each gear-rotor system and the bearing forces generated between the rotor and the gearbox structure. The predicted results were compared with experimental results obtained from a spiral bevel gear fatigue test rig at NASA Lewis Research Center. The Wigner-Ville distribution (WVD) was used to give a comprehensive comparison of the predicted and experimental results. The WVD method applied to the experimental results was also compared to other fault detection techniques to verify the WVD's ability to detect the pitting damage, and to determine its relative performance. Overall results show good correlation between the experimental vibration data of the damaged test gear and the predicted vibration from the model with simulated gear tooth pitting damage. Results also verified that the WVD method can successfully detect and locate gear tooth wear and pitting damage.
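A discrete Wigner-Ville distribution of the kind used above can be sketched in a few lines. This is a textbook pseudo-WVD for an analytic signal, not the authors' implementation; the lag convention here places a pure tone of normalized frequency f at frequency bin 2·f·N.

```python
import numpy as np

def wigner_ville(x):
    """Discrete (pseudo) Wigner-Ville distribution of an analytic signal.

    Returns an (N, N) real array: rows are time indices, columns are
    frequency bins. For each time n, the instantaneous autocorrelation
    x[n+m] * conj(x[n-m]) is Fourier-transformed over the lag m.
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        m_max = min(n, N - 1 - n)       # largest lag fitting in the record
        kernel = np.zeros(N, dtype=complex)
        for m in range(-m_max, m_max + 1):
            kernel[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(kernel).real
    return W
```

A healthy gear mesh concentrates energy along steady frequency ridges in this time-frequency map; localized pitting shows up as short-lived energy bursts at the damaged tooth's meshing instants, which is what makes the WVD useful for fault localization.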
Using a dyadic logistic multilevel model to analyze couple data.
Preciado, Mariana A; Krull, Jennifer L; Hicks, Andrew; Gipson, Jessica D
2016-02-01
There is growing recognition within the sexual and reproductive health field of the importance of incorporating both partners' perspectives when examining sexual and reproductive health behaviors. Yet the analytical approaches for couple data have not been readily integrated and utilized within the demographic and public health literature. This paper provides readers unfamiliar with such approaches an applied example of dyadic logistic multilevel modeling, a useful way to assess the individual, partner, and couple characteristics that are related to individuals' reproductively relevant beliefs, attitudes, and behaviors. The use of multilevel models in reproductive health research can help researchers develop a more comprehensive picture of the way in which individuals' reproductive health outcomes are situated in a larger relationship and cultural context. Copyright © 2016 Elsevier Inc. All rights reserved.
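The actor-partner structure behind a dyadic logistic model can be illustrated with simulated couple data. This sketch is not the paper's analysis: all parameter values are invented, and for brevity it fits a pooled logistic regression by Newton-Raphson rather than estimating the couple-level random intercept, so the recovered coefficients are attenuated relative to the generating values (a known marginal-vs-conditional effect in multilevel logistic models).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 500 couples: each partner's binary outcome depends on their own
# predictor (actor effect), their partner's predictor (partner effect),
# and a shared couple-level random intercept. Values are illustrative.
n_couples = 500
b0, b_actor, b_partner, sd_couple = -0.5, 0.8, 0.4, 1.0
x = rng.normal(size=(n_couples, 2))                 # one predictor per partner
u = rng.normal(scale=sd_couple, size=n_couples)     # couple random intercept
eta = b0 + b_actor * x + b_partner * x[:, ::-1] + u[:, None]
y = (rng.random((n_couples, 2)) < 1 / (1 + np.exp(-eta))).astype(float)

# Pooled logistic fit by Newton-Raphson (ignores the random intercept).
X = np.column_stack([np.ones(2 * n_couples), x.ravel(), x[:, ::-1].ravel()])
Y = y.ravel()
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (Y - p))
```

Even in this simplified fit, the estimated actor coefficient exceeds the partner coefficient, mirroring the generating model; a full dyadic multilevel analysis would additionally estimate the couple-level variance.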
Ben-David, Avishai; Embury, Janon F; Davidson, Charles E
2006-09-10
A comprehensive analytical radiative transfer model for isothermal aerosols and vapors for passive infrared remote sensing applications (ground-based and airborne sensors) has been developed. The theoretical model illustrates the qualitative difference between an aerosol cloud and a chemical vapor cloud. The model is based on two and two/four stream approximations and includes thermal emission-absorption by the aerosols; scattering of diffused sky radiances incident from all sides on the aerosols (downwelling, upwelling, left, and right); and scattering of aerosol thermal emission. The model uses moderate resolution transmittance ambient atmospheric radiances as boundary conditions and provides analytical expressions for the information on the aerosol cloud that is contained in remote sensing measurements by using thermal contrasts between the aerosols and diffused sky radiances. Simulated measurements of a ground-based sensor viewing Bacillus subtilis var. niger bioaerosols and kaolin aerosols are given and discussed to illustrate the differences between a vapor-only model (i.e., only emission-absorption effects) and a complete model that adds aerosol scattering effects.
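The emission-absorption ("vapor-only") limit that the abstract contrasts with the full scattering model reduces to a single-layer radiative transfer expression. The sketch below assumes an isothermal, non-scattering layer; the function names and unit conversion to per-cm⁻¹ radiance are my own choices, not from the paper.

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavenumber_cm, temp_k):
    """Blackbody spectral radiance (W m^-2 sr^-1 per cm^-1)."""
    nu = wavenumber_cm * 100.0  # convert cm^-1 to m^-1
    b_per_m = 2 * H * C**2 * nu**3 / (math.exp(H * C * nu / (KB * temp_k)) - 1.0)
    return b_per_m * 100.0      # per m^-1 -> per cm^-1

def cloud_radiance(l_background, tau, cloud_temp_k, wavenumber_cm):
    """Radiance through an isothermal, non-scattering layer of optical
    depth tau: attenuated background plus the layer's own emission."""
    trans = math.exp(-tau)
    return l_background * trans + planck(wavenumber_cm, cloud_temp_k) * (1 - trans)
```

The detection physics in the abstract follows directly: if the cloud's Planck emission matches the background sky radiance there is no thermal contrast and the layer is invisible to a passive sensor, regardless of tau; aerosol scattering adds the further terms the full model treats.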
NASA Astrophysics Data System (ADS)
Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng
2017-05-01
As an important part of the national energy supply system, natural gas transmission pipelines can cause serious environmental pollution and loss of life and property in the event of an accident. Third-party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third-party damage in order to reduce the number of gas pipeline operation accidents. Because third-party damage accidents are characterized by diversity, complexity, and uncertainty, this paper establishes a quantitative risk assessment model of third-party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). First, the risk sources of third-party damage are identified; then the weight of each factor is determined via an improved AHP; finally, the importance of each factor is calculated by a fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third-party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
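The AHP-plus-FCE pipeline described above can be sketched end to end. This is a generic illustration of the two standard steps (eigenvector weights with a consistency check, then a weighted fuzzy evaluation); the three factor names and all judgment values are hypothetical, not from the paper.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise-comparison matrix via the
    principal eigenvector, plus the consistency ratio (CR)."""
    A = np.asarray(pairwise, float)
    vals, vecs = np.linalg.eig(A)
    k = int(np.argmax(vals.real))
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # random index
    ci = (vals[k].real - n) / (n - 1)
    return w, (ci / ri if ri else 0.0)

def fuzzy_evaluate(weights, membership):
    """Fuzzy comprehensive evaluation: weights (n,) times a membership
    matrix (n factors x m rating grades) gives a normalized grade vector."""
    b = np.asarray(weights) @ np.asarray(membership)
    return b / b.sum()

# Hypothetical three-factor example (e.g. excavation activity, burial
# depth, patrol frequency) rated on high/medium/low risk grades:
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w, cr = ahp_weights(A)
grades = fuzzy_evaluate(w, [[0.6, 0.3, 0.1],
                            [0.2, 0.5, 0.3],
                            [0.1, 0.4, 0.5]])
```

A CR below 0.1 is the conventional threshold for accepting the pairwise judgments; the resulting grade vector then identifies which risk level dominates and which factors drive it.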
Albuquerque De Almeida, Fernando; Al, Maiwenn; Koymans, Ron; Caliskan, Kadir; Kerstens, Ankie; Severens, Johan L
2018-04-01
Describing the general and methodological characteristics of decision-analytical models used in the economic evaluation of early warning systems for the management of chronic heart failure patients, and assessing their methodological quality, is expected to provide concise and useful insight to inform the future development of decision-analytical models in the field of heart failure management. Areas covered: The literature on decision-analytical models for the economic evaluation of early warning systems for the management of chronic heart failure patients was systematically reviewed. Nine electronic databases were searched through the combination of synonyms for heart failure and sensitive filters for cost-effectiveness and early warning systems. Expert commentary: The retrieved models show some variability with regard to their general study characteristics. Overall, they display satisfactory methodological quality, although some points could be improved, namely the consideration and discussion of competing theories regarding model structure and disease progression, the identification of key parameters, the use of expert opinion, and uncertainty analyses. A comprehensive definition of early warning systems and further research under this label should be pursued. To improve the transparency of economic evaluation publications, authors should make detailed technical information regarding the published models available.
Roy, Rajarshi; Desai, Jaydev P.
2016-01-01
This paper outlines a comprehensive parametric approach for quantifying mechanical properties of spatially heterogeneous thin biological specimens such as human breast tissue using contact-mode Atomic Force Microscopy. Using inverse finite element (FE) analysis of spherical nanoindentation, the force response from hyperelastic material models is compared with the predicted force response from existing analytical contact models, and a sensitivity study is carried out to assess the uniqueness of the inverse FE solution. Furthermore, an automation strategy is proposed to analyze AFM force curves with varying levels of material nonlinearity with minimal user intervention. Implementation of our approach on an elastic map acquired from raster AFM indentation of breast tissue specimens indicates that a judicious combination of analytical and numerical techniques allows more accurate interpretation of AFM indentation data than relying on purely analytical contact models, while keeping the computational cost associated with an inverse FE solution within reasonable limits. The results reported in this study have several implications for performing unsupervised data analysis on AFM indentation measurements on a wide variety of heterogeneous biomaterials. PMID:25015130
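The analytical contact model that serves as the baseline in studies like this is typically Hertzian spherical indentation. The sketch below generates a synthetic force curve and recovers the reduced modulus by least squares; the bead radius, modulus, and noise level are invented for illustration and are not the paper's values.

```python
import numpy as np

def hertz_force(delta, e_star, radius):
    """Hertzian force for a rigid sphere indenting an elastic half-space:
    F = (4/3) * E* * sqrt(R) * delta^(3/2)."""
    return (4.0 / 3.0) * e_star * np.sqrt(radius) * delta**1.5

def fit_e_star(delta, force, radius):
    """Least-squares reduced modulus: F is linear in delta^(3/2), so the
    slope of that regression gives (4/3) * E* * sqrt(R)."""
    x = delta**1.5
    slope = (x @ force) / (x @ x)
    return slope / ((4.0 / 3.0) * np.sqrt(radius))

rng = np.random.default_rng(1)
R = 2.5e-6                           # 2.5 um bead radius (illustrative)
E_true = 5.0e3                       # 5 kPa, soft-tissue scale
delta = np.linspace(0.0, 500e-9, 50) # indentation depths up to 500 nm
F = hertz_force(delta, E_true, R)
F_noisy = F + rng.normal(scale=0.02 * F.max(), size=F.size)
E_fit = fit_e_star(delta, F_noisy, R)
```

For deep indentations of soft nonlinear tissue this linear-elastic fit breaks down, which is exactly where the paper's inverse FE analysis with hyperelastic material models takes over.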
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J.; Vespignani, Alessandro; Deshpande, Alina
2016-01-28
Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.
Lima, Margarete Maria de; Reibnitz, Kenya Schmidt; Kloh, Daiana; Martini, Jussara Gue; Backes, Vania Marli Schubert
2017-11-27
To analyze how the indications of comprehensiveness translate into the teaching-learning process in an undergraduate nursing course. Qualitative case study carried out with professors of an undergraduate nursing course. Data were collected through documentary analysis, non-participant observation, and individual interviews, and were analyzed using an analytical matrix following the steps of the operative proposal. Eight professors participated in the study. Some indications of comprehensiveness, such as dialogue, listening, mutual respect, bonding, and welcoming, are present in the daily practice of some professors, who apply them in the pedagogical relationship. The results point to comprehensiveness in teaching-learning as a single- and double-loop model, in which professor and student assume an open posture toward new possibilities in the teaching-learning process. When recognized as a pedagogical principle, comprehensiveness allows a break with professor-centered teaching and advances toward collective learning, enabling professor and student to create their own design, anchored in a reflective process about their practices and the reality found in the health services.
Optimizing an immersion ESL curriculum using analytic hierarchy process.
Tang, Hui-Wen Vivian
2011-11-01
The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative importance of course criteria for the purpose of tailoring an optimal one-week immersion English as a second language (ESL) curriculum for elementary school students in a suburban county of Taiwan. The hierarchy model and AHP analysis utilized in the present study will be useful for resolving several important multi-criteria decision-making issues in planning and evaluating ESL programs. This study also offers valuable insights and provides a basis for further research in customizing ESL curriculum models for different student populations with distinct learning needs, goals, and socioeconomic backgrounds. Copyright © 2011 Elsevier Ltd. All rights reserved.
Influence of Wake Models on Calculated Tiltrotor Aerodynamics
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2001-01-01
The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will examine the influence of wake models on calculated tiltrotor aerodynamics, comparing calculations of performance and airloads with TRAM DNW measurements. The calculations will be performed using the comprehensive analysis CAMRAD II.
McGinitie, Teague M; Harynuk, James J
2012-09-14
A method was developed to accurately predict both the primary and secondary retention times for a series of alkanes, ketones, and alcohols in a flow-modulated GC×GC system. This was accomplished through the use of a three-parameter thermodynamic model in which ΔH, ΔS, and ΔCp for an analyte's interaction with the stationary phases in both dimensions are known. Coupling this thermodynamic model with a time-summation calculation made it possible to accurately predict both ¹tr and ²tr for all analytes. The model was able to predict retention times regardless of the temperature ramp used, with an average error of only 0.64% for ¹tr and an average error of only 2.22% for ²tr. The model shows promise for the accurate prediction of retention times in GC×GC for a wide range of compounds and is able to utilize data collected from 1D experiments. Copyright © 2012 Elsevier B.V. All rights reserved.
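The three-parameter thermodynamic model plus time summation can be sketched for a single column dimension. This is a generic illustration of the approach, not the authors' code: the thermodynamic values, reference temperature, phase ratio, hold-up time, and temperature program below are all assumed numbers.

```python
import math

R_GAS = 8.314  # J mol^-1 K^-1

def retention_factor(T, dH, dS, dCp, T_ref=363.15, beta=250.0):
    """Three-parameter retention factor k(T). dH (J/mol) and dS (J/mol/K)
    are referenced to T_ref; dCp extends the model away from T_ref.
    beta is the column phase ratio (an assumed value here)."""
    dH_T = dH + dCp * (T - T_ref)
    dS_T = dS + dCp * math.log(T / T_ref)
    return math.exp(-(dH_T - T * dS_T) / (R_GAS * T)) / beta

def retention_time(dH, dS, dCp, t_hold=60.0, T0=313.15, ramp=0.167,
                   t_m=30.0, dt=0.01):
    """Temperature-programmed retention time by time summation: advance
    the analyte in small time steps (each covering a column fraction
    dt / (t_m * (1 + k(T)))) until the whole column is traversed."""
    t, migrated = 0.0, 0.0
    while migrated < 1.0:
        T = T0 if t < t_hold else T0 + ramp * (t - t_hold)
        migrated += dt / (t_m * (1.0 + retention_factor(T, dH, dS, dCp)))
        t += dt
    return t
```

A second summation of the same form over the second-dimension column, started at the analyte's first-dimension elution time, yields ²tr; a more strongly retained analyte (larger |ΔH|) elutes later, as the test values show.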
Mechelke, Matthias; Herlet, Jonathan; Benz, J Philipp; Schwarz, Wolfgang H; Zverlov, Vladimir V; Liebl, Wolfgang; Kornberger, Petra
2017-12-01
The rising importance of accurately detecting oligosaccharides in biomass hydrolyzates or as ingredients in food, such as in beverages and infant milk products, demands the availability of tools to sensitively analyze the broad range of available oligosaccharides. Over the last decades, HPAEC-PAD has developed into one of the major technologies for this task and represents a popular alternative to state-of-the-art LC-MS oligosaccharide analysis. This work presents the first comprehensive study giving an overview of the separation of 38 analytes as well as enzymatic hydrolyzates of six different polysaccharides, focusing on oligosaccharides. The high sensitivity of the PAD comes at the cost of stability, owing to recession of the gold electrode. Through an in-depth analysis of the sensitivity drop over time for 35 analytes, including xylo- (XOS), arabinoxylo- (AXOS), laminari- (LOS), manno- (MOS), glucomanno- (GMOS), and cellooligosaccharides (COS), we developed an analyte-specific one-phase decay model for this effect over time. Using this model resulted in significantly improved data normalization when using an internal standard. Our results thereby allow a quantification approach that takes the inevitable, analyte-specific PAD response drop into account. Graphical abstract: HPAEC-PAD analysis of oligosaccharides and determination of the PAD response drop, leading to improved data normalization.
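A one-phase decay model of the kind described is S(t) = span·e^(-kt) + plateau, and fitting it allows each measured peak area to be corrected for detector drift. The sketch below uses simulated data with invented parameter values; the fitting trick (the model is linear in span and plateau for each trial k, so only k needs a search) is a standard device, not necessarily the authors' procedure.

```python
import numpy as np

def one_phase_decay(t, span, k, plateau):
    """Relative PAD response vs. time on the gold electrode."""
    return span * np.exp(-k * t) + plateau

def fit_one_phase_decay(t, y, k_grid=np.linspace(0.005, 0.5, 200)):
    """For each trial rate constant k, solve the linear least-squares
    problem for (span, plateau) and keep the k with the smallest SSE."""
    best = None
    for k in k_grid:
        X = np.column_stack([np.exp(-k * t), np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(((X @ coef - y) ** 2).sum())
        if best is None or sse < best[0]:
            best = (sse, coef[0], k, coef[1])
    return best[1], best[2], best[3]

rng = np.random.default_rng(2)
t = np.linspace(0.0, 40.0, 20)                    # injection times, h
obs = one_phase_decay(t, 0.6, 0.08, 0.4) + rng.normal(scale=0.01, size=t.size)
span, k, plateau = fit_one_phase_decay(t, obs)

# Drift-corrected quantification: divide a raw peak area by the predicted
# relative response at its injection time.
corrected = 980.0 / one_phase_decay(25.0, span, k, plateau)
```

Because the decay is analyte-specific, one such fit per analyte (anchored by the internal standard) is what enables the improved normalization the abstract reports.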
Hettinger, Lawrence J.; Kirlik, Alex; Goh, Yang Miang; Buckle, Peter
2015-01-01
Accurate comprehension and analysis of complex sociotechnical systems is a daunting task. Empirically examining, or simply envisioning the structure and behaviour of such systems challenges traditional analytic and experimental approaches as well as our everyday cognitive capabilities. Computer-based models and simulations afford potentially useful means of accomplishing sociotechnical system design and analysis objectives. From a design perspective, they can provide a basis for a common mental model among stakeholders, thereby facilitating accurate comprehension of factors impacting system performance and potential effects of system modifications. From a research perspective, models and simulations afford the means to study aspects of sociotechnical system design and operation, including the potential impact of modifications to structural and dynamic system properties, in ways not feasible with traditional experimental approaches. This paper describes issues involved in the design and use of such models and simulations and describes a proposed path forward to their development and implementation. Practitioner Summary: The size and complexity of real-world sociotechnical systems can present significant barriers to their design, comprehension and empirical analysis. This article describes the potential advantages of computer-based models and simulations for understanding factors that impact sociotechnical system design and operation, particularly with respect to process and occupational safety. PMID:25761227
A COMPREHENSIVE TOOL AND ANALYTICAL PATHWAY FOR DIFFERENTIAL MOLECULAR PROFILING AND BIOMARKER DISCOVERY (AFRL-RH-WP-TR-2014-0131; Distribution A, approved for public release)
2014-10-20
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael
The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study the uncertainties associated with the Geant4 physics models, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easily expandable collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
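The parameter-variation idea the abstract describes can be sketched in a few lines: re-run the same simulation under several variants of one model parameter and take the spread of the resulting observable as a rough uncertainty. Everything here (the toy `simulate_yield` function, the ±10% variant grid, the half-range estimate) is an invented stand-in for a real Geant4 re-run, not the toolkit's actual API.

```python
def simulate_yield(scale):
    """Toy stand-in for a physics-model observable; the real toolkit
    would re-run Geant4 with the varied parameter (hypothetical)."""
    return 100.0 * scale ** 1.5

# Vary one model parameter over a run-time configurable grid of variants.
variants = [0.9, 0.95, 1.0, 1.05, 1.1]
observables = [simulate_yield(s) for s in variants]

nominal = simulate_yield(1.0)
spread = max(observables) - min(observables)
rel_uncertainty = spread / (2.0 * nominal)  # half-range as a crude uncertainty

print(f"nominal={nominal:.1f}, relative uncertainty ~ {rel_uncertainty:.1%}")
```

In the real toolkit the bookkeeping layer would record which parameter set produced each variant so that observables can be compared across many models at once.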
NASA Technical Reports Server (NTRS)
Cho, S. Y.; Yetter, R. A.; Dryer, F. L.
1992-01-01
Various chemically reacting flow problems highlighting chemical and physical fundamentals rather than flow geometry are presently investigated by means of a comprehensive mathematical model that incorporates multicomponent molecular diffusion, complex chemistry, and heterogeneous processes, in the interest of obtaining sensitivity-related information. The sensitivity equations were decoupled from those of the model, and then integrated one time-step behind the integration of the model equations, and analytical Jacobian matrices were applied to improve the accuracy of sensitivity coefficients that are calculated together with model solutions.
A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers
Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund; ...
2018-03-28
A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4), ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data were collected by use of a range of analytical methods, and robust orthogonal partial least squares-discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.
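The two-stage scheme described here, a decision tree routing samples to one of two discriminant models, can be illustrated with a minimal sketch. A nearest-centroid classifier stands in for OPLS-DA, which requires a dedicated chemometrics package; the biomarker vectors, the protein-level threshold, and the class assignments are all hypothetical.

```python
# Two-stage classification sketch: a threshold rule routes samples to one of
# two discriminant models, mimicking the paper's decision tree + two OPLS-DA
# models. The centroid classifier is a stand-in for OPLS-DA; feature values
# are invented.
def centroid_classifier(train):
    cents = {}
    for label, rows in train.items():
        n = len(rows)
        cents[label] = [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]
    def predict(x):
        def d2(c): return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(cents, key=lambda lab: d2(cents[lab]))
    return predict

# Hypothetical biomarker vectors: [carbohydrate score, fatty-acid score]
crude = {"PM1": [[1.0, 0.2], [1.1, 0.3]], "PM2": [[0.4, 0.9], [0.5, 1.0]]}
precip = {"PM3": [[2.0, 1.8], [2.1, 1.9]], "PM4": [[3.0, 0.5], [3.1, 0.4]]}
model_a, model_b = centroid_classifier(crude), centroid_classifier(precip)

def classify(sample, protein_level):
    # Decision-tree stage: a high protein level suggests a precipitation method.
    return model_b(sample) if protein_level > 0.5 else model_a(sample)

print(classify([1.05, 0.25], protein_level=0.1))  # routed to the crude branch
```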
Thinking through Text Comprehension III: The Programing of Verbal and Investigative Repertoires
ERIC Educational Resources Information Center
Leon, Marta; Layng, T. V. Joe; Sota, Melinda
2011-01-01
Reading comprehension can be considered a complex human performance involving two integrated repertoires: a verbal repertoire and an investigative (generative) repertoire. The analytical and reasoning skills necessary to demonstrate reading comprehension can be systematically taught by analyzing the verbal and investigative repertoires involved…
Tighe, Elizabeth L; Schatschneider, Christopher
2016-07-01
The current study employed a meta-analytic approach to investigate the relative importance of component reading skills to reading comprehension in struggling adult readers. A total of 10 component skills were consistently identified across 16 independent studies and 2,707 participants. Random effects models generated 76 predictor-reading comprehension effect sizes among the 10 constructs. The results indicated that six of the component skills exhibited strong relationships with reading comprehension (average rs ≥ .50): morphological awareness, language comprehension, fluency, oral vocabulary knowledge, real word decoding, and working memory. Three of the component skills yielded moderate relationships with reading comprehension (average rs ≥ .30 and < .50): pseudoword decoding, orthographic knowledge, and phonological awareness. Rapid automatized naming (RAN) was the only component skill that was weakly related to reading comprehension (r = .15). Morphological awareness was a significantly stronger correlate of reading comprehension than phonological awareness and RAN. This study provides the first attempt at a systematic synthesis of the recent research investigating the reading skills of adults with low literacy skills, a historically understudied population. Directions for future research, the relation of our results to the children's literature, and the implications for researchers and adult basic education programs are discussed. © Hammill Institute on Disabilities 2014.
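The core computation behind such a meta-analysis, random-effects pooling of correlations, can be sketched with Fisher's z transform and the DerSimonian-Laird between-study variance estimate. The correlations and sample sizes below are illustrative, not the study's data.

```python
import math

def pooled_correlation(rs, ns):
    """Random-effects pooling of correlations via Fisher's z
    (DerSimonian-Laird between-study variance estimate)."""
    zs = [math.atanh(r) for r in rs]
    vs = [1.0 / (n - 3) for n in ns]          # within-study variance of z
    w = [1.0 / v for v in vs]
    z_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, zs))
    df = len(rs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)             # between-study variance
    w_star = [1.0 / (v + tau2) for v in vs]
    z_re = sum(wi * zi for wi, zi in zip(w_star, zs)) / sum(w_star)
    return math.tanh(z_re)

# Illustrative effect sizes for one component skill across three studies.
r = pooled_correlation([0.55, 0.48, 0.60], [120, 250, 90])
print(f"pooled r = {r:.2f}")
```

A pooled r of about .50 or above would fall in the "strong relationship" band the authors use.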
A comprehensive model of ion diffusion and charge exchange in the cold Io torus
NASA Technical Reports Server (NTRS)
Barbosa, D. D.; Moreno, M. A.
1988-01-01
A comprehensive analytic model of radial diffusion in the cold Io torus is developed. The model involves a generalized molecular cloud theory of SO2 and its dissociation fragments SO, O2, S, and O, which are formed at a relatively large rate by solar UV photodissociation of SO2. The key component of the new theory is SO, which can react with S(+) through a near-resonant charge exchange process that is exothermic. This provides a mechanism for the rapid depletion of singly ionized sulfur in the cold torus and can account for the large decrease in the total flux tube content inward of Io's orbit. The model is used to demonstrate quantitatively the effects of radial diffusion in a charge exchange environment that acts as a combined source and sink for ions in various charge states. A detailed quantitative explanation for the O(2+) component of the cold torus is given, and insight is derived into the workings of the so-called plasma 'ribbon'.
A Concept Analysis of Holistic Care by Hybrid Model.
Jasemi, Madineh; Valizadeh, Leila; Zamanzadeh, Vahid; Keogh, Brian
2017-01-01
Even though holistic care has been widely discussed in the health care and professional nursing literature, there is no comprehensive definition of it. Therefore, the aim of this article is to present a concept analysis of holistic care developed using the hybrid model. The hybrid model comprises three phases. In the theoretical phase, characteristics of holistic care were identified through a review of the literature from the CINAHL, MEDLINE, PubMed, OVID, and Google Scholar databases. During the fieldwork phase, in-depth interviews were conducted with eight purposely selected nurses. Finally, following an analysis of the literature and the qualitative interviews, a theoretical description of the concept of holistic care was extracted. Two main themes emerged from the analytical phase: "holistic care for offering a comprehensive model for caring" and "holistic care for improving patients' and nurses' conditions." By undertaking a conceptual analysis of holistic care, its meaning can be clarified, which will encourage nursing educators to include holistic care in nursing syllabi and consequently facilitate its provision in practice.
Li, Maozhong; Du, Yunai; Wang, Qiyue; Sun, Chunmeng; Ling, Xiang; Yu, Boyang; Tu, Jiasheng; Xiong, Yerong
2016-01-01
As essential components of formulations, pharmaceutical excipients directly affect the safety, efficacy, and stability of drugs. Recent safety incidents in which pharmaceutical excipients posed serious threats to patients highlight the necessity of controlling the potential risks. Hence, it is indispensable for the industry to establish an effective risk assessment system for the supply chain. In this study, an AHP-fuzzy comprehensive evaluation model was developed based on the analytic hierarchy process and fuzzy mathematical theory, which quantitatively assessed the risks of the supply chain. Taking polysorbate 80 as the example for model analysis, it was concluded that polysorbate 80 for injection use is a higher-risk ingredient in the supply chain than that for oral use; to achieve safe application in the clinic, measures should be taken to control and minimize those risks.
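The AHP-fuzzy pipeline can be sketched in two steps: derive criterion weights from a pairwise-comparison matrix (here by power iteration toward the principal eigenvector), then combine them with a fuzzy membership matrix, B = W · R. The criteria, comparison values, and membership degrees are invented for illustration, not taken from the study.

```python
# AHP weights from a pairwise-comparison matrix, then a fuzzy comprehensive
# evaluation. The 3x3 matrix and membership degrees are hypothetical.
def ahp_weights(m, iters=100):
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):                       # power iteration
        w = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Criteria: supplier quality, transport, storage (Saaty 1-9 scale).
pairwise = [[1, 3, 5],
            [1 / 3, 1, 2],
            [1 / 5, 1 / 2, 1]]
w = ahp_weights(pairwise)

# R[i][j]: membership of criterion i in risk level j (low, medium, high).
r_matrix = [[0.1, 0.3, 0.6],
            [0.5, 0.4, 0.1],
            [0.6, 0.3, 0.1]]
b = [sum(w[i] * r_matrix[i][j] for i in range(3)) for j in range(3)]
levels = ["low", "medium", "high"]
print("risk level:", levels[b.index(max(b))])
```

With these made-up numbers the dominant criterion (supplier quality) drags the overall rating to "high", mirroring how the injection-grade supply chain scored worse than the oral-grade one.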
A Holistic Management Architecture for Large-Scale Adaptive Networks
2007-09-01
transmission and processing overhead required for management. The challenges of building models to describe dynamic systems are well-known to the field of... increases the challenge of finding a simple approach to assessing the state of the network. Moreover, the performance state of one network link may be... challenging. These obstacles indicate the need for a less comprehensive-analytical, more systemic-holistic approach to managing networks. This approach might
Assessing Analytical Similarity of Proposed Amgen Biosimilar ABP 501 to Adalimumab.
Liu, Jennifer; Eris, Tamer; Li, Cynthia; Cao, Shawn; Kuhns, Scott
2016-08-01
ABP 501 is being developed as a biosimilar to adalimumab. Comprehensive comparative analytical characterization studies have been conducted and completed. The objective of this study was to assess analytical similarity between ABP 501 and two adalimumab reference products (RPs), licensed by the United States Food and Drug Administration (adalimumab [US]) and authorized by the European Union (adalimumab [EU]), using state-of-the-art analytical methods. Comprehensive analytical characterization incorporating orthogonal analytical techniques was used to compare products. Physicochemical property comparisons comprised the primary structure related to amino acid sequence and post-translational modifications including glycans; higher-order structure; primary biological properties mediated by target and receptor binding; product-related substances and impurities; host-cell impurities; general properties of the finished drug product, including strength and formulation; subvisible and submicron particles and aggregates; and forced thermal degradation. ABP 501 had the same amino acid sequence and similar post-translational modification profiles compared with adalimumab RPs. Primary structure, higher-order structure, and biological activities were similar for the three products. Product-related size and charge variants and aggregate and particle levels were also similar. ABP 501 had very low residual host-cell protein and DNA. The finished ABP 501 drug product has the same strength with regard to protein concentration and fill volume as adalimumab RPs. ABP 501 and the RPs had a similar stability profile both in normal storage and thermal stress conditions. Based on the comprehensive analytical similarity assessment, ABP 501 was found to be similar to adalimumab with respect to physicochemical and biological properties.
Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.
Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis
2016-07-01
Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description on the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scope of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. 
The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, on a scale of 0-100). Despite the advancements in decision-analytic models of AD, there remain several areas of improvement that are necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.
Aerodynamic and acoustic test of a United Technologies model scale rotor at DNW
NASA Technical Reports Server (NTRS)
Yu, Yung H.; Liu, Sandy R.; Jordan, Dave E.; Landgrebe, Anton J.; Lorber, Peter F.; Pollack, Michael J.; Martin, Ruth M.
1990-01-01
The UTC model scale rotors, the DNW wind tunnel, the AFDD rotary wing test stand, the UTRC and AFDD aerodynamic and acoustic data acquisition systems, and the scope of test matrices are discussed and an introduction to the test results is provided. It is pointed out that a comprehensive aero/acoustic database of several configurations of the UTC scaled model rotor has been created. The data is expected to improve understanding of rotor aerodynamics, acoustics, and dynamics, and lead to enhanced analytical methodology and design capabilities for the next generation of rotorcraft.
Using learning analytics to evaluate a video-based lecture series.
Lau, K H Vincent; Farooque, Pue; Leydon, Gary; Schwartz, Michael L; Sadler, R Mark; Moeller, Jeremy J
2018-01-01
The video-based lecture (VBL), an important component of the flipped classroom (FC) and massive open online course (MOOC) approaches to medical education, has primarily been evaluated through direct learner feedback. Evaluation may be enhanced through learner analytics (LA) - analysis of quantitative audience usage data generated by video-sharing platforms. We applied LA to an experimental series of ten VBLs on electroencephalography (EEG) interpretation, uploaded to YouTube in the model of a publicly accessible MOOC. Trends in view count; total percentage of video viewed and audience retention (AR) (percentage of viewers watching at a time point compared to the initial total) were examined. The pattern of average AR decline was characterized using regression analysis, revealing a uniform linear decline in viewership for each video, with no evidence of an optimal VBL length. Segments with transient increases in AR corresponded to those focused on core concepts, indicative of content requiring more detailed evaluation. We propose a model for applying LA at four levels: global, series, video, and feedback. LA may be a useful tool in evaluating a VBL series. Our proposed model combines analytics data and learner self-report for comprehensive evaluation.
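The regression characterization of a uniform linear AR decline amounts to an ordinary least-squares line fit of retention against playback position. A minimal sketch with synthetic retention data (not the study's YouTube analytics):

```python
# Linear fit of audience retention (AR) vs. playback position. The AR
# samples below are synthetic stand-ins for video-sharing platform data.
def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

positions = [0, 25, 50, 75, 100]      # percent of video elapsed
retention = [100, 88, 75, 64, 51]     # percent of initial viewers still watching
slope, intercept = linear_fit(positions, retention)
print(f"AR declines ~{-slope:.2f} points per percent of video")
```

Segments whose residuals sit well above the fitted line would correspond to the transient AR increases the authors flag as core-concept segments worth closer evaluation.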
Micromechanics Analysis Code (MAC) User Guide: Version 1.0
NASA Technical Reports Server (NTRS)
Wilt, T. E.; Arnold, S. M.
1994-01-01
The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions, it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide is presented for the recently developed, computationally efficient, and comprehensive micromechanics analysis code MAC, whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control), and thermomechanical load histories can be imposed; (2) different integration algorithms may be selected; (3) a variety of constituent constitutive models may be utilized and/or implemented; and (4) a variety of fiber architectures may be easily accessed through their corresponding representative volume elements.
Micromechanics Analysis Code (MAC). User Guide: Version 2.0
NASA Technical Reports Server (NTRS)
Wilt, T. E.; Arnold, S. M.
1996-01-01
The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions, it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide is presented for the recently developed, computationally efficient, and comprehensive micromechanics analysis code (MAC), whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control), and thermomechanical load histories can be imposed, (2) different integration algorithms may be selected, (3) a variety of constituent constitutive models may be utilized and/or implemented, and (4) a variety of fiber and laminate architectures may be easily accessed through their corresponding representative volume elements.
A Dynamic Calibration Method for Experimental and Analytical Hub Load Comparison
NASA Technical Reports Server (NTRS)
Kreshock, Andrew R.; Thornburgh, Robert P.; Wilbur, Matthew L.
2017-01-01
This paper presents the results from an ongoing effort to produce improved correlation between analytical hub force and moment predictions and those measured during wind-tunnel testing on the Aeroelastic Rotor Experimental System (ARES), a conventional rotor testbed commonly used at the Langley Transonic Dynamics Tunnel (TDT). A frequency-dependent transformation between loads at the rotor hub and outputs of the testbed balance is produced from frequency response functions measured during vibration testing of the system. The resulting transformation is used as a dynamic calibration of the balance to transform hub loads predicted by comprehensive analysis into predicted balance outputs. In addition to detailing the transformation process, this paper also presents a set of wind-tunnel test cases, with comparisons between the measured balance outputs and transformed predictions from the comprehensive analysis code CAMRAD II. The modal response of the testbed is discussed and compared to a detailed finite-element model. Results reveal that the modal response of the testbed exhibits a number of characteristics that make accurate dynamic balance predictions challenging, even with the use of the balance transformation.
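At each frequency, the balance transformation amounts to a matrix multiply: predicted balance outputs are the measured FRF matrix H(f) applied to the predicted hub-load vector. A toy two-channel sketch, where the FRF entries and hub loads are made up rather than ARES measurements:

```python
# Frequency-domain balance prediction sketch: multiply the predicted
# hub-load vector by the measured FRF matrix H(f) at each rotor harmonic.
def matvec(h, x):
    return [sum(h[i][j] * x[j] for j in range(len(x))) for i in range(len(h))]

# H[f] maps [hub force, hub moment] -> [balance ch.1, balance ch.2];
# complex entries encode gain and phase at that harmonic (illustrative).
frf = {
    1: [[1.0 + 0.0j, 0.1j], [0.0j, 0.9 + 0.0j]],   # 1/rev
    4: [[0.6 + 0.2j, 0.3j], [0.1j, 0.5 - 0.1j]],   # 4/rev
}
hub_loads = {1: [100.0 + 0j, 20.0 + 0j], 4: [40.0 + 0j, 10.0 + 0j]}

balance_pred = {f: matvec(frf[f], hub_loads[f]) for f in frf}
for f, out in sorted(balance_pred.items()):
    print(f"{f}/rev predicted balance amplitudes:",
          [round(abs(c), 1) for c in out])
```

In practice H(f) comes from vibration testing, so its conditioning at each harmonic limits how well the inverse problem (hub loads from balance outputs) can be posed.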
Makarov, Sergey N.; Yanamadala, Janakinadh; Piazza, Matthew W.; Helderman, Alex M.; Thang, Niang S.; Burnham, Edward H.; Pascual-Leone, Alvaro
2016-01-01
Goals: Transcranial magnetic stimulation (TMS) is increasingly used as a diagnostic and therapeutic tool for numerous neuropsychiatric disorders. The use of TMS might cause whole-body exposure to undesired induced currents in patients and TMS operators. The aim of the present study is to test and justify a simple analytical model known previously, which may be helpful as an upper estimate of eddy current density at a particular distant observation point for any body composition and any coil setup. Methods: We compare the analytical solution with comprehensive adaptive mesh refinement-based FEM simulations of a detailed full-body human model, two coil types, five coil positions, about 100,000 observation points, and two distinct pulse rise times, thus providing a representative number of different data sets for comparison, while also using other numerical data. Results: Our simulations reveal that, after a certain modification, the analytical model provides an upper estimate for the eddy current density at any location within the body. In particular, it overestimates the peak eddy currents at distant locations from a TMS coil by a factor of 10 on average. Conclusion: The simple analytical model tested in the present study may be valuable as a rapid method to safely estimate levels of TMS currents at different locations within a human body. Significance: At present, safe limits of general exposure to TMS electric and magnetic fields are an open subject, including fetal exposure for pregnant women. PMID:26685221
Further Results of Soft-Inplane Tiltrotor Aeromechanics Investigation Using Two Multibody Analyses
NASA Technical Reports Server (NTRS)
Masarati, Pierangelo; Quaranta, Giuseppe; Piatak, David J.; Singleton, Jeffrey D.
2004-01-01
This investigation focuses on the development of multibody analytical models to predict the dynamic response, aeroelastic stability, and blade loading of a soft-inplane tiltrotor wind-tunnel model. Comprehensive rotorcraft-based multibody analyses enable modeling of the rotor system to a high level of detail such that complex mechanics and nonlinear effects associated with control system geometry and joint deadband may be considered. The influence of these and other nonlinear effects on the aeromechanical behavior of the tiltrotor model are examined. A parametric study of the design parameters which may have influence on the aeromechanics of the soft-inplane rotor system are also included in this investigation.
Initial study of thermal energy storage in unconfined aquifers. [UCATES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haitjema, H.M.; Strack, O.D.L.
1986-04-01
Convective heat transport in unconfined aquifers is modeled in a semi-analytic way. The transient groundwater flow is modeled by superposition of analytic functions, whereby changes in the aquifer storage are represented by a network of triangles, each with a linearly varying sink distribution. This analytic formulation incorporates the nonlinearity of the differential equation for unconfined flow and eliminates numerical dispersion in modeling heat convection. The thermal losses through the aquifer base and vadose zone are modeled rather crudely. Only vertical heat conduction is considered in these boundaries, whereby a linearly varying temperature is assumed at all times. The latter assumption appears reasonable for thin aquifer boundaries. However, assuming such thin aquifer boundaries may lead to an overestimation of the thermal losses when the aquifer base is regarded as infinitely thick in reality. The approach is implemented in the computer program UCATES, which serves as a first step toward the development of a comprehensive screening tool for ATES systems in unconfined aquifers. In its present form, the program is capable of predicting the relative effects of regional flow on the efficiency of ATES systems. However, only after a more realistic heat-loss mechanism is incorporated in UCATES will reliable predictions of absolute ATES efficiencies be possible.
Comprehensive analytical model for locally contacted rear surface passivated solar cells
NASA Astrophysics Data System (ADS)
Wolf, Andreas; Biro, Daniel; Nekarda, Jan; Stumpp, Stefan; Kimmerle, Achim; Mack, Sebastian; Preu, Ralf
2010-12-01
For optimum performance of solar cells featuring a locally contacted rear surface, the metallization fraction as well as the size and distribution of the local contacts are crucial, since Ohmic and recombination losses have to be balanced. In this work we present a set of equations that enables this trade-off to be calculated without the need for numerical simulations. Our model combines established analytical and empirical equations to predict the energy conversion efficiency of a locally contacted device. For experimental verification, we fabricate devices from float zone silicon wafers of different resistivity using the laser fired contact technology for forming the local rear contacts. The detailed characterization of test structures enables the determination of important physical parameters, such as the surface recombination velocity at the contacted area and the spreading resistance of the contacts. Our analytical model reproduces the experimental results very well and correctly predicts the optimum contact spacing without the use of free fitting parameters. We use our model to estimate the optimum bulk resistivity for locally contacted devices fabricated from conventional Czochralski-grown silicon material. These calculations use literature values for the stable minority carrier lifetime to account for the bulk recombination caused by the formation of boron-oxygen complexes under carrier injection.
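The Ohmic-versus-recombination balance such a model captures can be caricatured as a one-variable trade-off, loss(p) = a·p + b/p over contact pitch p, whose analytic optimum p* = sqrt(b/a) a grid search reproduces. The coefficients a and b are invented placeholders, not the paper's fitted physical parameters.

```python
import math

# Toy trade-off: Ohmic (spreading-resistance) losses grow with contact
# pitch p, while contact-recombination losses shrink as the metallized
# fraction falls. Coefficients are hypothetical, in relative loss units.
a = 0.02    # ohmic loss per mm of pitch
b = 0.8     # recombination loss times pitch (loss * mm)

def total_loss(p):
    return a * p + b / p

pitches = [p / 10 for p in range(5, 201)]   # 0.5 mm to 20.0 mm grid
best = min(pitches, key=total_loss)
analytic = math.sqrt(b / a)                 # from d/dp (a*p + b/p) = 0
print(f"grid optimum {best:.1f} mm vs analytic {analytic:.2f} mm")
```

The paper's contribution is essentially a physically grounded version of a and b: the spreading resistance and contact recombination velocity measured on test structures replace the placeholders.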
Bridging analytical approaches for low-carbon transitions
NASA Astrophysics Data System (ADS)
Geels, Frank W.; Berkhout, Frans; van Vuuren, Detlef P.
2016-06-01
Low-carbon transitions are long-term multi-faceted processes. Although integrated assessment models have many strengths for analysing such transitions, their mathematical representation requires a simplification of the causes, dynamics and scope of such societal transformations. We suggest that integrated assessment model-based analysis should be complemented with insights from socio-technical transition analysis and practice-based action research. We discuss the underlying assumptions, strengths and weaknesses of these three analytical approaches. We argue that full integration of these approaches is not feasible, because of foundational differences in philosophies of science and ontological assumptions. Instead, we suggest that bridging, based on sequential and interactive articulation of different approaches, may generate a more comprehensive and useful chain of assessments to support policy formation and action. We also show how these approaches address knowledge needs of different policymakers (international, national and local), relate to different dimensions of policy processes and speak to different policy-relevant criteria such as cost-effectiveness, socio-political feasibility, social acceptance and legitimacy, and flexibility. A more differentiated set of analytical approaches thus enables a more differentiated approach to climate policy making.
Validation of a common data model for active safety surveillance research
Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E
2011-01-01
Objective: Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example: To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion: There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893
A six-parameter Iwan model and its application
NASA Astrophysics Data System (ADS)
Li, Yikun; Hao, Zhiming
2016-02-01
The Iwan model is a practical tool for describing the constitutive behavior of joints. In this paper, a six-parameter Iwan model based on a truncated power-law distribution with two Dirac delta functions is proposed, which gives a more comprehensive description of joints than previous Iwan models. Its analytical expressions, including the backbone curve, unloading curves and energy dissipation, are deduced. Parameter identification procedures and the discretization method are also provided. A model application based on Segalman et al.'s experimental work on bolted joints is carried out. Simulation effects of different numbers of Jenkins elements are discussed. The results indicate that the six-parameter Iwan model can accurately reproduce the experimental phenomena of joints.
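The parallel-Jenkins discretization mentioned in the abstract can be illustrated with a minimal sketch. The stiffness and slider strengths below are made-up illustrations, not the paper's six-parameter distribution:

```python
# Sketch of an Iwan model discretized into N parallel Jenkins elements
# (a spring of stiffness k in series with a Coulomb slider of strength f).
# Slider strengths are illustrative placeholders.

def iwan_backbone(x, k, sliders):
    """Restoring force at displacement x under monotonic loading.

    Each Jenkins element contributes k*x while it sticks, and the
    constant slider strength f after it yields at x = f/k.
    """
    return sum(min(k * x, f) for f in sliders)

sliders = [0.5, 1.0, 1.5, 2.0]   # illustrative slider strengths
k = 10.0                          # illustrative element stiffness

# Below the first yield displacement the response is linear in x:
f_small = iwan_backbone(0.01, k, sliders)   # all elements stick: 4*k*x = 0.4
# At large displacement every slider has yielded and the force saturates:
f_large = iwan_backbone(10.0, k, sliders)   # sum of slider strengths = 5.0
```

Refining the slider-strength distribution (more elements, strengths drawn from the model's density) smooths the backbone curve toward the continuous Iwan model.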
ERIC Educational Resources Information Center
Bodily, Robert; Verbert, Katrien
2017-01-01
This article is a comprehensive literature review of student-facing learning analytics reporting systems that track learning analytics data and report it directly to students. This literature review builds on four previously conducted literature reviews in similar domains. Out of the 945 articles retrieved from databases and journals, 93 articles…
Progress and development of analytical methods for gibberellins.
Pan, Chaozhi; Tan, Swee Ngin; Yong, Jean Wan Hong; Ge, Liya
2017-01-01
Gibberellins, as a group of phytohormones, exhibit a wide variety of bio-functions in plant growth and development and have been used to increase crop yields. Many analytical procedures have therefore been developed for determining the types and levels of endogenous and exogenous gibberellins. As plant tissues contain gibberellins in trace amounts (usually at the level of nanograms per gram fresh weight or even lower), the sample pre-treatment steps (extraction, pre-concentration, and purification) for gibberellins are reviewed in detail. The primary focus of this comprehensive review is on the various analytical methods designed to meet the requirements of gibberellin analyses in complex matrices, with particular emphasis on high-throughput analytical methods, such as gas chromatography, liquid chromatography, and capillary electrophoresis, mostly combined with mass spectrometry. The advantages and drawbacks of each described analytical method are discussed. The overall aim of this review is to provide a comprehensive and critical view of the different analytical methods nowadays employed to analyze gibberellins in complex sample matrices, and their foreseeable trends. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Chen, Chen; Schneps, Matthew H; Masyn, Katherine E; Thomson, Jennifer M
2016-11-01
Increasing evidence has shown visual attention span to be a factor, distinct from phonological skills, that explains single-word identification (pseudo-word/word reading) performance in dyslexia. Yet, little is known about how well visual attention span explains text comprehension. Observing reading comprehension in a sample of 105 high school students with dyslexia, we used a pathway analysis to examine the direct and indirect path between visual attention span and reading comprehension while controlling for other factors such as phonological awareness, letter identification, short-term memory, IQ and age. Integrating phonemic decoding efficiency skills in the analytic model, this study aimed to disentangle how visual attention span and phonological skills work together in reading comprehension for readers with dyslexia. We found visual attention span to have a significant direct effect on more difficult reading comprehension but not on an easier level. It also had a significant direct effect on pseudo-word identification but not on word identification. In addition, we found that visual attention span indirectly explains reading comprehension through pseudo-word reading and word reading skills. This study supports the hypothesis that at least part of the dyslexic profile can be explained by visual attention abilities. Copyright © 2016 John Wiley & Sons, Ltd.
Tan, Nicholas X.; Rydzak, Chara; Yang, Li-Gang; Vickerman, Peter; Yang, Bin; Peeling, Rosanna W.; Hawkes, Sarah; Chen, Xiang-Sheng; Tucker, Joseph D.
2013-01-01
Background: Syphilis is a major public health problem in many regions of China, with increases in congenital syphilis (CS) cases causing concern. The Chinese Ministry of Health recently announced a comprehensive 10-y national syphilis control plan focusing on averting CS. The decision analytic model presented here quantifies the impact of the planned strategies to determine whether they are likely to meet the goals laid out in the control plan. Methods and Findings: Our model incorporated data on age-stratified fertility, female adult syphilis cases, and empirical syphilis transmission rates to estimate the number of CS cases associated with prenatal syphilis infection on a yearly basis. Guangdong Province was the focus of this analysis because of the availability of high-quality demographic and public health data. Each model outcome was simulated 1,000 times to incorporate uncertainty in model inputs. The model was validated using data from a CS intervention program among 477,656 women in China. Sensitivity analyses were performed to identify which variables are likely to be most influential in achieving Chinese and international policy goals. Increasing prenatal screening coverage was the single most effective strategy for reducing CS cases. An incremental increase in prenatal screening from the base case of 57% coverage to 95% coverage was associated with 106 (95% CI: 101, 111) CS cases averted per 100,000 live births (58% decrease). The policy strategies laid out in the national plan led to an outcome that fell short of the target, while a four-pronged comprehensive syphilis control strategy consisting of increased prenatal screening coverage, increased treatment completion, earlier prenatal screening, and improved syphilis test characteristics was associated with 157 (95% CI: 154, 160) CS cases averted per 100,000 live births (85% decrease).
Conclusions: The Chinese national plan provides a strong foundation for syphilis control, but more comprehensive measures that include earlier and more extensive screening are necessary for reaching policy goals. PMID:23349624
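The uncertainty-propagation design described in the abstract, where each outcome is simulated 1,000 times over sampled inputs, can be sketched as follows. The model form, parameter ranges, and function name are illustrative assumptions, not the published Guangdong model:

```python
import random

random.seed(1)

def cs_cases_per_100k(coverage, prev, vert_tx, treat_eff):
    """Illustrative congenital-syphilis cases per 100,000 live births:
    infected pregnancies transmit unless screened and effectively treated."""
    detected = coverage * treat_eff
    return 100_000 * prev * vert_tx * (1 - detected)

# Propagate input uncertainty with 1,000 draws, as in the paper's design.
# All parameter ranges below are invented for illustration.
draws = []
for _ in range(1000):
    prev = random.uniform(0.004, 0.006)      # maternal syphilis prevalence
    vert_tx = random.uniform(0.4, 0.6)       # vertical transmission probability
    treat_eff = random.uniform(0.90, 0.98)   # treatment effectiveness
    base = cs_cases_per_100k(0.57, prev, vert_tx, treat_eff)     # 57% screening
    scaled = cs_cases_per_100k(0.95, prev, vert_tx, treat_eff)   # 95% screening
    draws.append(base - scaled)              # cases averted by wider screening

draws.sort()
median = draws[500]
ci = (draws[25], draws[975])                 # empirical 95% interval
```

Reporting the median with an empirical 2.5th-97.5th percentile interval is what produces confidence bounds like the "106 (95% CI: 101, 111)" figures quoted above.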
Towards a comprehensive city emission function (CCEF)
NASA Astrophysics Data System (ADS)
Kocifaj, Miroslav
2018-01-01
The comprehensive city emission function (CCEF) is developed for heterogeneous light-emitting or light-blocking urban environments, embracing any combination of input parameters that characterize linear dimensions in the system (size of and distances between buildings or luminaires), properties of light-emitting elements (such as luminous building façades and street lighting), ground reflectance and total uplight fraction, all of these defined for an arbitrarily sized 2D area. The analytical formula obtained is not restricted to a single model class, as it can capture any specific light-emission feature for a wide range of cities. The CCEF method is numerically fast, in contrast to what can be expected of other probabilistic approaches that rely on repeated random sampling. Hence the present solution has great potential in light-pollution modeling and can be included in larger numerical models. Our theoretical findings promise great progress in light-pollution modeling, as this is the first time an analytical solution for the city emission function (CEF) has been developed that depends on the statistical mean size and height of city buildings, inter-building separation, prevailing heights of light fixtures, lighting density, and other factors such as luminaire light output and light distribution, including the amount of uplight, and representative city size. The model is validated for sensitivity and specificity pertinent to combinations of input parameters in order to test its behavior under various conditions, including those that can occur in complex urban environments. It is demonstrated that the model succeeds in reproducing a light-emission peak at some elevated zenith angles and is consistent with reduced rather than enhanced emission in directions nearly parallel to the ground.
Assessment of water droplet evaporation mechanisms on hydrophobic and superhydrophobic substrates.
Pan, Zhenhai; Dash, Susmita; Weibel, Justin A; Garimella, Suresh V
2013-12-23
Evaporation rates are predicted and important transport mechanisms identified for the evaporation of water droplets on hydrophobic (contact angle ~110°) and superhydrophobic (contact angle ~160°) substrates. Analytical models for droplet evaporation in the literature are usually simplified to include only vapor diffusion in the gas domain, and the system is assumed to be isothermal. In the comprehensive model developed in this study, evaporative cooling of the interface is accounted for, and vapor concentration is coupled to local temperature at the interface. Conjugate heat and mass transfer are solved in the solid substrate, liquid droplet, and surrounding gas. Buoyancy-driven convective flows in the droplet and vapor domains are also simulated. The influences of evaporative cooling and convection on the evaporation characteristics are determined quantitatively. The liquid-vapor interface temperature drop induced by evaporative cooling suppresses evaporation, while gas-phase natural convection acts to enhance evaporation. While the effects of these competing transport mechanisms are observed to counterbalance each other for evaporation on a hydrophobic surface, the stronger influence of evaporative cooling on a superhydrophobic surface accounts for an overprediction of experimental evaporation rates by ~20% with vapor diffusion-based models. The local evaporation fluxes along the liquid-vapor interface for both hydrophobic and superhydrophobic substrates are investigated. The highest local evaporation flux occurs at the three-phase contact line region due to proximity to the higher-temperature substrate, rather than at the relatively colder droplet top; vapor diffusion-based models predict the opposite. The numerically calculated evaporation rates agree with experimental results to within 2% for superhydrophobic substrates and 3% for hydrophobic substrates.
The large deviations between past analytical models and the experimental data are therefore reconciled with the comprehensive model developed here.
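For contrast with the comprehensive model above, the isothermal vapor-diffusion baseline that such models extend is Maxwell's classical result for an isolated spherical droplet, dm/dt = 4πRD(c_sat − c_∞). The property values in this sketch are rough illustrative figures for water in air, and sessile droplets additionally require a contact-angle shape factor:

```python
import math

def maxwell_evaporation_rate(radius_m, D, c_sat, humidity):
    """Diffusion-limited evaporation rate (kg/s) of an isolated spherical
    droplet: dm/dt = 4*pi*R*D*(c_sat - c_inf), with c_inf = humidity*c_sat.
    Sessile droplets need an extra shape factor f(contact angle)."""
    return 4 * math.pi * radius_m * D * c_sat * (1 - humidity)

# Order-of-magnitude property values for water at roughly room temperature:
rate = maxwell_evaporation_rate(
    radius_m=0.5e-3,   # 0.5 mm droplet radius
    D=2.5e-5,          # vapor diffusivity in air, m^2/s
    c_sat=2.3e-2,      # saturation vapor concentration, kg/m^3
    humidity=0.40,     # ambient relative humidity
)
# rate is a few nanograms per second, a plausible scale for a sub-mm droplet
```

The ~20% overprediction on superhydrophobic substrates discussed in the abstract is precisely what this diffusion-only baseline misses by neglecting evaporative cooling of the interface.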
Sensitivity analysis for high-contrast missions with segmented telescopes
NASA Astrophysics Data System (ADS)
Leboulleux, Lucie; Sauvage, Jean-François; Pueyo, Laurent; Fusco, Thierry; Soummer, Rémi; N'Diaye, Mamadou; St. Laurent, Kathryn
2017-09-01
Segmented telescopes enable large-aperture space telescopes for the direct imaging and spectroscopy of habitable worlds. However, the increased complexity of their aperture geometry, due to their central obstruction, support structures, and segment gaps, makes high-contrast imaging very challenging. In this context, we present an analytical model that makes it possible to establish a comprehensive error budget to evaluate the constraints on the segments and the influence of the error terms on the final image and contrast. Indeed, the target contrast of 10^10 required to image Earth-like planets imposes drastic conditions, both in terms of segment alignment and telescope stability. Although space telescopes evolve in a friendlier environment than ground-based telescopes, remaining vibrations and resonant modes of the segments can still deteriorate the contrast. In this communication, we develop and validate the analytical model, and compare its outputs to images produced by end-to-end simulations.
A comprehensive risk analysis of coastal zones in China
NASA Astrophysics Data System (ADS)
Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying
2014-03-01
Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and high population density. In order to estimate their capability for crisis response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second, the severity phase, comprises geographic exposure and natural disasters. After weighting all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of the different risks in coastal cities.
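The AHP weighting step used in studies like this one can be sketched with the common row geometric-mean approximation to the principal eigenvector. The pairwise comparison matrix below is a made-up three-criterion example, not the paper's indicator data:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise
    comparison matrix using the row geometric-mean method."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]   # row geometric means
    total = sum(gm)
    return [g / total for g in gm]                           # normalize to sum 1

# Illustrative judgments: criterion A is 3x as important as B and 5x as
# important as C; B is 2x as important as C. Off-diagonal entries are
# reciprocals, as AHP requires.
matrix = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = ahp_weights(matrix)   # sums to 1, ordered A > B > C
```

In a full AHP workflow one would also compute the consistency ratio of the judgment matrix before accepting the weights; that check is omitted here for brevity.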
Development of PARMA: PHITS-based analytical radiation model in the atmosphere.
Sato, Tatsuhiko; Yasuda, Hiroshi; Niita, Koji; Endo, Akira; Sihver, Lembit
2008-08-01
Estimation of cosmic-ray spectra in the atmosphere has been essential for the evaluation of aviation doses. We therefore calculated these spectra by performing Monte Carlo simulation of cosmic-ray propagation in the atmosphere using the PHITS code. The accuracy of the simulation was well verified by experimental data taken under various conditions, even near sea level. Based on a comprehensive analysis of the simulation results, we proposed an analytical model for estimating the cosmic-ray spectra of neutrons, protons, helium ions, muons, electrons, positrons and photons applicable to any location in the atmosphere at altitudes below 20 km. Our model, named PARMA, enables us to calculate the cosmic radiation doses rapidly with a precision equivalent to that of the Monte Carlo simulation, which requires much more computational time. With these properties, PARMA is capable of improving the accuracy and efficiency of the cosmic-ray exposure dose estimations not only for aircrews but also for the public on the ground.
NASA Astrophysics Data System (ADS)
Ayuso, David; Decleva, Piero; Patchkovskii, Serguei; Smirnova, Olga
2018-06-01
The generation of high-order harmonics in a medium of chiral molecules driven by intense bi-elliptical laser fields can lead to strong chiroptical response in a broad range of harmonic numbers and ellipticities (Ayuso et al 2018 J. Phys. B: At. Mol. Opt. Phys. 51 06LT01). Here we present a comprehensive analytical model that can describe the most relevant features arising in the high-order harmonic spectra of chiral molecules driven by strong bi-elliptical fields. Our model recovers the physical picture underlying chiral high-order harmonic generation (HHG) based on ultrafast chiral hole motion and identifies the rotationally invariant molecular pseudoscalars responsible for chiral dynamics. Using the chiral molecule propylene oxide as an example, we show that one can control and enhance the chiral response in bi-elliptical HHG by tailoring the driving field, in particular by tuning its frequency, intensity and ellipticity, exploiting a suppression mechanism of achiral background based on the linear Stark effect.
2D modeling based comprehensive analysis of short channel effects in DMG strained VSTB FET
NASA Astrophysics Data System (ADS)
Saha, Priyanka; Banerjee, Pritha; Sarkar, Subir Kumar
2018-06-01
This paper aims to develop a two-dimensional analytical model of the proposed dual-material (DM) Vertical Super Thin Body (VSTB) strained Field Effect Transistor (FET), with a focus on its short-channel behaviour in the nanometer regime. The electrostatic potential across the gate/channel and dielectric-wall/channel interfaces is derived by solving the 2D Poisson equation with the parabolic approximation method and applying appropriate boundary conditions. The threshold voltage is then calculated using the criterion of minimum surface potential, considering both the gate-side and dielectric-wall-side potentials. The performance of the present structure is demonstrated in terms of potential, electric field, threshold voltage characteristics and subthreshold behaviour by varying various device parameters and applied biases. The effect of applying strain in the channel is further explored to establish the superiority of the proposed device over its conventional VSTB FET counterpart. All analytical results are compared with Silvaco ATLAS device-simulation data to substantiate the accuracy of the derived model.
NASA Astrophysics Data System (ADS)
Bao, Cheng; Jiang, Zeyi; Zhang, Xinxin
2015-10-01
Fuel flexibility is a significant advantage of the solid oxide fuel cell (SOFC). A comprehensive macroscopic framework is proposed for synthesis-gas (syngas) fueled electrochemistry and transport in the SOFC anode, with two main novelties: analytical H2/CO electrochemical co-oxidation, and correction of gas species concentrations at the triple phase boundary considering competitive absorption and surface diffusion. Starting from an analytical approximation of the decoupled charge and mass transfer, we present analytical solutions for two defined variables, the hydrogen current fraction and the enhancement factor. By giving an explicit answer (rather than a case-by-case numerical calculation) to what percentage of the current output is contributed by H2 or CO and how great a role the water-gas shift reaction plays, this approach establishes for the first time an adaptive superposition mechanism of H2-fuel and CO-fuel electrochemistry for syngas fuel. Based on the diffusion equivalent circuit model, assuming series-connected resistances of surface diffusion and bulk diffusion, the model predicts well at high fuel utilization while keeping a fixed porosity/tortuosity ratio. The model has been validated against experimental polarization behavior over a wide range of operating conditions on a button cell for H2-H2O-CO-CO2-N2 fuel systems. The framework could help narrow the gap between macro-scale and meso-scale SOFC modeling.
Zou, Ping; Luo, Pei-Gao
2010-05-01
Chemistry courses are important basic courses, while genetics is one of the important major-specific courses in the curricula of many majors at agricultural institutes and universities. To establish the linkage between major courses and basic courses, how genetics teachers can draw on students' previously learned chemical knowledge to aid the understanding of genetics is worth discussing. In this paper, the authors advocate applying previously learned chemical knowledge to the understanding of genetics through an infiltrative teaching model, which can help students learn and understand genetic knowledge more deeply. Analyzing the intrinsic logical relationships among the knowledge of different courses and constructing an integrated knowledge network are useful for improving students' analytical, comprehensive, and logical abilities. In this way, a new teaching model can be explored to develop talent with new ideas and comprehensive competence in agricultural fields.
Cost-effectiveness modelling in diagnostic imaging: a stepwise approach.
Sailer, Anna M; van Zwam, Wim H; Wildberger, Joachim E; Grutters, Janneke P C
2015-12-01
Diagnostic imaging (DI) is the fastest growing sector in medical expenditures and takes a central role in medical decision-making. The increasing number of various and new imaging technologies induces a growing demand for cost-effectiveness analysis (CEA) in imaging technology assessment. In this article we provide a comprehensive framework of direct and indirect effects that should be considered for CEA in DI, suitable for all imaging modalities. We describe and explain the methodology of decision analytic modelling in six steps aiming to transfer theory of CEA to clinical research by demonstrating key principles of CEA in a practical approach. We thereby provide radiologists with an introduction to the tools necessary to perform and interpret CEA as part of their research and clinical practice. • DI influences medical decision making, affecting both costs and health outcome. • This article provides a comprehensive framework for CEA in DI. • A six-step methodology for conducting and interpreting cost-effectiveness modelling is proposed.
Hanser, S.E.; Leu, M.; Knick, S.T.; Aldridge, Cameron L.
2011-01-01
The Wyoming Basins are one of the remaining strongholds of the sagebrush ecosystem. However, like most sagebrush habitats, threats to this region are numerous. This book adds to current knowledge about the regional status of the sagebrush ecosystem, the distribution of habitats, the threats to the ecosystem, and the influence of threats and habitat conditions on the occurrence and abundance of sagebrush-associated fauna and flora in the Wyoming Basins. Comprehensive methods are outlined for use in data collection and monitoring of wildlife and plant populations. Field and spatial data are integrated into a spatially explicit analytical framework to develop models of species occurrence and abundance for the region. This book provides significant new information on distributions, abundances, and habitat relationships for a number of species of conservation concern that depend on sagebrush in the region. The tools and models presented in this book increase our understanding of impacts from land uses and can contribute to the development of comprehensive management and conservation strategies.
NASA Technical Reports Server (NTRS)
Wilbur, Matthew L.; Yeager, William T., Jr.; Sekula, Martin K.
2002-01-01
The vibration reduction capabilities of a model rotor system utilizing controlled, strain-induced blade twisting are examined. The model rotor blades, which utilize piezoelectric active fiber composite actuators, were tested in the NASA Langley Transonic Dynamics Tunnel using open-loop control to determine the effect of active-twist on rotor vibratory loads. The results of this testing have been encouraging, and have demonstrated that active-twist rotor designs offer the potential for significant load reductions in future helicopter rotor systems. Active twist control was found to use less than 1% of the power necessary to operate the rotor system and had a pronounced effect on both rotating- and fixed-system loads, offering reductions in individual harmonic loads of up to 100%. A review of the vibration reduction results obtained is presented, which includes a limited set of comparisons with results generated using the second-generation version of the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD II) rotorcraft comprehensive analysis.
Newspaper Reading among College Students in Development of Their Analytical Ability
ERIC Educational Resources Information Center
Kumar, Dinesh
2009-01-01
The study investigated the newspaper reading among college students in development of their analytical ability. Newspapers are one of the few sources of information that are comprehensive, interconnected and offered in one format. The main objective of the study was to find out the development of the analytical ability among college students by…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutmacher, R.; Crawford, R.
This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.
NASA Technical Reports Server (NTRS)
Boyd, D. Douglas, Jr.; Burley, Casey L.; Conner, David A.
2005-01-01
The Comprehensive Analytical Rotorcraft Model for Acoustics (CARMA) is being developed under the Quiet Aircraft Technology Project within the NASA Vehicle Systems Program. The purpose of CARMA is to provide analysis tools for the design and evaluation of efficient low-noise rotorcraft, as well as support the development of safe, low-noise flight operations. The baseline prediction system of CARMA is presented and current capabilities are illustrated for a model rotor in a wind tunnel, a rotorcraft in flight and for a notional coaxial rotor configuration; however, a complete validation of the CARMA system capabilities with respect to a variety of measured databases is beyond the scope of this work. For the model rotor illustration, predicted rotor airloads and acoustics for a BO-105 model rotor are compared to test data from HART-II. For the flight illustration, acoustic data from an MD-520N helicopter flight test, which was conducted at Eglin Air Force Base in September 2003, are compared with CARMA full vehicle flight predictions. Predicted acoustic metrics at three microphone locations are compared for limited level flight and descent conditions. Initial acoustic predictions using CARMA for a notional coaxial rotor system are made. The effect of increasing the vertical separation between the rotors on the predicted airloads and acoustic results is shown for both aerodynamically non-interacting and aerodynamically interacting rotors. The sensitivity of including the aerodynamic interaction effects of each rotor on the other, especially when the rotors are in close proximity to one another, is initially examined. The predicted coaxial rotor noise is compared to that of a conventional single rotor system of equal thrust, where both are of reasonable size for an unmanned aerial vehicle (UAV).
Fasoli, Diego; Cattani, Anna; Panzeri, Stefano
2018-05-01
Despite their biological plausibility, neural network models with asymmetric weights are rarely solved analytically, and closed-form solutions are available only in some limiting cases or in some mean-field approximations. We found exact analytical solutions of an asymmetric spin model of neural networks with arbitrary size without resorting to any approximation, and we comprehensively studied its dynamical and statistical properties. The network had discrete time evolution equations and binary firing rates, and it could be driven by noise with any distribution. We found analytical expressions of the conditional and stationary joint probability distributions of the membrane potentials and the firing rates. By manipulating the conditional probability distribution of the firing rates, we extend to stochastic networks the associative learning rule previously introduced by Personnaz and coworkers. The new learning rule allowed the safe storage, in the presence of noise, of point and cyclic attractors, with useful implications for content-addressable memories. Furthermore, we studied the bifurcation structure of the network dynamics in the zero-noise limit. We analytically derived examples of the codimension 1 and codimension 2 bifurcation diagrams of the network, which describe how the neuronal dynamics changes with the external stimuli. This showed that the network may undergo transitions among multistable regimes, oscillatory behavior elicited by asymmetric synaptic connections, and various forms of spontaneous symmetry breaking. We also calculated analytically groupwise correlations of neural activity in the network in the stationary regime. This revealed neuronal regimes where, statistically, the membrane potentials and the firing rates are either synchronous or asynchronous. Our results are valid for networks with any number of neurons, although our equations can be realistically solved only for small networks.
For completeness, we also derived the network equations in the thermodynamic limit of infinite network size and we analytically studied their local bifurcations. All the analytical results were extensively validated by numerical simulations.
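A zero-noise, finite-size instance of such discrete-time binary dynamics with asymmetric weights can be sketched as follows. The weight matrix and thresholds are arbitrary illustrations, not the paper's model; the sketch only shows how a finite state space forces every trajectory into a fixed point or limit cycle:

```python
def step(state, W, theta):
    """One synchronous update of a binary network: neuron i fires (1) if
    its weighted input exceeds threshold theta[i]. W may be asymmetric."""
    n = len(state)
    return tuple(
        1 if sum(W[i][j] * state[j] for j in range(n)) > theta[i] else 0
        for i in range(n)
    )

# Small asymmetric network (weights chosen purely for illustration):
W = [[0.0, 1.0, -1.0],
     [1.0, 0.0, 1.0],
     [-1.0, 1.0, 0.0]]
theta = [0.5, 0.5, 0.5]

# Iterate the zero-noise dynamics until a state repeats; with 2^n states,
# the trajectory must eventually close on a fixed point or a cycle.
state = (1, 0, 1)
seen = {}
t = 0
while state not in seen:
    seen[state] = t
    state = step(state, W, theta)
    t += 1
cycle_length = t - seen[state]   # 1 => fixed point, >1 => limit cycle
```

For this particular weight matrix the trajectory alternates between (1, 0, 1) and (0, 1, 0), a limit cycle of length 2, the kind of cyclic attractor the extended learning rule in the abstract is designed to store safely.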
Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling
2018-04-01
Establishing a comprehensive clinical evaluation system is critical in enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to help decision makers, was recently proposed; therefore, a systematic and objective method is required to guide the process. An evidence-based multi-criteria decision analysis model that involved an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weight based on the AHP; (4) construct the evidence body for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). The 16 ICSs under review were successfully scored and evaluated with the drug analysis model. Budesonide suspension for inhalation (drug ID number: 7) ranked the highest, with a comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the ranking of the top five and lowest five drugs remained unchanged, suggesting the model is generally robust. An evidence-based drug evaluation model based on AHP was successfully developed.
The model incorporates sufficient utility and flexibility for aiding the decision-making process, and can be a useful tool for the CCES-P.
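Step (3) of the nine-step model, determining criterion weights with the AHP, can be sketched as follows. This is a minimal illustration using the common geometric-mean approximation of the AHP principal eigenvector; the 3x3 pairwise comparison matrix (efficacy, safety, cost) is invented for illustration and is not CCES-P data.

```python
from math import prod

def ahp_weights(pairwise):
    """Geometric-mean approximation of the AHP principal-eigenvector weights."""
    n = len(pairwise)
    geo = [prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(geo)
    return [g / total for g in geo]                     # normalize to sum 1

# Hypothetical 3-criterion pairwise comparison matrix (Saaty 1-9 scale);
# rows/columns are (efficacy, safety, cost).
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_weights(matrix)  # sums to 1; efficacy gets the largest weight
```

The geometric-mean method is a standard closed-form stand-in for the full eigenvector computation and agrees with it exactly for consistent matrices.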
Analytical and Experimental Vibration Analysis of a Faulty Gear System.
1994-10-01
The Wigner-Ville distribution (WVD) was used to give a comprehensive comparison of the predicted and experimental results. The WVD method applied to the experimental results was also compared to other fault detection techniques to verify the WVD's ability to … of the damaged test gear and the predicted vibration from the model with simulated gear tooth pitting damage. Results also verified that the WVD method can successfully detect and locate gear tooth wear and pitting damage.
1983-12-01
while at the same time improving its operational efficiency. Through their integration and use, System Program Managers have a comprehensive analytical … systems. The NRLA program is hosted on the CREATE Operating System and contains approximately 5,500 lines of computer code. It consists of a main … associated with alternative maintenance plans. As the technological complexity of weapons systems has increased, new and innovative logistical support
A ricin forensic profiling approach based on a complex set of biomarkers.
Fredriksson, Sten-Åke; Wunschel, David S; Lindström, Susanne Wiklund; Nilsson, Calle; Wahl, Karen; Åstot, Crister
2018-08-15
A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids, and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4), ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data were collected using a range of analytical methods, and robust orthogonal partial least squares-discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By use of a decision tree and two OPLS-DA models, the sample preparation methods of the test set samples were determined. The model statistics of the two models were good, and a 100% rate of correct predictions on the test set was achieved.
Network model for thermal conductivities of unidirectional fiber-reinforced composites
NASA Astrophysics Data System (ADS)
Wang, Yang; Peng, Chaoyi; Zhang, Weihua
2014-12-01
An empirical network model has been developed to predict the in-plane thermal conductivities along arbitrary directions for unidirectional fiber-reinforced composite laminae. Measurements of thermal conductivities along different orientations were carried out. Good agreement was observed between values predicted by the network model and the experimental data; compared with the established analytical models, the newly proposed network model gave values with higher precision. This network model therefore supports a wider and more comprehensive understanding of the heat transmission characteristics of fiber-reinforced composites and can serve as guidance in designing and fabricating laminated composites with specific directional or locational thermal conductivities for structures that simultaneously perform mechanical and thermal functions, i.e. multifunctional structures (MFS).
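For context, one classical analytical baseline that such network models are compared against can be sketched with the standard second-rank tensor rotation for a unidirectional lamina. The conductivity values below are illustrative assumptions, not the paper's measurements, and this is not the network model itself.

```python
import math

def k_inplane(k_long, k_trans, theta_deg):
    """Classical tensor-rotation estimate of in-plane conductivity at an
    angle theta (degrees) from the fiber axis of a unidirectional lamina."""
    t = math.radians(theta_deg)
    return k_long * math.cos(t) ** 2 + k_trans * math.sin(t) ** 2

# Assumed along-fiber and transverse conductivities, W/(m K):
k0, k90 = 5.0, 0.8
k_0deg = k_inplane(k0, k90, 0)    # recovers k0 along the fiber axis
k_45deg = k_inplane(k0, k90, 45)  # midway value under this rule
```

This rotation rule follows directly from treating conductivity as a second-rank tensor with principal axes along and across the fibers.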
Curriculum Mapping with Academic Analytics in Medical and Healthcare Education.
Komenda, Martin; Víta, Martin; Vaitsis, Christos; Schwarz, Daniel; Pokorná, Andrea; Zary, Nabil; Dušek, Ladislav
2015-01-01
No universal solution, based on an approved pedagogical approach, exists to parametrically describe, effectively manage, and clearly visualize a higher education institution's curriculum, including tools for unveiling relationships inside curricular datasets. We aim to solve the issue of medical curriculum mapping to improve understanding of the complex structure and content of medical education programs. Our effort is based on the long-term development and implementation of an original web-based platform, which supports an outcomes-based approach to medical and healthcare education and is suitable for repeated updates and adaptation to curriculum innovations. We adopted data exploration and visualization approaches in the context of medical curriculum innovations in the higher education institutions domain. We have developed a robust platform, covering detailed formal metadata specifications down to the level of learning units, interconnections, and learning outcomes, in accordance with Bloom's taxonomy and direct links to a particular biomedical nomenclature. Furthermore, we used selected modeling techniques and data mining methods to generate academic analytics reports from medical curriculum mapping datasets. We present a solution that allows users to effectively optimize a curriculum structure that is described with appropriate metadata, such as course attributes, learning units and outcomes, a standardized vocabulary nomenclature, and a tree structure of essential terms. We present a case study implementation that includes effective support for curriculum reengineering efforts of academics through a comprehensive overview of the General Medicine study program.
Moreover, we introduce deep content analysis of a dataset that was captured with the use of the curriculum mapping platform; this may assist in detecting any potentially problematic areas, and hence it may help to construct a comprehensive overview for the subsequent global in-depth medical curriculum inspection. We have proposed, developed, and implemented an original framework for medical and healthcare curriculum innovations and harmonization, including: planning model, mapping model, and selected academic analytics extracted with the use of data mining.
NASA Astrophysics Data System (ADS)
Ferhati, H.; Djeffal, F.
2017-12-01
In this paper, a new MSM UV photodetector (PD) based on a dual wide-band-gap material (DM) engineering approach is proposed to achieve a high-performance self-powered device. Comprehensive analytical models for the proposed sensor's photocurrent and device properties are developed, incorporating the impact of the DM design on the device's photoelectrical behavior. The obtained results are validated against numerical data from commercial TCAD software. Our investigation demonstrates that the adopted design amendment modulates the electric field in the device, which makes it possible to drive appropriate photo-generated carriers without an externally applied voltage. This achieves the dual role of effective carrier separation and an efficient reduction of the dark current. Moreover, a new hybrid approach based on analytical modeling and particle swarm optimization (PSO) is proposed to achieve improved photoelectric behavior at zero bias, ensuring a favorable self-powered MSM-based UV PD. It is found that the proposed design methodology succeeds in identifying an optimized design that offers a self-powered device with high responsivity (98 mA/W) and a superior ION/IOFF ratio (480 dB). These results make the optimized MSM-UV-DM-PD suitable for providing low-cost self-powered devices for high-performance optical communication and monitoring applications.
NASA Technical Reports Server (NTRS)
Cantrell, John H., Jr.; Cantrell, Sean A.
2008-01-01
A comprehensive analytical model of the interaction of the cantilever tip of the atomic force microscope (AFM) with the sample surface is developed that accounts for the nonlinearity of the tip-surface interaction force. The interaction is modeled as a nonlinear spring coupled at opposite ends to linear springs representing cantilever and sample surface oscillators. The model leads to a pair of coupled nonlinear differential equations that are solved analytically using a standard iteration procedure. Solutions are obtained for the phase and amplitude signals generated by various acoustic-atomic force microscope (A-AFM) techniques including force modulation microscopy, atomic force acoustic microscopy, ultrasonic force microscopy, heterodyne force microscopy, resonant difference-frequency atomic force ultrasonic microscopy (RDF-AFUM), and the commonly used intermittent contact mode (TappingMode) generally available on AFMs. The solutions are used to obtain a quantitative measure of image contrast resulting from variations in the Young modulus of the sample for the amplitude and phase images generated by the A-AFM techniques. Application of the model to RDF-AFUM and intermittent soft contact phase images of LaRC-cp2 polyimide polymer is discussed. The model predicts variations in the Young modulus of the material of 24 percent from the RDF-AFUM image and 18 percent from the intermittent soft contact image. Both predictions are in good agreement with the literature value of 21 percent obtained from independent, macroscopic measurements of sheet polymer material.
A Concept Analysis of Holistic Care by Hybrid Model
Jasemi, Madineh; Valizadeh, Leila; Zamanzadeh, Vahid; Keogh, Brian
2017-01-01
Purpose: Even though holistic care has been widely discussed in the health care and professional nursing literature, there is no comprehensive definition of it. Therefore, the aim of this article is to present a concept analysis of holistic care which was developed using the hybrid model. Methods: The hybrid model comprises three phases. In the theoretical phase, characteristics of holistic care were identified through a review of the literature from CINAHL, MEDLINE, PubMed, OVID, and Google Scholar databases. During the fieldwork phase, in-depth interviews were conducted with eight nurses who were purposely selected. Finally, following an analysis of the literature and the qualitative interviews, a theoretical description of the concept of holistic care was extracted. Results: Two main themes were extracted in the analytical phase: “Holistic care for offering a comprehensive model for caring” and “holistic care for improving patients' and nurses' conditions.” Conclusion: By undertaking a conceptual analysis of holistic care, its meaning can be clarified, which will encourage nursing educators to include holistic care in nursing syllabi and consequently facilitate its provision in practice. PMID:28216867
Localized mRNA translation and protein association
NASA Astrophysics Data System (ADS)
Zhdanov, Vladimir P.
2014-08-01
Recent direct observations of localization of mRNAs and proteins both in prokaryotic and eukaryotic cells can be related to slowdown of diffusion of these species due to macromolecular crowding and their ability to aggregate and form immobile or slowly mobile complexes. Here, a generic kinetic model describing both these factors is presented and comprehensively analyzed. Although the model is non-linear, an accurate self-consistent analytical solution of the corresponding reaction-diffusion equation has been constructed, the types of localized protein distributions have been explicitly shown, and the predicted kinetic regimes of gene expression have been classified.
An analytical model of a curved beam with a T shaped cross section
NASA Astrophysics Data System (ADS)
Hull, Andrew J.; Perez, Daniel; Cox, Donald L.
2018-03-01
This paper derives a comprehensive analytical dynamic model of a closed circular beam that has a T shaped cross section. The new model includes in-plane and out-of-plane vibrations derived using continuous media expressions which produces results that have a valid frequency range above those available from traditional lumped parameter models. The web is modeled using two-dimensional elasticity equations for in-plane motion and the classical flexural plate equation for out-of-plane motion. The flange is modeled using two sets of Donnell shell equations: one for the left side of the flange and one for the right side of the flange. The governing differential equations are solved with unknown wave propagation coefficients multiplied by spatial domain and time domain functions which are inserted into equilibrium and continuity equations at the intersection of the web and flange and into boundary conditions at the edges of the system resulting in 24 algebraic equations. These equations are solved to yield the wave propagation coefficients and this produces a solution to the displacement field in all three dimensions. An example problem is formulated and compared to results from finite element analysis.
ERIC Educational Resources Information Center
Ghoneim, Nahed Mohamed Mahmoud
2013-01-01
The current study focused on the problems which students encounter while listening to the English language, the mental processes they activate in listening comprehension, and the strategies they use in different phases of comprehension. Also, it aimed to find out whether there were any differences between advanced and intermediate students in…
NASA Astrophysics Data System (ADS)
Monfared, Vahid
2016-12-01
An analytically based model is presented for behavioral analysis of the plastic deformations in reinforced materials using circular (trigonometric) functions. The analytical method is proposed to predict the creep behavior of fibrous composites based on basic and constitutive equations under a tensile axial stress. The new insight of this work is the prediction of some important behaviors of the creeping matrix. In the present model, the prediction of these behaviors is simpler than with the available methods. Principal creep strain rate behavior is very noteworthy for designing fibrous composites under creep. Analysis of this parameter's behavior in reinforced materials is necessary for failure, fracture, and fatigue studies in the creep of short fiber composites. Shuttles, spaceships, turbine blades and discs, and nozzle guide vanes are commonly subjected to creep effects. Predicting creep behavior is also significant for designing optoelectronic and photonic advanced composites with optical fibers. As a result, uniform behavior with a constant gradient is seen in the principal creep strain rate, and creep rupture may happen at the fiber end. Finally, good agreement is found when comparing the obtained analytical and FEM results.
APPLICATION OF THE MASTER ANALYTICAL SCHEME TO POLAR ORGANICS IN DRINKING WATER
EPA's Master Analytical Scheme (MAS) for Organic Compounds in Water provides for comprehensive qualitative-quantitative analysis of gas chromatographable organics in many types of water. The paper emphasizes the analysis of polar and ionic organics, the more water soluble compoun...
1980-06-01
sufficient. Dropping the time-lag terms, the equations for Xu, Xx', and X reduce to linear algebraic equations. Hence in the quasistatic case the … quasistatic variables now are not described by differential equations but rather by linear algebraic equations. The solution for x0 then is simply … Contents excerpt: matrices for two-bladed rotor; 7. Linear System Analysis; 7.1 State Variable Form; 7.2 Constant Coefficient System; 7.2.1 Eigen-analysis.
NASA Astrophysics Data System (ADS)
Moon, Joon-Young; Kim, Junhyeok; Ko, Tae-Wook; Kim, Minkyung; Iturria-Medina, Yasser; Choi, Jee-Hyun; Lee, Joseph; Mashour, George A.; Lee, Uncheol
2017-04-01
Identifying how spatially distributed information becomes integrated in the brain is essential to understanding higher cognitive functions. Previous computational and empirical studies suggest a significant influence of brain network structure on brain network function. However, there have been few analytical approaches to explain the role of network structure in shaping regional activities and directionality patterns. In this study, analytical methods are applied to a coupled oscillator model implemented in inhomogeneous networks. We first derive a mathematical principle that explains the emergence of directionality from the underlying brain network structure. We then apply the analytical methods to the anatomical brain networks of human, macaque, and mouse, successfully predicting simulation and empirical electroencephalographic data. The results demonstrate that the global directionality patterns in resting state brain networks can be predicted solely by their unique network structures. This study forms a foundation for a more comprehensive understanding of how neural information is directed and integrated in complex brain networks.
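A minimal sketch of the class of model the abstract analyzes, a Kuramoto-type coupled-oscillator system on an inhomogeneous (star) network, is shown below. The topology, coupling strength, and natural frequencies are illustrative assumptions, not the anatomical brain networks of the study.

```python
import math, cmath

def simulate(adj, omega, K=0.5, dt=0.01, steps=2000):
    """Euler-integrate Kuramoto phase oscillators coupled through adj."""
    n = len(omega)
    theta = [0.1 * i for i in range(n)]  # deterministic initial phases
    for _ in range(steps):
        dtheta = [
            omega[i] + K * sum(adj[i][j] * math.sin(theta[j] - theta[i])
                               for j in range(n))
            for i in range(n)
        ]
        theta = [t + dt * d for t, d in zip(theta, dtheta)]
    return theta

# Inhomogeneous star network: node 0 is the hub, nodes 1-3 are leaves.
adj = [[0, 1, 1, 1], [1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]]
omega = [1.0, 1.0, 1.0, 1.0]  # identical natural frequencies
theta = simulate(adj, omega)

# Kuramoto order parameter: 1 means full phase synchrony.
r = abs(sum(cmath.exp(1j * t) for t in theta)) / len(theta)
```

With identical frequencies and sufficient coupling the oscillators synchronize; studies like the one above analyze how degree inhomogeneity (hub vs. leaf) shapes phase lead/lag, and hence directionality, in such networks.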
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik
2017-05-08
Scientists often use specific data analysis and presentation methods familiar within their domain. But does high familiarity drive better analytical judgment? This question is especially relevant when familiar methods themselves can have shortcomings: many visualizations used conventionally for scientific data analysis and presentation do not follow established best practices. This necessitates new methods that might be unfamiliar yet prove to be more effective. But there is little empirical understanding of the relationships between scientists' subjective impressions about familiar and unfamiliar visualizations and objective measures of their visual analytic judgments. To address this gap and to study these factors, we focus on visualizations used for comparison of climate model performance. We report on a comprehensive survey-based user study with 47 climate scientists and present an analysis of: i) relationships among scientists' familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.
NASA Astrophysics Data System (ADS)
Setty, Srinivas J.; Cefola, Paul J.; Montenbruck, Oliver; Fiedler, Hauke
2016-05-01
Catalog maintenance for Space Situational Awareness (SSA) demands an accurate and computationally lean orbit propagation and orbit determination technique to cope with the ever increasing number of observed space objects. As an alternative to established numerical and analytical methods, we investigate the accuracy and computational load of the Draper Semi-analytical Satellite Theory (DSST). The standalone version of the DSST was enhanced with additional perturbation models to improve its recovery of short periodic motion. The accuracy of DSST is, for the first time, compared to a numerical propagator with fidelity force models for a comprehensive grid of low, medium, and high altitude orbits with varying eccentricity and different inclinations. Furthermore, the run-time of both propagators is compared as a function of propagation arc, output step size and gravity field order to assess its performance for a full range of relevant use cases. For use in orbit determination, a robust performance of DSST is demonstrated even in the case of sparse observations, which is most sensitive to mismodeled short periodic perturbations. Overall, DSST is shown to exhibit adequate accuracy at favorable computational speed for the full set of orbits that need to be considered in space surveillance. Along with the inherent benefits of a semi-analytical orbit representation, DSST provides an attractive alternative to the more common numerical orbit propagation techniques.
Thinking Graphically: Connecting Vision and Cognition during Graph Comprehension
ERIC Educational Resources Information Center
Ratwani, Raj M.; Trafton, J. Gregory; Boehm-Davis, Deborah A.
2008-01-01
Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive…
Policy Model of Sustainable Infrastructure Development (Case Study : Bandarlampung City, Indonesia)
NASA Astrophysics Data System (ADS)
Persada, C.; Sitorus, S. R. P.; Marimin; Djakapermana, R. D.
2018-03-01
Infrastructure development affects not only the economic aspect but also the social and environmental aspects, which are the main dimensions of sustainable development. The many aspects and actors involved in urban infrastructure development require a comprehensive and integrated policy towards sustainability. Therefore, it is necessary to formulate an infrastructure development policy that considers the various dimensions of sustainable development. The main objective of this research is to formulate a policy for sustainable infrastructure development. In this research, urban infrastructure covers transportation, water systems (drinking water, storm water, wastewater), green open spaces, and solid waste. This research was conducted in Bandarlampung City. This study uses comprehensive modeling, namely Multi-Dimensional Scaling (MDS) with Rapid Appraisal of Infrastructure (Rapinfra), the Analytic Network Process (ANP), and a system dynamics model. The findings of the MDS analysis showed that the status of Bandarlampung City's infrastructure sustainability is less than sustainable. The ANP analysis produced the 8 main indicators most influential in the development of sustainable infrastructure. The system dynamics model offered 4 scenarios for a sustainable urban infrastructure policy model. The best scenario was implemented in 3 policies: integrated infrastructure management, population control, and local economic development.
There have been a number of revolutionary developments during the past decade that have led to a much more comprehensive understanding of per- and polyfluoroalkyl substances (PFASs) in the environment. Improvements in analytical instrumentation have made liquid chromatography tri...
Analytic institutes: A guide to training in the United States
NASA Astrophysics Data System (ADS)
Blanken, Terry G.
This investigation was inspired by the researcher's desire to pursue psychoanalytic training subsequent to completion of her PhD in clinical psychology and the discovery that no comprehensive resource existed to assist prospective psychoanalytic candidates with identifying or evaluating psychoanalytic training opportunities. This dissertation therefore aspires to provide a comprehensive guide to analytic training in the United States today. The researcher presents the expanding horizons of depth-oriented training leading to certification as an analyst, including training based on those schools of thought that resulted from early splits with Freud (Adlerian and Jungian) as well as training based on thought that has remained within the Freudian theoretical umbrella (e.g., classical, object relations, self psychology, etc.). Employing a heuristic approach and using hermeneutics and systems theory methodologies, the study situates analytic training in its historical context, explores contemporary issues, and considers its future. The study reviews the various analytic schools of thought and traces the history of psychoanalytic theory from its origins with Freud through its many permutations. It then discusses the history of psychoanalytic training and describes political, social, and economic factors influencing the development of training in this country. The centerpiece of the dissertation is a guidebook offering detailed information on each of 107 training institutes in the United States. Tables provide contact data and information which differentiate the institutes in terms of such parameters as size, length of program, theoretical orientation, and accreditation. A narrative of each institute summarizes the unique aspects of the program, including its admissions policy, the requirements for the training analysis and supervised clinical work, and the didactic curriculum, along with lists of courses offered.
Child and adolescent psychoanalytic training is also discussed for institutes offering this option. A discussion of the contemporary world of analytic training emerges from the results of the analysis of individual institutes. Both the variations and convergences among institutes are explored. Current problems and issues in training, accreditation, and licensing are addressed. Finally, the future of psychoanalytic training is considered; concluding with an assessment of needed reforms and presentation of a model for the ideal analytic training institute of the future.
NASA Astrophysics Data System (ADS)
Dolman, A. M.; Laepple, T.; Kunz, T.
2017-12-01
Understanding the uncertainties associated with proxy-based reconstructions of past climate is critical if they are to be used to validate climate models and contribute to a comprehensive understanding of the climate system. Here we present two related and complementary approaches to quantifying proxy uncertainty. The proxy forward model (PFM) "sedproxy" (bitbucket.org/ecus/sedproxy) numerically simulates the creation, archiving and observation of marine sediment archived proxies such as Mg/Ca in foraminiferal shells and the alkenone unsaturation index UK'37. It includes the effects of bioturbation, bias due to seasonality in the rate of proxy creation, aliasing of the seasonal temperature cycle into lower frequencies, and error due to cleaning, processing and measurement of samples. Numerical PFMs have the advantage of being very flexible, allowing many processes to be modelled and assessed for their importance. However, as more and more proxy-climate data become available, their use in advanced data products necessitates rapid estimates of uncertainties for both the raw reconstructions, and their smoothed/derived products, where individual measurements have been aggregated to coarser time scales or time-slices. To address this, we derive closed-form expressions for power spectral density of the various error sources. The power spectra describe both the magnitude and autocorrelation structure of the error, allowing timescale dependent proxy uncertainty to be estimated from a small number of parameters describing the nature of the proxy, and some simple assumptions about the variance of the true climate signal. We demonstrate and compare both approaches for time-series of the last millennia, Holocene, and the deglaciation. 
While the numerical forward model can create pseudoproxy records driven by climate model simulations, the analytical model of proxy error allows for a comprehensive exploration of parameter space and mapping of climate signal reconstructability, conditional on the climate and sampling conditions.
NASA Astrophysics Data System (ADS)
Gao, Chen; Ding, Zhongan; Deng, Bofa; Yan, Shengteng
2017-10-01
According to the characteristics of the electric energy data acquisition system (EEDAS), and considering the availability of each index's data and the connections among the indexes, a performance evaluation index system for the electric energy data acquisition system is established covering three aspects: the master station system, the communication channel, and the terminal equipment. The comprehensive weight of each index is determined by a triangular fuzzy number analytic hierarchy process combined with the entropy weight method, so that both subjective preference and objective attributes are taken into consideration, making the comprehensive performance evaluation more reasonable and reliable. Example analysis shows that establishing a comprehensive index evaluation system by combining the analytic hierarchy process (AHP) and triangular fuzzy numbers (TFN) with the entropy method yields evaluation results that are not only convenient and practical but also more objective and accurate.
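The entropy weight half of the combined weighting scheme can be sketched as follows. The 3x2 index matrix (3 systems, 2 indexes) is invented for illustration, not EEDAS data; a column whose values are spread out carries more information and so receives more weight, while a uniform column receives none.

```python
import math

def entropy_weights(X):
    """Entropy weight method: weight each index (column) by its information
    divergence 1 - e_j, where e_j is the normalized Shannon entropy."""
    m, n = len(X), len(X[0])
    divergence = []
    for j in range(n):
        col = [X[i][j] for i in range(m)]
        s = sum(col)
        p = [v / s for v in col]                       # column proportions
        e = -sum(v * math.log(v) for v in p if v > 0) / math.log(m)
        divergence.append(1.0 - e)
    total = sum(divergence)
    return [d / total for d in divergence]

# Hypothetical positive index matrix: index 2 is uniform across systems,
# so it carries no discriminating information.
X = [[0.9, 0.5],
     [0.8, 0.5],
     [0.1, 0.5]]
w = entropy_weights(X)  # nearly all weight goes to index 1
```

In the combined scheme described by the abstract, these objective entropy weights would then be blended with the subjective TFN-AHP weights.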
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitroshkov, V.; Ryan, J. V.
2016-04-07
Multicollector ICP-MS was used to comprehensively analyze different types of isotopically modified glass created in order to investigate the processes of glass corrosion in water. The analytical methods were developed for the analyses of synthesized, isotopically modified solid glass and the release of glass constituents upon contact with deionized water. To validate the methods, results from an acid digestion sample of the Analytical Reference Glass (ARG) showed good agreement when compared to data from multiple prior analyses on the same glass [Smith-1]. In this paper, we present the results of this comprehensive analysis from the acid digestion of six types of isotopically modified glass and the release of glass constituents into water after one year of aqueous corrosion.
Instructional Implications of Inquiry in Reading Comprehension.
ERIC Educational Resources Information Center
Snow, David
A contract deliverable on the NIE Communication Skills Project, this report consists of three separate documents describing the instructional implications of the analytic and empirical work carried out for the "Classroom Instruction in Reading Comprehension" part of the project: (1) Guidelines for Phrasal Segmentation; (2) Parsing Tasks…
Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate
NASA Technical Reports Server (NTRS)
Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.
2013-01-01
There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied by a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations, and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
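The core idea, an uncertainty interval computed from a small sample of perturbed-input CFD runs using the Student-t distribution, can be sketched as follows. The heat-transfer-coefficient samples and the hardcoded 95% critical value (t = 2.776 for 4 degrees of freedom) are illustrative assumptions, not the paper's results.

```python
import math, statistics

def t_interval(samples, t_crit):
    """Mean +/- t * s/sqrt(n) confidence interval for a small sample of
    CFD outputs, using the Student-t distribution for small n."""
    n = len(samples)
    mean = statistics.mean(samples)
    sem = statistics.stdev(samples) / math.sqrt(n)  # standard error of mean
    return mean - t_crit * sem, mean + t_crit * sem

# Invented heat transfer coefficients, W/(m^2 K), from five CFD runs with
# inputs perturbed within their tolerance/bias bounds:
h = [102.0, 98.5, 101.2, 99.8, 100.5]
lo_, hi_ = t_interval(h, t_crit=2.776)  # 95% level, dof = n - 1 = 4
```

Ranking the half-widths obtained from varying each input one at a time would then give the order-of-importance analysis the abstract describes.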
NASA Astrophysics Data System (ADS)
Meng, Fanchao; Chen, Cheng; Hu, Dianyin; Song, Jun
2017-12-01
Combining atomistic simulations and continuum modeling, a comprehensive study of the out-of-plane compressive deformation behaviors of equilateral three-dimensional (3D) graphene honeycombs was performed. It was demonstrated that under out-of-plane compression, the honeycomb exhibits two critical deformation events, i.e., elastic mechanical instability (including elastic buckling and structural transformation) and inelastic structural collapse. The above events were shown to be strongly dependent on the honeycomb cell size and affected by the local atomic bonding at the cell junction. By treating the 3D graphene honeycomb as a continuum cellular solid, and accounting for the structural heterogeneity and constraint at the junction, a set of analytical models were developed to accurately predict the threshold stresses corresponding to the onset of those deformation events. The present study elucidates key structure-property relationships of 3D graphene honeycombs under out-of-plane compression, and provides a comprehensive theoretical framework to predictively analyze their deformation responses, and more generally, offers critical new knowledge for the rational bottom-up design of 3D networks of two-dimensional nanomaterials.
[Multi-mathematical modelings for compatibility optimization of Jiangzhi granules].
Yang, Ming; Zhang, Li; Ge, Yingli; Lu, Yanliu; Ji, Guang
2011-12-01
To investigate a method of multi-index activity evaluation and combination optimization for multi-component Chinese herbal formulas. Following a scheme of uniform experimental design, efficacy experiments, multi-index evaluation, least absolute shrinkage and selection operator (LASSO) modeling, evolutionary optimization, and validation experiments, we optimized the combination of Jiangzhi granules based on the activity indexes of blood serum ALT, AST, TG, TC, HDL and LDL, the TG level of liver tissue, and the liver-to-body weight ratio. The analytic hierarchy process (AHP) combined with criteria importance through intercriteria correlation (CRITIC) for multi-index activity evaluation was more reasonable and objective, reflecting both the rank information of the activity indexes and the objective sample data. LASSO modeling could accurately reflect the relationship between different combinations of Jiangzhi granules and the comprehensive activity index. The optimized combination of Jiangzhi granules showed better values of the comprehensive activity index than the original formula in the validation experiment. AHP combined with CRITIC can be used for multi-index activity evaluation, and the LASSO algorithm is suitable for combination optimization of Chinese herbal formulas.
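The LASSO step in such a workflow relates a dose-combination design matrix to a comprehensive activity index and selects the components that actually matter. A self-contained sketch using plain coordinate descent is below; the design matrix, coefficients, and noise level are all synthetic stand-ins, not the paper's data.

```python
import numpy as np

def soft_threshold(z, g):
    """Soft-thresholding operator used in LASSO coordinate descent."""
    return np.sign(z) * max(abs(z) - g, 0.0)

def lasso_cd(X, y, alpha, n_iter=200):
    """Plain coordinate-descent LASSO (no intercept, for illustration)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with feature j's contribution removed
            r = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r / n
            beta[j] = soft_threshold(z, alpha) / (X[:, j] @ X[:, j] / n)
    return beta

# Synthetic uniform-design dose matrix: 12 runs x 4 components; only the
# first two components actually drive the comprehensive activity index.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(12, 4))
true_beta = np.array([2.0, -1.0, 0.0, 0.0])
y = X @ true_beta + rng.normal(0, 0.05, 12)

beta = lasso_cd(X, y, alpha=0.01)
print(beta.round(2))
```

The L1 penalty drives the coefficients of inactive components toward zero, which is what makes LASSO suitable for picking out the influential components of a formula.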
NASA Astrophysics Data System (ADS)
Liu, H.; Liu, Y.; Wang, X.; Liu, J.
2018-04-01
A good ecological environment is the foundation of human existence and development; social and economic development must be premised on maintaining the stability and balance of the ecological environment. RS and GIS technology are used in this paper, taking the red-bed hills of Sichuan Province (Lu County) as an example. According to the ecological environment characteristics of the study area and the principles for choosing evaluation indexes, this paper selected six evaluation indicators (elevation, slope, aspect, vegetation cover, land use, gully density) to establish an evaluation index system for the ecological environment of Lu County. This paper determines the weight of each evaluation index by AHP (Analytic Hierarchy Process) and establishes a comprehensive evaluation model using the weighted comprehensive evaluation method. This model is used to classify the ecological environment quality of Lu County as excellent, good, middle, poor, or worse, and to analyze the change in the ecological environment of Lu County over the recent ten years.
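The AHP weighting step described above derives index weights from a pairwise-comparison matrix via its principal eigenvector, with a consistency check. The sketch below uses a hypothetical 3x3 judgment matrix for three of the six indicators; the actual judgments are not given in the abstract.

```python
import numpy as np

# Hypothetical Saaty-scale pairwise-comparison matrix (3 indicators only).
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])

# Principal-eigenvector method: weights are the normalized eigenvector
# belonging to the largest eigenvalue of the comparison matrix.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = vecs[:, k].real
w = w / w.sum()

# Consistency ratio CR = CI / RI, with CI = (lambda_max - n)/(n - 1)
# and random index RI = 0.58 for n = 3; CR < 0.1 is conventionally acceptable.
n = A.shape[0]
CI = (vals[k].real - n) / (n - 1)
CR = CI / 0.58
print("weights:", w.round(3), "CR =", round(CR, 3))
```

The resulting weights would then multiply the normalized indicator layers in the weighted comprehensive evaluation.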
Three-dimensional circulation dynamics of along-channel flow in stratified estuaries
NASA Astrophysics Data System (ADS)
Musiak, Jeffery Daniel
Estuaries are vital because they are the major interface between humans and the oceans and provide valuable habitat for a wide range of organisms. Therefore it is important to model estuarine circulation to gain a better understanding of the mechanics involved and of how people affect estuaries. To this end, this dissertation combines analysis of data collected in the Columbia River estuary (CRE) with novel data processing and modeling techniques to further the understanding of estuaries that are strongly forced by riverflow and tides. The primary hypothesis tested in this work is that the three-dimensional (3-D) variability in along-channel currents in a strongly forced estuary can be largely accounted for by including the lateral variations in density and bathymetry but neglecting the secondary, or lateral, flow. Of course, the forcing must also include riverflow and oceanic tides. Incorporating this simplification and the modeling ideas put forth by others with new modeling techniques and new ideas on estuarine circulation will allow me to create a semi-analytical quasi 3-D profile model. This approach was chosen because it is intermediate in complexity between purely analytical models, which, if tractable, are often too simple to be useful, and 3-D numerical models, which can have excellent resolution but require large amounts of time, computer memory, and computing power. Validation of the model will be accomplished using velocity and density data collected in the Columbia River Estuary and by comparison to analytical solutions. Components of the modeling developed here include: (1) development of a 1-D barotropic model for tidal wave propagation in frictionally dominated systems with strong topography. This model can have multiple tidal constituents and multiply connected channels. (2) Development and verification of a new quasi 3-D semi-analytical velocity profile model applicable to estuarine systems which are strongly forced by both oceanic tides and riverflow.
This model includes diurnal and semi-diurnal tidal and non- linearly generated overtide circulation and residual circulation driven by riverflow, baroclinic forcing, surface wind stress and non-linear tidal forcing. (3) Demonstration that much of the lateral variation in along-channel currents is caused by variations in along- channel density forcing and bathymetry.
Is Word-Problem Solving a Form of Text Comprehension?
Fuchs, Lynn S.; Fuchs, Douglas; Compton, Donald L.; Hamlett, Carol L.; Wang, Amber Y.
2015-01-01
This study’s hypotheses were that (a) word-problem (WP) solving is a form of text comprehension that involves language comprehension processes, working memory, and reasoning, but (b) WP solving differs from other forms of text comprehension by requiring WP-specific language comprehension as well as general language comprehension. At the start of the 2nd grade, children (n = 206; on average, 7 years, 6 months) were assessed on general language comprehension, working memory, nonlinguistic reasoning, processing speed (a control variable), and foundational skill (arithmetic for WPs; word reading for text comprehension). In spring, they were assessed on WP-specific language comprehension, WPs, and text comprehension. Path analytic mediation analysis indicated that effects of general language comprehension on text comprehension were entirely direct, whereas effects of general language comprehension on WPs were partially mediated by WP-specific language. By contrast, effects of working memory and reasoning operated in parallel ways for both outcomes. PMID:25866461
Abdul-Karim, Nadia; Blackman, Christopher S; Gill, Philip P; Karu, Kersti
2016-10-05
The continued usage of explosive devices, as well as the ever-growing threat of 'dirty' bombs, necessitates a comprehensive understanding of particle dispersal during detonation events in order to develop effective methods for targeting explosive and/or additive remediation efforts. Herein, the distribution of explosive analytes from controlled detonations of aluminised ammonium nitrate and an RDX-based explosive composition was established by systematically sampling sites positioned around each firing. This is the first experimental study to produce evidence that the post-blast residue mass can distribute according to an approximate inverse-square law model, while also demonstrating for the first time that distribution trends can vary depending on individual analytes. Furthermore, by incorporating blast-wave overpressure measurements, high-speed imaging for fireball volume recordings, and monitoring of environmental conditions, it was determined that the principal factor affecting all analyte dispersals was the wind direction, with other factors affecting specific analytes to varying degrees. The dispersal mechanism for explosive residue is primarily the smoke cloud, a finding which in itself has wider impacts on the environment and fundamental detonation theory. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
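The inverse-square claim above can be checked on sampled data by fitting a power law in log-log space. The sketch below uses synthetic distances and residue masses (the constant, noise level, and sampling radii are invented, not the study's measurements); a fitted exponent near 2 would support inverse-square decay.

```python
import numpy as np

# Synthetic residue-mass data following m(r) ~ k / r^2 plus small noise.
rng = np.random.default_rng(1)
r = np.array([2.0, 4.0, 6.0, 8.0, 10.0])      # distance from charge, m
m = 100.0 / r**2 + rng.normal(0, 0.02, 5)     # residue mass per site (a.u.)

# Fit log m = log k - p * log r; p close to 2 indicates inverse-square decay.
p_slope, log_k = np.polyfit(np.log(r), np.log(m), 1)
print(f"decay exponent p = {-p_slope:.2f}")
```

In practice each analyte would be fitted separately, since the abstract notes that distribution trends vary between analytes.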
NASA Astrophysics Data System (ADS)
James, C. M.; Gildfind, D. E.; Lewis, S. W.; Morgan, R. G.; Zander, F.
2018-03-01
Expansion tubes are an important type of test facility for the study of planetary entry flow-fields, being the only type of impulse facility capable of simulating the aerothermodynamics of superorbital planetary entry conditions from 10 to 20 km/s. However, the complex flow processes involved in expansion tube operation make it difficult to fully characterise flow conditions, with two-dimensional full facility computational fluid dynamics simulations often requiring tens or hundreds of thousands of computational hours to complete. In an attempt to simplify this problem and provide a rapid flow condition prediction tool, this paper presents a validated and comprehensive analytical framework for the simulation of an expansion tube facility. It identifies central flow processes and models them from state to state through the facility using established compressible and isentropic flow relations, and equilibrium and frozen chemistry. How the model simulates each section of an expansion tube is discussed, as well as how the model can be used to simulate situations where flow conditions diverge from ideal theory. The model is then validated against experimental data from the X2 expansion tube at the University of Queensland.
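A framework of this kind chains established compressible-flow relations from state to state. One of its simplest frozen-chemistry building blocks is sketched below: across an ideal unsteady expansion the Riemann invariant u + 2a/(gamma - 1) is conserved, linking the accelerated gas velocity to its sound speed. The numbers are assumed round values, not X2 conditions.

```python
# Assumed initial state of the gas upstream of the unsteady expansion.
gamma = 1.4      # ratio of specific heats (frozen chemistry)
a4 = 1000.0      # initial sound speed of the gas, m/s
u4 = 0.0         # gas initially at rest

def velocity_after_expansion(a_final):
    """u from the invariant u + 2a/(gamma - 1) across an ideal unsteady expansion."""
    J = u4 + 2.0 * a4 / (gamma - 1.0)
    return J - 2.0 * a_final / (gamma - 1.0)

# Expanding the gas until its sound speed drops to 400 m/s accelerates it:
print(velocity_after_expansion(400.0), "m/s")
```

This illustrates why unsteady expansion is so effective in expansion tubes: total-pressure multiplication through the fan yields velocities far above what a steady nozzle alone could provide.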
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohi, J.
Supporting analysis and assessments can provide a sound analytic foundation and focus for program planning, evaluation, and coordination, particularly if issues of hydrogen production, distribution, storage, safety, and infrastructure can be analyzed in a comprehensive and systematic manner. The overall purpose of this activity is to coordinate all key analytic tasks (such as technology and market status, opportunities, and trends; environmental costs and benefits; and regulatory constraints and opportunities) within a long-term and systematic analytic foundation for program planning and evaluation. Within this context, the purpose of the project is to help develop and evaluate programmatic pathway options that incorporate near- and mid-term strategies to achieve the long-term goals of the Hydrogen Program. In FY 95, NREL will develop a comprehensive effort with industry, state and local agencies, and other federal agencies to identify and evaluate programmatic pathway options to achieve the long-term goals of the Program. Activity to date is reported.
MASS SPECTROMETRY-BASED METABOLOMICS
Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.
2007-01-01
This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry and specifically mass spectrometry has vast potential as a tool for this type of investigation. Metabolomics require special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475
Using Big Data Analytics to Advance Precision Radiation Oncology.
McNutt, Todd R; Benedict, Stanley H; Low, Daniel A; Moore, Kevin; Shpitser, Ilya; Jiang, Wei; Lakshminarayanan, Pranav; Cheng, Zhi; Han, Peijin; Hui, Xuan; Nakatsugawa, Minoru; Lee, Junghoon; Moore, Joseph A; Robertson, Scott P; Shah, Veeraj; Taylor, Russ; Quon, Harry; Wong, John; DeWeese, Theodore
2018-06-01
Big clinical data analytics as a primary component of precision medicine is discussed, identifying where these emerging tools fit in the spectrum of genomics and radiomics research. A learning health system (LHS) is conceptualized that uses clinically acquired data with machine learning to advance the initiatives of precision medicine. The LHS is comprehensive and can be used for clinical decision support, discovery, and hypothesis derivation. These developing uses can positively impact the ultimate management and therapeutic course for patients. The conceptual model for each use of clinical data, however, is different, and an overview of the implications is discussed. With advancements in technologies and culture to improve the efficiency, accuracy, and breadth of measurements of the patient condition, the concept of an LHS may be realized in precision radiation therapy. Copyright © 2018 Elsevier Inc. All rights reserved.
Electrostatic shock structures in dissipative multi-ion dusty plasmas
NASA Astrophysics Data System (ADS)
Elkamash, I. S.; Kourakis, I.
2018-06-01
A comprehensive analytical model is introduced for shock excitations in dusty bi-ion plasma mixtures, taking into account collisionality and kinematic (fluid) viscosity. A multicomponent plasma configuration is considered, consisting of positive ions, negative ions, electrons, and a massive charged component in the background (dust). The ionic dynamical scale is focused upon; thus, electrons are assumed to be thermalized, while the dust is stationary. A dissipative hybrid Korteweg-de Vries/Burgers equation is derived. An analytical solution is obtained, in the form of a shock structure (a step-shaped function for the electrostatic potential, or an electric field pulse) whose maximum amplitude in the far downstream region decays in time. The effect of relevant plasma configuration parameters, in addition to dissipation, is investigated. Our work extends earlier studies of ion-acoustic type shock waves in pure (two-component) bi-ion plasma mixtures.
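As an illustration of the kind of solution described, the sketch below evaluates a generic monotonic (Burgers-type) shock profile for the electrostatic potential, a step-shaped tanh function. The amplitude phi_m and width W are placeholders for the plasma-parameter-dependent expressions derived in the paper, not values from it.

```python
import numpy as np

def shock_profile(xi, phi_m=0.5, W=2.0):
    """Step-shaped potential phi(xi) = (phi_m / 2) * (1 - tanh(xi / W)).

    xi is the co-moving coordinate; phi -> phi_m far downstream (xi -> -inf)
    and phi -> 0 far upstream (xi -> +inf). phi_m and W are assumed values.
    """
    return 0.5 * phi_m * (1.0 - np.tanh(xi / W))

xi = np.linspace(-10.0, 10.0, 5)
print(shock_profile(xi).round(4))
```

The electric field pulse mentioned in the abstract is the (negative) derivative of this step, and the time decay of the downstream amplitude would enter through phi_m.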
NASA Astrophysics Data System (ADS)
Liang, Wei; Yu, Xuchao; Zhang, Laibin; Lu, Wenqing
2018-05-01
In an oil transmission station, the operating condition (OC) of an oil pump unit sometimes switches, which leads to changes in operating parameters. If the switching of OCs is not taken into consideration when performing a state evaluation on the pump unit, the accuracy of the evaluation is largely affected. Hence, in this paper, a self-organization Comprehensive Real-Time State Evaluation Model (self-organization CRTSEM) is proposed based on OC classification and recognition. The underlying CRTSEM is built by incorporating the advantages of the Gaussian Mixture Model (GMM) and the Fuzzy Comprehensive Evaluation Model (FCEM): independent state models are established for each state characteristic parameter according to its distribution type (i.e., Gaussian distribution or logistic regression distribution), while the Analytic Hierarchy Process (AHP) is used to calculate the weights of the state characteristic parameters. The OC classification is determined by the types of oil delivery tasks, and CRTSEMs for the different standard OCs are built to constitute the CRTSEM matrix. OC recognition, in turn, is realized by a self-organization model established on the basis of the Back Propagation (BP) model. After the self-organization CRTSEM is derived through integration, real-time monitoring data can be input for OC recognition, and the current state of the pump unit can then be evaluated with the appropriate CRTSEM. A case study demonstrates that the proposed self-organization CRTSEM provides reasonable and accurate state evaluation results for the pump unit, and verifies the assumption that the switching of OCs influences the results of state evaluation.
USDA-ARS?s Scientific Manuscript database
Most analytical methods for persistent organic pollutants (POPs) focus on targeted analytes. Therefore, analysis of multiple classes of POPs typically entails several sample preparations, fractionations, and injections, whereas other chemicals of possible interest are neglected. To analyze a wider...
Samanipour, Saer; Dimitriou-Christidis, Petros; Gros, Jonas; Grange, Aureline; Samuel Arey, J
2015-01-02
Comprehensive two-dimensional gas chromatography (GC×GC) is used widely to separate and measure organic chemicals in complex mixtures. However, approaches to quantify analytes in real, complex samples have not been critically assessed. We quantified 7 PAHs in a certified diesel fuel using GC×GC coupled to flame ionization detector (FID), and we quantified 11 target chlorinated hydrocarbons in a lake water extract using GC×GC with electron capture detector (μECD), further confirmed qualitatively by GC×GC with electron capture negative chemical ionization time-of-flight mass spectrometer (ENCI-TOFMS). Target analyte peak volumes were determined using several existing baseline correction algorithms and peak delineation algorithms. Analyte quantifications were conducted using external standards and also using standard additions, enabling us to diagnose matrix effects. We then applied several chemometric tests to these data. We find that the choice of baseline correction algorithm and peak delineation algorithm strongly influence the reproducibility of analyte signal, error of the calibration offset, proportionality of integrated signal response, and accuracy of quantifications. Additionally, the choice of baseline correction and the peak delineation algorithm are essential for correctly discriminating analyte signal from unresolved complex mixture signal, and this is the chief consideration for controlling matrix effects during quantification. The diagnostic approaches presented here provide guidance for analyte quantification using GC×GC. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
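The standard-additions technique used above to diagnose matrix effects can be sketched in a few lines: known spikes are added to aliquots of the same extract, a line is fitted, and its x-intercept magnitude gives the native concentration. The spike levels and signals below are invented for illustration.

```python
import numpy as np

# Known spikes added to aliquots of the same sample extract.
added = np.array([0.0, 1.0, 2.0, 3.0])      # spiked concentration, ng/mL
signal = np.array([2.0, 4.0, 6.0, 8.0])     # integrated peak volume (a.u.)

# Fit signal = slope * added + intercept; extrapolating to zero signal,
# the x-intercept magnitude (intercept / slope) is the native concentration.
slope, intercept = np.polyfit(added, signal, 1)
c_native = intercept / slope
print(f"native concentration ~ {c_native:.2f} ng/mL")
```

Because the calibration is built inside the sample matrix itself, a slope that differs from an external-standard calibration is direct evidence of a matrix effect, which is the diagnostic the authors exploit.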
Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina
2018-01-01
The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
We report a comprehensive analysis of 412 muscle-invasive bladder cancers characterized by multiple TCGA analytical platforms. Fifty-eight genes were significantly mutated, and the overall mutational load was associated with APOBEC-signature mutagenesis. Clustering by mutation signature identified a high-mutation subset with 75% 5-year survival.
Effect of EFL Students' Reading Styles on Their Reading Comprehension Performance.
ERIC Educational Resources Information Center
Amer, Aly Anwar; Khouzam, Naguib
1993-01-01
Investigates differences between English-as-a-Foreign-Language students at two levels of reading comprehension performance with respect to the global and analytic reading styles. No significant differences were found with regard to meaning memorization. There were slightly significant differences in favor of the global style with reference to…
Investigating Student Choices in Performing Higher-Level Comprehension Tasks Using TED
ERIC Educational Resources Information Center
Bianchi, Francesca; Marenzi, Ivana
2016-01-01
The current paper describes a first experiment in the use of TED talks and open tagging exercises to train higher-level comprehension skills, and of automatic logging of the student's actions to investigate the student choices while performing analytical tasks. The experiment took advantage of an interactive learning platform--LearnWeb--that…
Modeling and analysis of a resonant nanosystem
NASA Astrophysics Data System (ADS)
Calvert, Scott L.
The majority of investigations into nanoelectromechanical resonators focus on a single area of the resonator's function. This focus varies from the development of a model for a beam's vibration, to the modeling of electrostatic forces, to a qualitative explanation of experimentally-obtained currents. Despite these efforts, there remains a gap between these works, and the level of sophistication needed to truly design nanoresonant systems for efficient commercial use. Towards this end, a comprehensive system model for both a nanobeam resonator and its related experimental setup is proposed. Furthermore, a simulation arrangement is suggested as a method for facilitating the study of the system-level behavior of these devices in a variety of cases that could not be easily obtained experimentally or analytically. The dynamics driving the nanoresonator's motion, as well as the electrical interactions influencing the forcing and output of the system, are modeled, experimentally validated, and studied. The model seeks to develop both a simple circuit representation of the nanoresonator, and to create a mathematical system that can be used to predict and interpret the observed behavior. Due to the assumptions used to simplify the model to a point of reasonable comprehension, the model is most accurate for small beam deflections near the first eigenmode of the beam. The process and results of an experimental investigation are documented, and compared with a circuit simulation modeling the full test system. The comparison qualitatively proves the functionality of the model, while a numerical analysis serves to validate the functionality and setup of the circuit simulation. The use of the simulation enables a much broader investigation of both the electrical behavior and the physical device's dynamics. It is used to complement an assessment of the tuning behavior of the system's linear natural frequency by demonstrating the tuning behavior of the full nonlinear response. 
The simulation is used to demonstrate the difficulties with the contemporary mixing approach to experimental data collection and to complete a variety of case studies investigating the use of the nanoresonator systems in practical applications, such as signal filtering. Many of these case studies would be difficult to complete analytically, but results are quickly achieved through the use of the simulation.
Kuroishi, Rita Cristina Sadako; Garcia, Ricardo Basso; Valera, Fabiana Cardoso Pereira; Anselmo-Lima, Wilma Terezinha; Fukuda, Marisa Tomoe Hebihara
2015-01-01
Mouth breathing syndrome is very common among school-age children, and it is possibly related to learning difficulties and low academic achievement. In this study, we investigated working memory, reading comprehension and arithmetic skills in children with nasal and mouth breathing. Analytical cross-sectional study with control group conducted in a public university hospital. 42 children (mean age = 8.7 years) who had been identified as mouth breathers were compared with a control group (mean age = 8.4 years) matched for age and schooling. All the participants underwent a clinical interview, tone audiometry, otorhinolaryngological evaluation and cognitive assessment of phonological working memory (numbers and pseudowords), reading comprehension and arithmetic skills. Children with mouth breathing had poorer performance than controls, regarding reading comprehension (P = 0.006), arithmetic (P = 0.025) and working memory for pseudowords (P = 0.002), but not for numbers (P = 0.76). Children with mouth breathing have low academic achievement and poorer phonological working memory than controls. Teachers and healthcare professionals should be aware of the association of mouth breathing with children's physical and cognitive health.
Kim, Sung-Jin; Reidy, Shaelah M; Block, Bruce P; Wise, Kensall D; Zellers, Edward T; Kurabayashi, Katsuo
2010-07-07
In comprehensive two-dimensional gas chromatography (GC x GC), a modulator is placed at the juncture between two separation columns to focus and re-inject eluting mixture components, thereby enhancing the resolution and the selectivity of analytes. As part of an effort to develop a microGC x microGC prototype, in this report we present the design, fabrication, thermal operation, and initial testing of a two-stage microscale thermal modulator (microTM). The microTM contains two sequential serpentine Pyrex-on-Si microchannels (stages) that cryogenically trap analytes eluting from the first-dimension column and thermally inject them into the second-dimension column in a rapid, programmable manner. For each modulation cycle (typically 5 s for cooling with refrigeration work of 200 J and 100 ms for heating at 10 W), the microTM is kept at approximately -50 °C by a solid-state thermoelectric cooling unit placed within a few tens of micrometres of the device, heated to 250 °C at 2800 °C s⁻¹ by integrated resistive microheaters, and then cooled back to -50 °C at 250 °C s⁻¹. Thermal crosstalk between the two stages is less than 9%. A lumped heat transfer model is used to analyze the device design with respect to the rates of heating and cooling, power dissipation, and inter-stage thermal crosstalk as a function of Pyrex-membrane thickness, air-gap depth, and stage separation distance. Experimental results are in agreement with trends predicted by the model. Preliminary tests using a conventional capillary column interfaced to the microTM demonstrate the capability for enhanced sensitivity and resolution as well as the modulation of a mixture of alkanes.
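A lumped heat transfer model of the kind mentioned reduces each stage to a single heat capacity coupled to the cold sink through a thermal conductance. The minimal sketch below integrates one heating pulse; every parameter value is assumed for illustration and is not taken from the device.

```python
# Assumed lumped parameters for a single modulator stage.
C = 1e-6        # stage heat capacity, J/K
G = 1e-5        # thermal conductance to the cold sink, W/K
T_sink = -50.0  # thermoelectric sink temperature, deg C

def integrate_stage(T0, P, dt, steps):
    """Explicit-Euler integration of C * dT/dt = P - G * (T - T_sink)."""
    T = T0
    for _ in range(steps):
        T += dt * (P - G * (T - T_sink)) / C
    return T

# A 10 ms heating pulse at 10 mW, starting from the sink temperature:
T_hot = integrate_stage(-50.0, P=0.01, dt=1e-5, steps=1000)
print(round(T_hot, 1), "deg C after the pulse")
```

The same balance, solved for the design variables (membrane thickness, air-gap depth, stage spacing), is what lets such a model trade off heating rate, cooling rate, and power dissipation.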
NASA Astrophysics Data System (ADS)
Wasilah, S.; Fahmyddin, T.
2018-03-01
The employment of structural equation modeling (SEM) in research has attracted increasing attention among researchers in the built environment. There is a gap in understanding the attributes, application, and importance of this approach to data analysis in built environment studies. This paper intends to provide a fundamental comprehension of the SEM method in data analysis, unveiling its attributes, employment, and significance, and presenting cases that assess associations among variables and constructs. The study draws on key literature to grasp the essence of SEM with regard to built environment research. Better acknowledgment of this analytical tool may assist researchers in the built environment in analyzing data under complex research questions and in testing multivariate models in a single study.
Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew
2015-03-03
In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools to adequately inform prevention policy and discusses the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can be reasonably left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). 
Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems, improve targeting of public health policy, and offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.
Bayesian Monte Carlo and Maximum Likelihood Approach for ...
Model uncertainty estimation and risk assessment is essential to environmental management and informed decision making on pollution mitigation strategies. In this study, we apply a probabilistic methodology, which combines Bayesian Monte Carlo simulation and Maximum Likelihood estimation (BMCML), to calibrate a lake oxygen recovery model. We first derive an analytical solution of the differential equation governing lake-averaged oxygen dynamics as a function of time-variable wind speed. Statistical inferences on model parameters and predictive uncertainty are then drawn by Bayesian conditioning of the analytical solution on observed daily wind speed and oxygen concentration data obtained from an earlier study during two recovery periods on a eutrophic lake in upper state New York. The model is calibrated using oxygen recovery data for one year, and statistical inferences were validated using recovery data for another year. Compared with an essentially two-step regression and optimization approach, the BMCML results are more comprehensive and performed relatively better in predicting the observed temporal dissolved oxygen (DO) levels in the lake. BMCML also produced comparable calibration and validation results with those obtained using the popular Markov Chain Monte Carlo (MCMC) technique, and is computationally simpler and easier to implement than MCMC. Next, using the calibrated model, we derive an optimal relationship between liquid film-transfer coefficient
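The Bayesian Monte Carlo step described above can be reduced to a few lines: draw parameter samples from the prior, weight each by the Gaussian likelihood of the observations, and form posterior-weighted estimates. The sketch below uses a toy saturating DO recovery curve and synthetic observations as stand-ins for the lake model and data.

```python
import numpy as np

rng = np.random.default_rng(2)

def model(k, t):
    """Toy DO recovery curve (mg/L): saturating exponential with rate k."""
    return 8.0 * (1.0 - np.exp(-k * t))

# Synthetic "observations" generated with a known rate, for illustration.
t_obs = np.array([1.0, 2.0, 4.0, 8.0])   # days
k_true = 0.5
y_obs = model(k_true, t_obs)

# Bayesian Monte Carlo: prior samples -> Gaussian likelihood weights.
k_prior = rng.uniform(0.1, 1.0, 20000)               # prior on recovery rate
resid = y_obs - model(k_prior[:, None], t_obs)       # (samples x obs) residuals
loglik = -0.5 * np.sum((resid / 0.1) ** 2, axis=1)   # assumed obs sigma = 0.1
w = np.exp(loglik - loglik.max())                    # stabilized weights
k_post = np.sum(w * k_prior) / w.sum()               # posterior mean
print(round(float(k_post), 2))
```

The posterior weights concentrate on parameter draws consistent with the data, so the weighted mean recovers the generating rate; predictive uncertainty follows from the same weighted ensemble.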
A comprehensive alpha-heating model for inertial confinement fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christopherson, A. R.; Betti, R.; Bose, A.
In this paper, a comprehensive model is developed to study alpha-heating in inertially confined plasmas. It describes the time evolution of a central low-density hot spot confined by a compressible shell, heated by fusion alphas, and cooled by radiation and thermal losses. The model includes the deceleration, stagnation, and burn phases of inertial confinement fusion implosions, and is valid for sub-ignited targets with ≤10× amplification of the fusion yield from alpha-heating. The results of radiation-hydrodynamic simulations are used to derive realistic initial conditions and dimensionless parameters for the model. It is found that most of the alpha energy (~90%) produced before bang time is deposited within the hot spot mass, while a small fraction (~10%) drives mass ablation off the inner shell surface and its energy is recycled back into the hot spot. Of the bremsstrahlung radiation emission, ~40% is deposited in the hot spot, ~40% is recycled back in the hot spot by ablation off the shell, and ~20% leaves the hot spot. We show here that the hot spot, shocked shell, and outer shell trajectories from this analytical model are in good agreement with simulations. Finally, a detailed discussion of the effect of alpha-heating on the hydrodynamics is also presented.
A comprehensive alpha-heating model for inertial confinement fusion
NASA Astrophysics Data System (ADS)
Christopherson, A. R.; Betti, R.; Bose, A.; Howard, J.; Woo, K. M.; Campbell, E. M.; Sanz, J.; Spears, B. K.
2018-01-01
A comprehensive model is developed to study alpha-heating in inertially confined plasmas. It describes the time evolution of a central low-density hot spot confined by a compressible shell, heated by fusion alphas, and cooled by radiation and thermal losses. The model includes the deceleration, stagnation, and burn phases of inertial confinement fusion implosions, and is valid for sub-ignited targets with ≤10 × amplification of the fusion yield from alpha-heating. The results of radiation-hydrodynamic simulations are used to derive realistic initial conditions and dimensionless parameters for the model. It is found that most of the alpha energy (˜90%) produced before bang time is deposited within the hot spot mass, while a small fraction (˜10%) drives mass ablation off the inner shell surface and its energy is recycled back into the hot spot. Of the bremsstrahlung radiation emission, ˜40% is deposited in the hot spot, ˜40% is recycled back in the hot spot by ablation off the shell, and ˜20% leaves the hot spot. We show here that the hot spot, shocked shell, and outer shell trajectories from this analytical model are in good agreement with simulations. A detailed discussion of the effect of alpha-heating on the hydrodynamics is also presented.
A comprehensive alpha-heating model for inertial confinement fusion
Christopherson, A. R.; Betti, R.; Bose, A.; ...
2018-01-08
In this paper, a comprehensive model is developed to study alpha-heating in inertially confined plasmas. It describes the time evolution of a central low-density hot spot confined by a compressible shell, heated by fusion alphas, and cooled by radiation and thermal losses. The model includes the deceleration, stagnation, and burn phases of inertial confinement fusion implosions, and is valid for sub-ignited targets with ≤10× amplification of the fusion yield from alpha-heating. The results of radiation-hydrodynamic simulations are used to derive realistic initial conditions and dimensionless parameters for the model. It is found that most of the alpha energy (~90%) produced before bang time is deposited within the hot spot mass, while a small fraction (~10%) drives mass ablation off the inner shell surface and its energy is recycled back into the hot spot. Of the bremsstrahlung radiation emission, ~40% is deposited in the hot spot, ~40% is recycled back in the hot spot by ablation off the shell, and ~20% leaves the hot spot. We show here that the hot spot, shocked shell, and outer shell trajectories from this analytical model are in good agreement with simulations. Finally, a detailed discussion of the effect of alpha-heating on the hydrodynamics is also presented.
NASA Technical Reports Server (NTRS)
Baker, L. R.; Sulyma, P. R.; Tevepaugh, J. A.; Penny, M. M.
1976-01-01
Since exhaust plumes affect vehicle base environment (pressure and heat loads) and the orbiter vehicle aerodynamic control surface effectiveness, an intensive program involving detailed analytical and experimental investigations of the exhaust plume/vehicle interaction was undertaken as a pertinent part of the overall space shuttle development program. The program, called the Plume Technology program, has as its objective the determination of the criteria for simulating rocket engine (in particular, space shuttle propulsion system) plume-induced aerodynamic effects in a wind tunnel environment. The comprehensive experimental program was conducted using test facilities at NASA's Marshall Space Flight Center and Ames Research Center. A post-test examination of some of the experimental results obtained from NASA-MSFC's 14 x 14-inch trisonic wind tunnel is presented. A description is given of the test facility, simulant gas supply system, nozzle hardware, test procedure and test matrix. Analysis of exhaust plume flow fields and comparison of analytical and experimental exhaust plume data are presented.
Micromechanics Analysis Code With Generalized Method of Cells (MAC/GMC): User Guide. Version 3
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Bednarcyk, B. A.; Wilt, T. E.; Trowbridge, D.
1999-01-01
The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Here the user guide is described for MAC, a recently developed, computationally efficient and comprehensive micromechanics analysis code whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model. MAC/GMC is a versatile form of research software that "drives" the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC/GMC enhances the basic capabilities of GMC by providing a modular framework wherein 1) various thermal, mechanical (stress or strain control) and thermomechanical load histories can be imposed, 2) different integration algorithms may be selected, 3) a variety of material constitutive models (both deformation and life) may be utilized and/or implemented, 4) a variety of fiber architectures (unidirectional, laminate, and woven) may be easily accessed through their corresponding representative volume elements contained within the supplied library of RVEs or input directly by the user, and 5) graphical post-processing of the macro and/or micro field quantities is made available.
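As a point of reference for what a micromechanics code predicts, the simplest analytical estimates of a composite's effective stiffness are the rule-of-mixtures (Voigt) and inverse rule-of-mixtures (Reuss) bounds; GMC-class models refine these by resolving the constituents' geometric arrangement. A minimal sketch, with fiber/matrix moduli and volume fraction chosen purely for illustration:

```python
def voigt_reuss(e_f, e_m, vf):
    """Elementary bounds on the effective Young's modulus of a
    two-phase composite with fiber volume fraction vf."""
    e_voigt = vf * e_f + (1 - vf) * e_m          # iso-strain upper bound
    e_reuss = 1.0 / (vf / e_f + (1 - vf) / e_m)  # iso-stress lower bound
    return e_voigt, e_reuss

# Illustrative stiff-fiber / compliant-matrix pair (moduli in Pa)
upper, lower = voigt_reuss(400e9, 110e9, 0.35)
```

Any physically sensible micromechanics prediction for this microstructure should fall between `lower` and `upper`; the value of a model like GMC is that it also resolves the local (micro) fields that these bounds discard.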
A Comprehensive Analytical Solution of the Nonlinear Pendulum
ERIC Educational Resources Information Center
Ochs, Karlheinz
2011-01-01
In this paper, an analytical solution for the differential equation of the simple but nonlinear pendulum is derived. This solution is valid for any time and is not limited to any special initial instance or initial values. Moreover, this solution holds if the pendulum swings over or not. The method of approach is based on Jacobi elliptic functions…
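The Jacobi-elliptic solution of the pendulum equation can be evaluated directly with standard special functions. A sketch for the oscillating (non-rotating) case released from rest at amplitude theta0, using scipy's parameter convention m = k²; the quarter-period shift used to impose the initial condition is one common form of the solution, not necessarily the paper's exact parametrization:

```python
import numpy as np
from scipy.special import ellipj, ellipk

def pendulum_theta(t, theta0, omega0=1.0):
    """Exact oscillating solution of theta'' + omega0^2 sin(theta) = 0,
    released from rest at amplitude theta0 (|theta0| < pi)."""
    k = np.sin(theta0 / 2.0)   # elliptic modulus
    m = k ** 2                 # scipy's parameter convention
    # Shift by the quarter period K(m) so motion starts at the turning point
    sn, cn, dn, ph = ellipj(omega0 * t + ellipk(m), m)
    return 2.0 * np.arcsin(k * sn)

# The period grows with amplitude: T = 4 K(m) / omega0
T = 4.0 * ellipk(np.sin(np.radians(120) / 2.0) ** 2)  # 120-degree amplitude
```

In the small-amplitude limit k → 0, sn reduces to sine and the familiar period 2π/omega0 is recovered, which makes a convenient sanity check on the implementation.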
ERIC Educational Resources Information Center
Papamitsiou, Zacharoula; Economides, Anastasios A.
2014-01-01
This paper aims to provide the reader with a comprehensive background for understanding current knowledge on Learning Analytics (LA) and Educational Data Mining (EDM) and its impact on adaptive learning. It constitutes an overview of empirical evidence behind key objectives of the potential adoption of LA/EDM in generic educational strategic…
ERIC Educational Resources Information Center
OECD Publishing, 2017
2017-01-01
What is important for citizens to know and be able to do? The OECD Programme for International Student Assessment (PISA) seeks to answer that question through the most comprehensive and rigorous international assessment of student knowledge and skills. The PISA 2015 Assessment and Analytical Framework presents the conceptual foundations of the…
ERIC Educational Resources Information Center
Newcomer, Kathryn; Brass, Clinton T.
2016-01-01
The "performance movement" has been a subject of enthusiasm and frustration for evaluators. Performance measurement, data analytics, and program evaluation have been treated as different tasks, and those addressing them speak their own languages in their own circles. We suggest that situating performance measurement and data analytics…
Global dynamics in a stoichiometric food chain model with two limiting nutrients.
Chen, Ming; Fan, Meng; Kuang, Yang
2017-07-01
Ecological stoichiometry studies the balance of energy and multiple chemical elements in ecological interactions to establish how nutrient content affects food-web dynamics and nutrient cycling in ecosystems. In this study, we formulate a food chain with two limiting nutrients in the form of a stoichiometric population model. A comprehensive global analysis of the rich dynamics of the targeted model is explored both analytically and numerically. Chaotic dynamics are observed in this simple stoichiometric food chain model and compared with those of a traditional model without stoichiometry. The detailed comparison reveals that stoichiometry can reduce the parameter space for chaotic dynamics. Our findings also show that decreasing producer production efficiency may have only a small effect on consumer growth but a more profound impact on top-predator growth. Copyright © 2017 Elsevier Inc. All rights reserved.
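For context, the "traditional model without stoichiometry" that such studies compare against is typically a tritrophic chain like the Hastings-Powell model, which is chaotic for its standard parameter set. A minimal integration sketch of that classical chain (this is the baseline model, not the paper's stoichiometric one):

```python
import numpy as np
from scipy.integrate import solve_ivp

def hastings_powell(t, y, a1=5.0, b1=3.0, a2=0.1, b2=2.0, d1=0.4, d2=0.01):
    """Producer x, consumer yv, top predator z with Holling type-II
    functional responses; parameters are the standard chaotic set."""
    x, yv, z = y
    f1 = a1 * x / (1 + b1 * x)
    f2 = a2 * yv / (1 + b2 * yv)
    return [x * (1 - x) - f1 * yv,
            f1 * yv - f2 * z - d1 * yv,
            f2 * z - d2 * z]

sol = solve_ivp(hastings_powell, (0, 500), [0.8, 0.2, 8.0],
                rtol=1e-8, atol=1e-10)
```

Plotting `sol.y[2]` against `sol.y[0]` reveals the well-known "teacup" attractor; the stoichiometric version constrains the consumer's growth efficiency by nutrient content, which shrinks the region of parameter space where such chaos occurs.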
Modeling and Visualizing Flow of Chemical Agents Across Complex Terrain
NASA Technical Reports Server (NTRS)
Kao, David; Kramer, Marc; Chaderjian, Neal
2005-01-01
Release of chemical agents across complex terrain presents a real threat to homeland security. Modeling and visualization tools are being developed that capture fluid flow-terrain interaction as well as the downstream flow paths of point-source dispersal. These analytic tools, when coupled with UAV atmospheric observations, provide predictive capabilities that allow for rapid emergency response as well as for developing a comprehensive preemptive counter-threat evacuation plan. The visualization tools involve high-end computing and massively parallel processing combined with texture mapping. We demonstrate our approach across a mountainous portion of Northern California under two contrasting meteorological conditions. Animations depicting flow over this geographical location provide immediate assistance in decision support and crisis management.
Solar radiation pressure resonances in Low Earth Orbits
NASA Astrophysics Data System (ADS)
Alessi, Elisa Maria; Schettino, Giulia; Rossi, Alessandro; Valsecchi, Giovanni B.
2018-01-01
The aim of this work is to highlight the crucial role that orbital resonances associated with solar radiation pressure can have in Low Earth Orbit. We review the corresponding literature, and provide an analytical tool to estimate the maximum eccentricity which can be achieved for well-defined initial conditions. We then compare the results of the simplified model with those obtained with a more comprehensive dynamical model. The analysis has important implications both from a theoretical point of view, because it shows that the role of some resonances was underestimated in the past, and from a practical point of view in the perspective of passive deorbiting solutions for satellites at end-of-life.
SSD for R: A Comprehensive Statistical Package to Analyze Single-System Data
ERIC Educational Resources Information Center
Auerbach, Charles; Schudrich, Wendy Zeitlin
2013-01-01
The need for statistical analysis in single-subject designs presents a challenge, as analytical methods that are applied to group comparison studies are often not appropriate in single-subject research. "SSD for R" is a robust set of statistical functions with wide applicability to single-subject research. It is a comprehensive package…
ERIC Educational Resources Information Center
Kalandadze, Tamar; Norbury, Courtenay; Naerland, Terje; Naess, Kari-Anne B.
2018-01-01
We present a meta-analysis of studies that compare figurative language comprehension in individuals with autism spectrum disorder and in typically developing controls who were matched based on chronological age or/and language ability. A total of 41 studies and 45 independent effect sizes were included based on predetermined inclusion criteria.…
Selective versus comprehensive emergency management in Korea.
Ha, Kyoo-Man; Oh, Hyeon-Mun
2014-01-01
In spite of Korean governments' efforts, many emergency management practitioners wonder whether what is actually being practiced is selective or comprehensive management. Using a qualitative content analysis and experiences in practice, the article analyzes the barriers to selective emergency management and the paths to comprehensive emergency management via the same three management elements: stakeholders, phases of the emergency management lifecycle, and hazards and impacts. Four analytical levels are considered: central government level, industry level, community level, and household level. Korea, despite its self-praise, has to transform its selective emergency management into comprehensive emergency management in time.
NASA Astrophysics Data System (ADS)
Bellini, Anna
Customer-driven product customization and continued demand for cost and time savings have generated a renewed interest in agile manufacturing based on improvements in Rapid Prototyping (RP) technologies. The advantages of RP technologies are: (1) the ability to shorten the product design and development time, (2) suitability for automation and a decrease in the level of human intervention, (3) the ability to build many geometrically complex shapes. A shift from "prototyping" to "manufacturing" necessitates the following improvements: (1) flexibility in the choice of materials; (2) part integrity and built-in characteristics to meet performance requirements; (3) dimensional stability and tolerances; (4) improved surface finish. A project funded by ONR has been undertaken to develop an agile manufacturing technology for fabrication of ceramic and multi-component parts to meet various needs of the Navy, such as transducers. The project is based on adaptation of a layered manufacturing concept, since the program required that the new technology be developed based on a commercially available RP technology. Among the various RP technologies available today, Fused Deposition Modeling (FDM) has been identified as the focus of this research because of its potential versatility in the choice of materials and deposition configuration. This innovative approach allows for designing and implementing highly complex internal architectures in parts through deposition of different materials in a variety of configurations, in such a way that the finished product exhibits characteristics that meet the performance requirements. This implies that, in principle, one can tailor-make the assembly of materials and structures per the specifications of an optimum design. The program objectives can be achieved only through accurate process modeling and modeling of material behavior.
Oftentimes, process modeling is based on some type of computational approach, whereas modeling of material behavior is based on extensive experimental investigations. Studies are conducted in the following categories: (1) flow modeling during extrusion and deposition; (2) thermal modeling; (3) flow control during deposition; (4) product characterization and property determination for dimensional analysis; (5) development of a novel technology based on a mini-extrusion system. Studies in each of these stages have involved experimental as well as analytical approaches to develop a comprehensive model.
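A common first-pass analytical model for the extrusion stage is Hagen-Poiseuille flow through the cylindrical nozzle. This assumes a Newtonian melt, whereas real polymer melts are shear-thinning, so treat it as a rough estimate; all numbers below are illustrative, not from the ONR project:

```python
import math

def nozzle_flow_rate(radius_m, dp_pa, viscosity_pa_s, length_m):
    """Hagen-Poiseuille volumetric flow rate (m^3/s) through a
    cylindrical capillary of the given radius and length, driven by
    pressure drop dp_pa, for a Newtonian fluid of the given viscosity."""
    return math.pi * radius_m ** 4 * dp_pa / (8 * viscosity_pa_s * length_m)

# Illustrative FDM-scale numbers: 0.4 mm nozzle bore, 2 MPa drive
# pressure, 300 Pa*s melt viscosity, 1 mm land length
q = nozzle_flow_rate(0.2e-3, 2.0e6, 300.0, 1.0e-3)
```

The fourth-power dependence on radius is the practically important feature: halving the nozzle bore cuts the achievable deposition rate by a factor of sixteen at fixed pressure, which is why flow control and thermal modeling dominate the process-modeling categories above.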
Kwan, Paul; Welch, Mitchell
2017-01-01
In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops. PMID:28875085
Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell
2017-01-01
In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.
Analysis of the Effects of Surface Pitting and Wear on the Vibrations of a Gear Transmission System
NASA Technical Reports Server (NTRS)
Choy, F. K.; Polyshchuk, V.; Zakrajsek, J. J.; Handschuh, R. F.; Townsend, D. P.
1994-01-01
A comprehensive procedure to simulate and analyze the vibrations in a gear transmission system with surface pitting, 'wear' and partial tooth fracture of the gear teeth is presented. An analytical model was developed where the effects of surface pitting and wear of the gear tooth were simulated by phase and magnitude changes in the gear mesh stiffness. Changes in the gear mesh stiffness were incorporated into each gear-shaft model during the global dynamic simulation of the system. The overall dynamics of the system were evaluated by solving for the transient dynamics of each shaft system simultaneously with the vibration of the gearbox structure. In order to reduce the number of degrees-of-freedom in the system, a modal synthesis procedure was used in the global transient dynamic analysis of the overall transmission system. An FFT procedure was used to transform the averaged time signal into the frequency domain for signature analysis. In addition, the Wigner-Ville distribution was also introduced to examine the gear vibration in the joint time frequency domain for vibration pattern recognition. Experimental results obtained from a gear fatigue test rig at NASA Lewis Research Center were used to evaluate the analytical model.
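The signature the FFT step looks for can be illustrated with a toy signal: a local pitting fault modulates the gear-mesh tone in both amplitude and phase once per shaft revolution, producing sidebands at the mesh frequency plus or minus multiples of the shaft frequency. The frequencies and modulation depths below are invented for illustration, not taken from the NASA Lewis test rig:

```python
import numpy as np

fs = 10_000                        # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
f_mesh, f_shaft = 500.0, 25.0      # hypothetical mesh and shaft frequencies

# Healthy gear: a clean mesh tone. Pitted tooth: the local drop in mesh
# stiffness modulates amplitude and phase once per shaft revolution.
healthy = np.sin(2 * np.pi * f_mesh * t)
mod = 1 + 0.3 * np.cos(2 * np.pi * f_shaft * t)
phase = 0.2 * np.sin(2 * np.pi * f_shaft * t)
faulty = mod * np.sin(2 * np.pi * f_mesh * t + phase)

spec = np.abs(np.fft.rfft(faulty)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
carrier = spec[np.argmin(np.abs(freqs - f_mesh))]
sideband = spec[np.argmin(np.abs(freqs - (f_mesh + f_shaft)))]
```

In the healthy spectrum the sideband bins are empty; their growth relative to the mesh tone is the kind of feature that both the FFT signature analysis and the joint time-frequency (Wigner-Ville) view exploit for fault detection.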
Birkley, Erica; Eckhardt, Christopher I.
2015-01-01
Prior reviews have identified elevated trait anger as a risk factor for intimate partner violence (IPV) perpetration. Given that 10 years have passed since the last comprehensive review of this literature, we provide an updated meta-analytic review examining associations among anger, hostility, internalizing negative emotions, and IPV for male and female perpetrators. One hundred and five effect sizes from 64 independent samples (61 studies) were included for analysis. IPV perpetration was moderately associated with the constructs of anger, hostility, and internalizing negative emotions. This association appeared stronger for those who perpetrated moderate to severe IPV compared to those who perpetrated low to moderate IPV, and did not vary across perpetrator sex, measurement method, relationship type, or perpetrator population. Implications and limitations of findings were reviewed in the context of theoretical models of IPV, and future directions for empirical and clinical endeavors were proposed. PMID:25752947
Birkley, Erica L; Eckhardt, Christopher I
2015-04-01
Prior reviews have identified elevated trait anger as a risk factor for intimate partner violence (IPV) perpetration. Given that 10 years have passed since the last comprehensive review of this literature, we provide an updated meta-analytic review examining associations among anger, hostility, internalizing negative emotions, and IPV for male and female perpetrators. One hundred and five effect sizes from 64 independent samples (61 studies) were included for analysis. IPV perpetration was moderately associated with the constructs of anger, hostility, and internalizing negative emotions. This association appeared stronger for those who perpetrated moderate to severe IPV compared to those who perpetrated low to moderate IPV, and did not vary across perpetrator sex, measurement method, relationship type, or perpetrator population. Implications and limitations of findings were reviewed in the context of theoretical models of IPV, and future directions for empirical and clinical endeavors were proposed. Copyright © 2015 Elsevier Ltd. All rights reserved.
Development of a robust space power system decision model
NASA Astrophysics Data System (ADS)
Chew, Gilbert; Pelaccio, Dennis G.; Jacobs, Mark; Stancati, Michael; Cataldo, Robert
2001-02-01
NASA continues to evaluate power systems to support human exploration of the Moon and Mars. The system(s) would address all power needs of surface bases and on-board power for space transfer vehicles. Prior studies have examined both solar and nuclear-based alternatives with respect to individual issues such as sizing or cost. What has not been addressed is a comprehensive look at the risks and benefits of the options that could serve as the analytical framework to support a system choice that best serves the needs of the exploration program. This paper describes the SAIC-developed Space Power System Decision Model, which uses a formal Two-step Analytical Hierarchy Process (TAHP) methodology in the decision-making process to clearly distinguish candidate power systems in terms of benefits, safety, and risk. TAHP is a decision-making process based on the Analytical Hierarchy Process, which employs a hierarchic approach of structuring decision factors by weights and relatively ranks system design options on a consistent basis. This decision process also includes a level of data gathering and organization that produces a consistent, well-documented assessment, from which the capability of each power system option to meet top-level goals can be prioritized. The model defined in this effort focuses on the comparative assessment of candidate power system options for Mars surface applications. This paper describes the principles of this approach, the assessment criteria and weighting procedures, and the tools to capture and assess the expert knowledge associated with space power system evaluation.
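The core AHP computation underlying a TAHP-style model is short: criterion priorities are the normalized principal eigenvector of a pairwise-comparison matrix, with a consistency ratio to flag incoherent judgments. The matrix values below are an illustrative toy on Saaty's 1-9 scale, not weights from the SAIC model:

```python
import numpy as np

# Toy pairwise comparisons over three criteria (benefits, safety, risk):
# A[i, j] = how much more important criterion i is than criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Priorities = normalized principal right eigenvector of A
vals, vecs = np.linalg.eig(A)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w /= w.sum()

# Consistency ratio (CR < 0.1 is conventionally acceptable);
# 0.58 is Saaty's random index for a 3x3 matrix.
n = A.shape[0]
ci = (vals.real[i] - n) / (n - 1)
cr = ci / 0.58
```

Scoring each candidate power system against every criterion with the same procedure, then weighting by `w`, yields the consistent relative ranking the abstract describes.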
Effective Wettability of Heterogenous Fracture Surfaces Using the Lattice-Boltzmann Method
NASA Astrophysics Data System (ADS)
E Santos, J.; Prodanovic, M.; Landry, C. J.
2017-12-01
Fracture walls in the subsurface are often structured by minerals of different composition (potentially further altered in contact with fluids during hydrocarbon extraction or CO2 sequestration); this results in heterogeneous wettability of the surface in contact with the fluids. The focus of our work is to study how surfaces presenting different mineralogy and roughness affect multiphase flow in fractures. Using the Shan-Chen model of the lattice-Boltzmann method (LBM), we define fluid interaction and surface attraction parameters to simulate a system of a wetting and a non-wetting fluid. In this work, we use synthetically created fractures presenting different arrangements of wetting and non-wetting patches, with or without roughness, representative of different mineralogies; a similar workflow can be applied to fractures extracted from X-ray microtomography images of fractured porous media. The results from the LBM simulations provide insight into how the distribution of mineralogy and surface roughness relate to the observed macroscopic contact angle. We present a comparison between published analytical models and our results based on surface areas, spatial distribution, and local fracture aperture. Understanding the variables that affect the contact angle is useful for the comprehension of multiphase processes in naturally fractured reservoirs, such as primary oil production, enhanced oil recovery, and CO2 sequestration. The macroscopic contact angle equations of analytical models for heterogeneous surfaces with variable roughness are no longer valid in highly heterogeneous systems; we quantify the difference, thus offering an alternative to analytical models.
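One of the published analytical models for the effective wettability of a chemically patchy (but flat) surface is the Cassie equation: an area-weighted average of the cosines of the pure-surface contact angles. The abstract's point is precisely that such models break down for highly heterogeneous, rough systems; a sketch of the baseline model, with illustrative patch fractions and angles:

```python
import numpy as np

def cassie_effective_angle(fractions, angles_deg):
    """Cassie-type estimate of the effective contact angle on a flat,
    chemically heterogeneous surface:
        cos(theta_eff) = sum_i f_i * cos(theta_i),
    where f_i are area fractions summing to one."""
    f = np.asarray(fractions, dtype=float)
    th = np.radians(angles_deg)
    return float(np.degrees(np.arccos(np.sum(f * np.cos(th)))))

# Half water-wet patches (30 deg) and half oil-wet patches (120 deg)
theta_eff = cassie_effective_angle([0.5, 0.5], [30.0, 120.0])
```

Comparing an LBM-measured macroscopic angle against this area-weighted prediction is one way to quantify how far spatial arrangement and roughness push a real fracture surface away from the idealized analytical result.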
Competing opinion diffusion on social networks.
Hu, Haibo
2017-11-01
Opinion competition is a common phenomenon in real life, such as with opinions on controversial issues or political candidates; however, modelling this competition remains largely unexplored. To bridge this gap, we propose a model of competing opinion diffusion on social networks taking into account degree-dependent fitness or persuasiveness. We study the combined influence of social networks, individual fitnesses and attributes, as well as mass media on people's opinions, and find that both social networks and mass media act as amplifiers in opinion diffusion, the amplifying effect of which can be quantitatively characterized. We analytically obtain the probability that each opinion will ultimately pervade the whole society when there are no committed people in networks, and the final proportion of each opinion at the steady state when there are committed people in networks. The results of numerical simulations show good agreement with those obtained through an analytical approach. This study provides insight into the collective influence of individual attributes, local social networks and global media on opinion diffusion, and contributes to a comprehensive understanding of competing diffusion behaviours in the real world.
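A minimal simulation in the spirit of the model: two opinions spread on a random graph, with one opinion's holders given higher persuasiveness (fitness), so updating individuals preferentially copy them. Network size, fitness values, and the update rule below are illustrative assumptions, not the paper's exact specification:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 0.05

# Symmetric Erdos-Renyi adjacency matrix
adj = rng.random((n, n)) < p
adj = np.triu(adj, 1)
adj = adj | adj.T

# Two competing opinions (0 and 1); opinion 1 is the "fitter" one,
# i.e., its holders are more persuasive when neighbors update.
opinion = rng.integers(0, 2, n)
fitness = np.where(opinion == 1, 2.0, 1.0)

for _ in range(20_000):
    i = rng.integers(n)
    nbrs = np.flatnonzero(adj[i])
    if nbrs.size == 0:
        continue
    # Copy a neighbor chosen with probability proportional to fitness
    wts = fitness[nbrs] / fitness[nbrs].sum()
    j = rng.choice(nbrs, p=wts)
    opinion[i] = opinion[j]
    fitness[i] = 2.0 if opinion[i] == 1 else 1.0

share_fit = float(opinion.mean())  # final share of the fitter opinion
```

Even a modest fitness advantage biases the diffusion strongly toward one opinion; adding committed (never-updating) individuals, as the paper does, changes the steady state from near-consensus to a mixed equilibrium.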
Competing opinion diffusion on social networks
2017-01-01
Opinion competition is a common phenomenon in real life, such as with opinions on controversial issues or political candidates; however, modelling this competition remains largely unexplored. To bridge this gap, we propose a model of competing opinion diffusion on social networks taking into account degree-dependent fitness or persuasiveness. We study the combined influence of social networks, individual fitnesses and attributes, as well as mass media on people’s opinions, and find that both social networks and mass media act as amplifiers in opinion diffusion, the amplifying effect of which can be quantitatively characterized. We analytically obtain the probability that each opinion will ultimately pervade the whole society when there are no committed people in networks, and the final proportion of each opinion at the steady state when there are committed people in networks. The results of numerical simulations show good agreement with those obtained through an analytical approach. This study provides insight into the collective influence of individual attributes, local social networks and global media on opinion diffusion, and contributes to a comprehensive understanding of competing diffusion behaviours in the real world. PMID:29291101
A Framework for Understanding Physics Students' Computational Modeling Practices
NASA Astrophysics Data System (ADS)
Lunk, Brandon Robert
With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code.
Analysis of this video data showed that the students' programming practices were highly influenced by their existing physics content knowledge, particularly their knowledge of analytic procedures. While this existing knowledge was often applied in inappropriate circumstances, the students were still able to display a considerable amount of understanding of the physics content and of analytic solution procedures. These observations could not be adequately accommodated by the existing literature on programming comprehension. In extending the resource framework to the task of computational modeling, I model students' practices in terms of three important elements. First, a knowledge base includes resources for understanding physics, math, and programming structures. Second, a mechanism for monitoring and control describes students' expectations as being directed towards numerical, analytic, qualitative or rote solution approaches, which can be influenced by the problem representation. Third, a set of solution approaches, many of which were identified in this study, describe what aspects of the knowledge base students use and how they use that knowledge to enact their expectations. This framework allows us as researchers to track student discussions and pinpoint the source of difficulties. This work opens up many avenues of potential research. First, this framework gives researchers a vocabulary for extending Resource Theory to other domains of instruction, such as modeling how physics students use graphs. Second, this framework can be used as the basis for modeling expert physicists' programming practices. Important instructional implications also follow from this research. Namely, as we broaden the use of computational modeling in the physics classroom, our instructional practices should focus on helping students understand the step-by-step nature of programming in contrast to the already salient analytic procedures.
ERIC Educational Resources Information Center
Lotfipour-Saedi, Kazem
2015-01-01
This paper represents some suggestions towards discourse-analytic approaches for ESL/EFL education, with the focus on identifying the textual forms which can contribute to the textual difficulty. Textual difficulty/comprehensibility, rather than being purely text-based or reader-dependent, is certainly a matter of interaction between text and…
A quantitative, comprehensive analytical model for "fast" magnetic reconnection in Hall MHD
NASA Astrophysics Data System (ADS)
Simakov, Andrei N.
2008-11-01
Magnetic reconnection in nature usually happens on fast (e.g., dissipation-independent) time scales. While such scales have been observed computationally [1], a fundamental analytical model capable of explaining them has been lacking. Here, we propose such a quantitative model for 2D Hall MHD reconnection without a guide field. The model recovers the Sweet-Parker and the electron MHD [2] results in the appropriate limits of the ion inertial length, di, and is valid everywhere in between [3]. The model predicts the dissipation region aspect ratio and the reconnection rate Ez in terms of dissipation and inertial parameters, and has been found to be in excellent agreement with non-linear simulations. It confirms a number of long-standing empirical results and resolves several controversies. In particular, we find that both open X-point and elongated dissipation regions allow "fast" reconnection and that Ez depends on di. Moreover, when applied to electron-positron plasmas, the model demonstrates that fast dispersive waves are not instrumental for "fast" reconnection [4]. [1] J. Birn et al., J. Geophys. Res. 106, 3715 (2001). [2] L. Chacón, A. N. Simakov, and A. Zocco, Phys. Rev. Lett. 99, 235001 (2007). [3] A. N. Simakov and L. Chacón, submitted to Phys. Rev. Lett. [4] L. Chacón, A. N. Simakov, V. Lukin, and A. Zocco, Phys. Rev. Lett. 101, 025003 (2008).
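For orientation, the Sweet-Parker limit that the model recovers can be written in its standard textbook form (the abstract itself gives no formulas; the scaling below is the classical result, not the paper's generalized expression):

```latex
% Classical Sweet-Parker scaling, the small-d_i limit recovered by the model.
% S = L v_A / \eta is the Lundquist number of the dissipation region,
% \delta/L its aspect ratio, and E_z the reconnection electric field.
\frac{\delta}{L} \sim S^{-1/2}, \qquad
E_z \sim S^{-1/2}\, v_A B_0
```

The paper's contribution is a single expression that interpolates between this resistive limit and the electron-MHD limit as di varies.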
Recognition and source memory as multivariate decision processes.
Banks, W P
2000-07-01
Recognition memory, source memory, and exclusion performance are three important domains of study in memory, each with its own findings, its specific theoretical developments, and its separate research literature. It is proposed here that results from all three domains can be treated with a single analytic model. This article shows how to generate a comprehensive memory representation based on multidimensional signal detection theory and how to make predictions for each of these paradigms using decision axes drawn through the space. The detection model is simpler than the comparable multinomial model, it is more easily generalizable, and it does not make threshold assumptions. An experiment using the same memory set for all three tasks demonstrates the analysis and tests the model. The results show that some seemingly complex relations between the paradigms derive from an underlying simplicity of structure.
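A minimal numerical sketch of the core idea (all distributions and criterion placements below are illustrative assumptions, not the article's fitted parameters): items live in a two-dimensional strength space, and recognition and source judgments are simply different decision axes drawn through that same space.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical 2-D memory space: dim 1 = strength from source A, dim 2 = source B.
new_items = rng.normal([0.0, 0.0], 1.0, size=(n, 2))
source_a  = rng.normal([1.5, 0.5], 1.0, size=(n, 2))
source_b  = rng.normal([0.5, 1.5], 1.0, size=(n, 2))

def recognition(x):
    # Old/new decision axis: overall strength (sum of the two dimensions).
    return x.sum(axis=1) > 2.0

def source_judgment(x):
    # Source decision axis: difference of the two dimensions (True -> "source A").
    return x[:, 0] - x[:, 1] > 0.0

hit_a = recognition(source_a).mean()    # hit rate for source-A items
hit_b = recognition(source_b).mean()    # hit rate for source-B items
fa = recognition(new_items).mean()      # false-alarm rate for new items
correct_source = source_judgment(source_a).mean()

print(f"hits A {hit_a:.2f}, hits B {hit_b:.2f}, FA {fa:.2f}, "
      f"source accuracy {correct_source:.2f}")
```

Both tasks read out the same underlying representation; only the orientation of the decision axis changes, which is what lets one model cover recognition, source, and exclusion data.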
Models for the effects of host movement in vector-borne disease systems.
Cosner, Chris
2015-12-01
Host and/or vector movement patterns have been shown to have significant effects in both empirical studies and mathematical models of vector-borne diseases. The processes of economic development and globalization seem likely to make host movement even more important in the future. This article is a brief survey of some of the approaches that have been used to study the effects of host movement in analytic mathematical models for vector-borne diseases. It describes the formulation and interpretation of various types of spatial models and describes a few of the conclusions that can be drawn from them. It is not intended to be comprehensive but rather to provide sufficient background material and references to the literature to serve as an entry point into this area of research for interested readers. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wu, Linqin; Xu, Sheng; Jiang, Dezhi
2015-12-01
Industrial wireless networked control systems are widely used, and evaluating the performance of the underlying wireless network is of great significance. In this paper, considering the shortcomings of existing performance evaluation methods, a comprehensive multi-index network performance evaluation method, the multi-index fuzzy analytic hierarchy process (MFAHP), which combines fuzzy mathematics with the traditional analytic hierarchy process (AHP), is presented. The method overcomes the incompleteness and subjectivity of existing evaluations. Experiments show that the method reflects network performance under real conditions. It directly guides protocol selection, network cabling, and node placement, and can meet the requirements of different settings by modifying the underlying parameters.
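The fuzzy composition step of such a method can be sketched as follows (the index names, weights, and membership values are invented for illustration; the paper's actual indexes and grades are not given in the abstract): weights from an AHP step are combined with a fuzzy membership matrix, and the grade with the largest composite membership wins.

```python
import numpy as np

# Hypothetical network indexes: delay, packet loss, throughput (assumed names).
# Weights would come from an AHP pairwise-comparison step.
weights = np.array([0.5, 0.3, 0.2])

# Membership of each index in the grades (good, fair, poor), e.g. from expert rules.
R = np.array([
    [0.6, 0.3, 0.1],   # delay
    [0.2, 0.5, 0.3],   # packet loss
    [0.7, 0.2, 0.1],   # throughput
])

# Fuzzy comprehensive evaluation: composite membership B = W . R,
# then classify by the maximum-membership principle.
B = weights @ R
grade = ["good", "fair", "poor"][int(B.argmax())]
print(B.round(3), "->", grade)
```

Changing the weight vector (e.g., emphasizing delay for a control loop versus throughput for bulk telemetry) is how the same framework adapts to different occasions.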
Ahmadzadeh, Arman; Arjmandi, Hamidreza; Burkovski, Andreas; Schober, Robert
2016-10-01
This paper studies the problem of receiver modeling in molecular communication systems. We consider the diffusive molecular communication channel between a transmitter nano-machine and a receiver nano-machine in a fluid environment. The information molecules released by the transmitter nano-machine into the environment can degrade in the channel via a first-order degradation reaction and those that reach the receiver nano-machine can participate in a reversible bimolecular reaction with receiver receptor proteins. Thereby, we distinguish between two scenarios. In the first scenario, we assume that the entire surface of the receiver is covered by receptor molecules. We derive a closed-form analytical expression for the expected received signal at the receiver, i.e., the expected number of activated receptors on the surface of the receiver. Then, in the second scenario, we consider the case where the number of receptor molecules is finite and the uniformly distributed receptor molecules cover the receiver surface only partially. We show that the expected received signal for this scenario can be accurately approximated by the expected received signal for the first scenario after appropriately modifying the forward reaction rate constant. The accuracy of the derived analytical results is verified by Brownian motion particle-based simulations of the considered environment, where we also show the impact of the effect of receptor occupancy on the derived analytical results.
Theory and design of variable conductance heat pipes
NASA Technical Reports Server (NTRS)
Marcus, B. D.
1972-01-01
A comprehensive review and analysis of all aspects of heat pipe technology pertinent to the design of self-controlled, variable conductance devices for spacecraft thermal control is presented. Subjects considered include hydrostatics, hydrodynamics, heat transfer into and out of the pipe, fluid selection, materials compatibility and variable conductance control techniques. The report includes a selected bibliography of pertinent literature, analytical formulations of various models and theories describing variable conductance heat pipe behavior, and the results of numerous experiments on the steady state and transient performance of gas controlled variable conductance heat pipes. Also included is a discussion of VCHP design techniques.
Ball Bearing Analysis with the ORBIS Tool
NASA Technical Reports Server (NTRS)
Halpin, Jacob D.
2016-01-01
Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with predictions of bearing internal load distributions, stiffness, deflection, and stresses.
OCCIMA: Optical Channel Characterization in Maritime Atmospheres
NASA Astrophysics Data System (ADS)
Hammel, Steve; Tsintikidis, Dimitri; deGrassie, John; Reinhardt, Colin; McBryde, Kevin; Hallenborg, Eric; Wayne, David; Gibson, Kristofor; Cauble, Galen; Ascencio, Ana; Rudiger, Joshua
2015-05-01
The Navy is actively developing diverse optical application areas, including high-energy laser weapons and free-space optical communications, which depend on an accurate and timely knowledge of the state of the atmospheric channel. The Optical Channel Characterization in Maritime Atmospheres (OCCIMA) project is a comprehensive program to coalesce and extend the current capability to characterize the maritime atmosphere for all optical and infrared wavelengths. The program goal is the development of a unified and validated analysis toolbox. The foundational design for this program coordinates the development of sensors, measurement protocols, analytical models, and basic physics necessary to fulfill this goal.
Biologically inspired technologies using artificial muscles
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph
2005-01-01
One of the newest fields of biomimetics is electroactive polymers (EAP), also known as artificial muscles. To take advantage of these materials, efforts are being made worldwide to establish a strong infrastructure addressing the need for comprehensive analytical modeling of their response mechanisms and to develop effective processing and characterization techniques. The field is still in its emerging state and robust materials are still not readily available; however, in recent years significant progress has been made and commercial products have already started to appear. This paper covers the current state-of-the-art and the challenges of making artificial muscles, as well as their potential biomimetic applications.
New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric W; Rames, Clement L; Muratori, Matteo
This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.
Numerical Modeling of Ablation Heat Transfer
NASA Technical Reports Server (NTRS)
Ewing, Mark E.; Laker, Travis S.; Walker, David T.
2013-01-01
A unique numerical method has been developed for solving one-dimensional ablation heat transfer problems. This paper provides a comprehensive description of the method, along with detailed derivations of the governing equations. This methodology supports solutions for traditional ablation modeling including such effects as heat transfer, material decomposition, pyrolysis gas permeation and heat exchange, and thermochemical surface erosion. The numerical scheme utilizes a control-volume approach with a variable grid to account for surface movement. This method directly supports implementation of nontraditional models such as material swelling and mechanical erosion, extending capabilities for modeling complex ablation phenomena. Verifications of the numerical implementation are provided using analytical solutions, code comparisons, and the method of manufactured solutions. These verifications are used to demonstrate solution accuracy and proper error convergence rates. A simple demonstration of a mechanical erosion (spallation) model is also provided to illustrate the unique capabilities of the method.
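As a toy companion to the control-volume discussion, the conduction core of such a solver can be sketched as follows. This is a plain fixed-grid explicit scheme with assumed material properties; the paper's method additionally handles the moving surface, material decomposition, and pyrolysis gas terms, none of which are reproduced here.

```python
import numpy as np

# 1-D transient conduction, d(rho*c*T)/dt = d/dx(k dT/dx), explicit time stepping.
nx, L = 51, 0.01                  # nodes, slab thickness [m] (illustrative values)
k, rho, c = 0.5, 1500.0, 1200.0   # conductivity, density, specific heat (assumed)
dx = L / (nx - 1)
alpha = k / (rho * c)             # thermal diffusivity
dt = 0.4 * dx**2 / alpha          # explicit stability: diffusion number < 0.5

T = np.full(nx, 300.0)
T[0] = 1500.0                     # heated front face held hot; back face at 300 K
for _ in range(20000):
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[-1] = 300.0

# At long times the profile should approach the linear steady-state solution,
# which serves as a simple analytical verification in the spirit of the paper.
T_exact = 1500.0 + (300.0 - 1500.0) * np.linspace(0.0, 1.0, nx)
print(float(np.abs(T - T_exact).max()))
```

Comparing against an analytical solution like this, plus manufactured solutions for the full equation set, is how convergence rates are demonstrated.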
Evolution of an Implementation-Ready Interprofessional Pain Assessment Reference Model
Collins, Sarah A; Bavuso, Karen; Swenson, Mary; Suchecki, Christine; Mar, Perry; Rocha, Roberto A.
2017-01-01
Standards to increase consistency of comprehensive pain assessments are important for safety, quality, and analytics activities, including meeting Joint Commission requirements and learning the best management strategies and interventions for the current prescription opioid epidemic. In this study we describe the development and validation of a Pain Assessment Reference Model ready for implementation on EHR forms and flowsheets. Our process resulted in 5 successive revisions of the reference model, which more than doubled the number of data elements to 47. The organization of the model evolved during validation sessions with panels totaling 48 subject matter experts (SMEs) to include 9 sets of data elements, with one set recommended as a minimal data set. The reference model also evolved when implemented into EHR forms and flowsheets, indicating specifications such as cascading logic that are important to inform secondary use of data. PMID:29854125
Ahmad, Rafiq; Tripathy, Nirmalya; Park, Jin-Ho; Hahn, Yoon-Bong
2015-08-04
We report a novel straightforward approach for simultaneous and highly-selective detection of multi-analytes (i.e. glucose, cholesterol and urea) using an integrated field-effect transistor (i-FET) array biosensor without any interference in each sensor response. Compared to analytically-measured data, performance of the ZnO nanorod based i-FET array biosensor is found to be highly reliable for rapid detection of multi-analytes in mice blood, and serum and blood samples of diabetic dogs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Yiying, E-mail: yiyingyan@sjtu.edu.cn; Lü, Zhiguo, E-mail: zglv@sjtu.edu.cn; Zheng, Hang, E-mail: hzheng@sjtu.edu.cn
We present a theoretical formalism for resonance fluorescence radiating from a two-level system (TLS) driven by any periodic driving and coupled to multiple reservoirs. The formalism is derived analytically based on the combination of Floquet theory and Born–Markov master equation. The formalism allows us to calculate the spectrum when the Floquet states and quasienergies are analytically or numerically solved for simple or complicated driving fields. We can systematically explore the spectral features by implementing the present formalism. To exemplify this theory, we apply the unified formalism to comprehensively study a generic model that a harmonically driven TLS is simultaneously coupled to a radiative reservoir and a dephasing reservoir. We demonstrate that the significant features of the fluorescence spectra, the driving-induced asymmetry and the dephasing-induced asymmetry, can be attributed to the violation of detailed balance condition, and explained in terms of the driving-related transition quantities between Floquet-states and their steady populations. In addition, we find the distinguished features of the fluorescence spectra under the biharmonic and multiharmonic driving fields in contrast with that of the harmonic driving case. In the case of the biharmonic driving, we find that the spectra are significantly different from the result of the RWA under the multiple resonance conditions. By the three concrete applications, we illustrate that the present formalism provides a routine tool for comprehensively exploring the fluorescence spectrum of periodically strongly driven TLSs.
Crock, J.G.; Smith, D.B.; Yager, T.J.B.; Berry, C.J.; Adams, M.G.
2011-01-01
Since late 1993, Metro Wastewater Reclamation District of Denver (Metro District), a large wastewater treatment plant in Denver, Colo., has applied Grade I, Class B biosolids to about 52,000 acres of nonirrigated farmland and rangeland near Deer Trail, Colo., U.S.A. In cooperation with the Metro District in 1993, the U.S. Geological Survey (USGS) began monitoring groundwater at part of this site. In 1999, the USGS began a more comprehensive monitoring study of the entire site to address stakeholder concerns about the potential chemical effects of biosolids applications to water, soil, and vegetation. This more comprehensive monitoring program was recently extended through the end of 2010 and is now completed. Monitoring components of the more comprehensive study include biosolids collected at the wastewater treatment plant, soil, crops, dust, alluvial and bedrock groundwater, and stream-bed sediment. Streams at the site are dry most of the year, so samples of stream-bed sediment deposited after rain were used to indicate surface-water runoff effects. This report summarizes analytical results for the biosolids samples collected at the Metro District wastewater treatment plant in Denver and analyzed for 2010. In general, the objective of each component of the study was to determine whether concentrations of nine trace elements ("priority analytes") (1) were higher than regulatory limits, (2) were increasing with time, or (3) were significantly higher in biosolids-applied areas than in a similar farmed area where biosolids were not applied (background). Previous analytical results indicate that the elemental composition of biosolids from the Denver plant was consistent during 1999-2009, and this consistency continues with the samples for 2010. Total concentrations of regulated trace elements remain consistently lower than the regulatory limits for the entire monitoring period. Concentrations of none of the priority analytes appear to have increased during the 12 years of this study.
Crock, J.G.; Smith, D.B.; Yager, T.J.B.; Berry, C.J.; Adams, M.G.
2010-01-01
Since late 1993, Metro Wastewater Reclamation District of Denver, a large wastewater treatment plant in Denver, Colo., has applied Grade I, Class B biosolids to about 52,000 acres of nonirrigated farmland and rangeland near Deer Trail, Colo., U.S.A. In cooperation with the Metro District in 1993, the U.S. Geological Survey began monitoring groundwater at part of this site. In 1999, the Survey began a more comprehensive monitoring study of the entire site to address stakeholder concerns about the potential chemical effects of biosolids applications to water, soil, and vegetation. This more comprehensive monitoring program has recently been extended through the end of 2010. Monitoring components of the more comprehensive study include biosolids collected at the wastewater treatment plant, soil, crops, dust, alluvial and bedrock groundwater, and stream-bed sediment. Streams at the site are dry most of the year, so samples of stream-bed sediment deposited after rain were used to indicate surface-water effects. This report presents analytical results for the biosolids samples collected at the Metro District wastewater treatment plant in Denver and analyzed for 2009. In general, the objective of each component of the study was to determine whether concentrations of nine trace elements ('priority analytes') (1) were higher than regulatory limits, (2) were increasing with time, or (3) were significantly higher in biosolids-applied areas than in a similar farmed area where biosolids were not applied. Previous analytical results indicate that the elemental composition of biosolids from the Denver plant was consistent during 1999-2008, and this consistency continues with the samples for 2009. Total concentrations of regulated trace elements remain consistently lower than the regulatory limits for the entire monitoring period. Concentrations of none of the priority analytes appear to have increased during the 11 years of this study.
NASA Astrophysics Data System (ADS)
Liu, Yuanyuan; Peng, Yankun; Zhang, Leilei; Dhakal, Sagar; Wang, Caiping
2014-05-01
Pork is one of the most highly consumed meats in the world. With the growing improvement in living standards, stakeholders including consumers and regulatory bodies pay more attention to the comprehensive quality of fresh pork. Different analytical laboratory-based technologies exist to determine quality attributes of pork. However, none of these technologies meets the industry's desire for rapid and non-destructive measurement. The current study used an optical instrument as a rapid and non-destructive tool to classify 24 h-aged pork longissimus dorsi samples into three kinds of meat (PSE, Normal, and DFD), on the basis of color L* and pH24. A total of 66 samples were used in the experiment. An optical system based on Vis/NIR spectral acquisition (300-1100 nm) was self-developed in the laboratory to acquire spectral signals of pork samples. A median smoothing filter (M-filter) and multiplicative scatter correction (MSC) were used to remove spectral noise and signal drift. A support vector machine (SVM) prediction model was developed to classify the samples based on their comprehensive qualities. The results showed that the classification model is highly correlated with the actual quality parameters, with classification accuracy above 85%. The system developed in this study is simple and easy to use, and with these promising results it can be used in the meat processing industry for real-time, non-destructive, and rapid detection of pork quality in the future.
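The MSC preprocessing step mentioned above can be sketched on synthetic data (the spectra below are toy signals, not the study's measurements; standard MSC regresses each spectrum on the mean spectrum and removes the fitted gain and baseline):

```python
import numpy as np

def msc(spectra):
    """Multiplicative scatter correction: regress each spectrum on the mean
    spectrum, then remove the fitted offset and slope (standard MSC)."""
    ref = spectra.mean(axis=0)
    corrected = np.empty_like(spectra)
    for i, s in enumerate(spectra):
        slope, offset = np.polyfit(ref, s, 1)   # s ~ slope*ref + offset
        corrected[i] = (s - offset) / slope
    return corrected

rng = np.random.default_rng(1)
true = np.sin(np.linspace(0.0, 3.0, 200))       # toy "spectrum"
# Simulated scatter: a random gain and baseline per sample.
gains_offsets = rng.uniform([0.5, -0.2], [1.5, 0.2], (30, 2))
raw = np.array([g * true + b for g, b in gains_offsets])
fixed = msc(raw)

# Between-sample variability should collapse after correction.
print(raw.std(axis=0).mean(), "->", fixed.std(axis=0).mean())
```

After a step like this, the classifier (SVM here) sees chemistry-related spectral shape rather than sample-to-sample scatter.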
A Research on Performance Measurement Based on Economic Valued-Added Comprehensive Scorecard
NASA Astrophysics Data System (ADS)
Chen, Qin; Zhang, Xiaomei
With the development of the economy, traditional performance measurement, which relies mainly on financial indicators, can no longer satisfy current needs. In order to make performance measurement serve business goals as well as possible, this paper proposes an Economic Value-Added Comprehensive Scorecard based on a study of the shortcomings and advantages of EVA and the BSC. We used the Analytic Hierarchy Process to build judgment matrices and solve for the weightings of the EVA Comprehensive Scorecard. In this way the factors that most influence the formation of enterprise value can be identified.
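The AHP weighting step can be sketched as follows (the pairwise-comparison values are illustrative, not the paper's: priority weights are the normalized principal eigenvector of the comparison matrix, checked with Saaty's consistency ratio):

```python
import numpy as np

# Hypothetical 3x3 pairwise-comparison matrix on the Saaty 1-9 scale,
# e.g. comparing three scorecard perspectives (values are illustrative).
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])

# Priority weights = principal eigenvector of A, normalized to sum to 1.
vals, vecs = np.linalg.eig(A)
idx = int(np.argmax(np.real(vals)))
w = np.real(vecs[:, idx])
w = w / w.sum()

# Consistency check: CI = (lambda_max - n)/(n - 1); CR = CI / RI (RI = 0.58 for n = 3).
lam = float(np.real(vals[idx]))
CI = (lam - 3.0) / (3.0 - 1.0)
CR = CI / 0.58
print(w.round(3), f"CR={CR:.3f}")
```

A CR below 0.1 is the conventional threshold for accepting the judgment matrix; otherwise the pairwise comparisons are revised.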
Zhou, Haibo; Ying, Hao
2017-09-01
A conventional controller's explicit input-output mathematical relationship, also known as its analytical structure, is always available for analysis and design of a control system. In contrast, virtually all type-2 (T2) fuzzy controllers are treated as black-box controllers in the literature in that their analytical structures are unknown, which inhibits precise and comprehensive understanding and analysis. In this regard, a long-standing fundamental issue remains unresolved: how a T2 fuzzy set's footprint of uncertainty, a key element differentiating a T2 controller from a type-1 (T1) controller, affects a controller's analytical structure. In this paper, we describe an innovative technique for deriving analytical structures of a class of typical interval T2 (IT2) TS fuzzy controllers. This technique makes it possible to analyze the analytical structures of the controllers to reveal the role of footprints of uncertainty in shaping the structures. Specifically, we have mathematically proven that under certain conditions, the larger the footprints, the more the IT2 controllers resemble linear or piecewise linear controllers. When the footprints are at their maximum, the IT2 controllers actually become linear or piecewise linear controllers. That is to say the smaller the footprints, the more nonlinear the controllers. The most nonlinear IT2 controllers are attained at zero footprints, at which point they become T1 controllers. This finding implies that sometimes if strong nonlinearity is most important and desired, one should consider using a smaller footprint or even just a T1 fuzzy controller. This paper exemplifies the importance and value of the analytical structure approach for comprehensive analysis of T2 fuzzy controllers.
[The discussion of the infiltrative model of mathematical knowledge to genetics teaching].
Liu, Jun; Luo, Pei-Gao
2011-11-01
Genetics, a core course of the biological field, is an important basic course in the curricula of many biology-related majors. Because genetics is strongly theoretical and practical as well as abstract, many students find it difficult to study. At the same time, mathematics is one of the basic courses in the curricula of natural science majors, and it has a close relationship with the establishment, development, and modification of genetics. In this paper, to establish intrinsic logical relationships, construct an integrated knowledge network, and help students improve their analytic, comprehensive, and logical abilities, we applied several mathematical models infiltratively to genetic knowledge in genetics teaching, which helped students learn and understand genetic knowledge more deeply.
Rowley, Dane A; Rogish, Miles; Alexander, Timothy; Riggs, Kevin J
2017-01-01
Effective pragmatic comprehension of language is critical for successful communication and interaction, but this ability is routinely impaired following Traumatic Brain Injury (TBI) (1,2). Individual studies have investigated the cognitive domains associated with impaired pragmatic comprehension, but there remains little understanding of the relative importance of these domains in contributing to pragmatic comprehension impairment following TBI. This paper presents a systematic meta-analytic review of the observed correlations between pragmatic comprehension and cognitive processes following TBI. Five meta-analyses were computed, which quantified the relationship between pragmatic comprehension and five key cognitive constructs (declarative memory; working memory; attention; executive functions; social cognition). Significant moderate-to-strong correlations were found between all cognitive measures and pragmatic comprehension, where declarative memory was the strongest correlate. Thus, our findings indicate that pragmatic comprehension in TBI is associated with an array of domain general cognitive processes, and as such deficits in these cognitive domains may underlie pragmatic comprehension difficulties following TBI. The clinical implications of these findings are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.
2013-07-15
Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
Particle Engulfment and Pushing By Solidifying Interfaces
NASA Technical Reports Server (NTRS)
2003-01-01
The study of particle behavior at solid/liquid interfaces (SLIs) is at the center of the Particle Engulfment and Pushing (PEP) research program. Interactions of particles with SLIs have been of interest since the 1960s, starting with geological observations, i.e., frost heaving. Ever since, this field of research has become significant to such diverse areas as metal matrix composite materials, fabrication of superconductors, and inclusion control in steels. The PEP research effort is geared towards understanding the fundamental physics of the interaction between particles and a planar SLI. Experimental work including 1-g and μ-g experiments accompanies the development of analytical and numerical models. The experimental work comprised substantial groundwork with aluminum (Al) and zinc (Zn) matrices containing spherical zirconia particles, μ-g experiments with metallic Al matrices, and the use of transparent organic metal-analogue materials. The modeling efforts have grown from the initial steady-state analytical model to dynamic models, accounting for the initial acceleration of a particle at rest by an advancing SLI. To gain a more comprehensive understanding, numerical models were developed to account for the influence of the thermal and solutal fields. Current efforts are geared towards coupling the diffusive 2-D front-tracking model with a fluid flow model to account for differences in the physics of interaction between 1-g and μ-g environments. A significant amount of this theoretical investigation has been and is being performed by co-investigators at NASA MSFC.
New insights into liquid chromatography for more eco-friendly analysis of pharmaceuticals.
Shaaban, Heba
2016-10-01
Greening the analytical methods used for analysis of pharmaceuticals has been receiving great interest aimed at eliminating or minimizing the amount of organic solvents consumed daily worldwide without loss in chromatographic performance. Traditional analytical LC techniques employed in pharmaceutical analysis consume tremendous amounts of hazardous solvents and consequently generate large amounts of waste. The monetary and ecological impact of using large amounts of solvents and waste disposal motivated the analytical community to search for alternatives to replace polluting analytical methodologies with clean ones. In this context, implementing the principles of green analytical chemistry (GAC) in analytical laboratories is highly desired. This review gives a comprehensive overview on different green LC pathways for implementing GAC principles in analytical laboratories and focuses on evaluating the greenness of LC analytical procedures. This review presents green LC approaches for eco-friendly analysis of pharmaceuticals in industrial, biological, and environmental matrices.
Chau, Hong Thi Cam; Kadokami, Kiwao; Ifuku, Tomomi; Yoshida, Yusuke
2017-12-01
A comprehensive screening method for 311 organic compounds with a wide range of physicochemical properties (log Pow −2.2 to 8.53) in water samples was developed by combining solid-phase extraction with liquid chromatography-high-resolution time-of-flight mass spectrometry. Method optimization using 128 pesticides revealed that tandem extraction with styrene-divinylbenzene polymer and activated carbon solid-phase extraction cartridges at pH 7.0 was optimal. The developed screening method was able to extract 190 model compounds with average recovery of 80.8% and average relative standard deviation (RSD) of 13.5% from spiked reagent water at 0.20 μg L-1, and 87.1% recovery and 10.8% RSD at 0.05 μg L-1. Spike-recovery testing (0.20 μg L-1) using real sewage treatment plant effluents resulted in an average recovery and average RSD for the 190 model compounds of 77.4 and 13.1%, respectively. The method was applied to the influent and effluent of five sewage treatment plants in Kitakyushu, Japan, with 29 out of 311 analytes being observed at least once. The results showed that this method can screen for a large number of chemicals with a wide range of physicochemical properties quickly and at low operational cost, something that is difficult to achieve using conventional analytical methods. This method will find utility in target screening of hazardous chemicals with a high risk in environmental waters, and for confirming the safety of water after environmental incidents.
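The recovery and RSD figures quoted above are computed in the standard way; a minimal sketch with invented replicate values (not the paper's data):

```python
import numpy as np

spiked_conc = 0.20   # spike level, ug/L (illustrative)
# Hypothetical replicate measurements of one analyte at the spike level, ug/L.
measured = np.array([0.17, 0.19, 0.15, 0.18, 0.16, 0.17])

recovery = measured.mean() / spiked_conc * 100.0        # percent recovery
rsd = measured.std(ddof=1) / measured.mean() * 100.0    # relative std. deviation, %
print(f"recovery {recovery:.1f}%, RSD {rsd:.1f}%")
```

Averaging these two statistics over all 190 model compounds gives summary figures like the 80.8% recovery and 13.5% RSD reported.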
Consistent Partial Least Squares Path Modeling via Regularization.
Jung, Sunho; Park, JaeHong
2018-01-01
Partial least squares (PLS) path modeling is a component-based approach to structural equation modeling that has been adopted in social and psychological research for its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it estimates path coefficients from consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc against its non-regularized counterpart in terms of power and accuracy. The results show that regularized PLSc is recommended for use when serious multicollinearity is present.
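The ridge remedy described in this abstract can be illustrated outside the PLSc machinery. The sketch below is not the authors' regularized PLSc algorithm; it only shows, on a deliberately collinear two-predictor design, how the ridge estimate (X'X + λI)⁻¹X'y stabilizes and shrinks coefficients relative to ordinary least squares:

```python
import random

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ridge_coefficients(X, y, lam):
    """Ridge estimate (X'X + lam*I)^-1 X'y; lam = 0 gives OLS."""
    n, p = len(X), len(X[0])
    XtX = [[sum(X[k][i] * X[k][j] for k in range(n)) + (lam if i == j else 0.0)
            for j in range(p)] for i in range(p)]
    Xty = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    return solve(XtX, Xty)

random.seed(1)
# Two nearly collinear predictors: x2 is x1 plus tiny noise.
x1 = [random.gauss(0, 1) for _ in range(200)]
x2 = [a + random.gauss(0, 0.01) for a in x1]
y = [a + random.gauss(0, 0.5) for a in x1]
X = [list(row) for row in zip(x1, x2)]

ols = ridge_coefficients(X, y, 0.0)     # unstable under collinearity
ridge = ridge_coefficients(X, y, 10.0)  # penalized, hence shrunk and stable

norm = lambda v: sum(c * c for c in v) ** 0.5
print(norm(ols), norm(ridge))  # the ridge vector always has the smaller norm
```

The shrinkage is guaranteed: for any λ > 0 the ridge solution has a strictly smaller Euclidean norm than the OLS solution (unless OLS is exactly zero), which is what restores estimation accuracy when X'X is near-singular.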
Zechmeister-Koss, Ingrid; Schnell-Inderst, Petra; Zauner, Günther
2014-04-01
An increasing number of evidence sources are relevant for populating decision analytic models. What is needed is detailed methodological advice on which type of data is to be used for what type of model parameter. We aim to identify standards in health technology assessment manuals and economic (modeling) guidelines on appropriate evidence sources and on the role different types of data play within a model. Documents were identified via a call among members of the International Network of Agencies for Health Technology Assessment and by hand search. We included documents from Europe, the United States, Canada, Australia, and New Zealand as well as transnational guidelines written in English or German. We systematically summarized in a narrative manner information on appropriate evidence sources for model parameters, their advantages and limitations, data identification methods, and data quality issues. A large variety of evidence sources for populating models are mentioned in the 28 documents included. They comprise research- and non-research-based sources. Valid and less appropriate sources are identified for informing different types of model parameters, such as clinical effect size, natural history of disease, resource use, unit costs, and health state utility values. Guidelines do not provide structured and detailed advice on this issue. The article does not include information from guidelines in languages other than English or German, and the information is not tailored to specific modeling techniques. The usability of guidelines and manuals for modeling could be improved by addressing the issue of evidence sources in a more structured and comprehensive format.
Simón, Lorena; Boldo, Elena; Ortiz, Cristina; Fernández-Cuenca, Rafael; Linares, Cristina; Medrano, María José; Pastor-Barriuso, Roberto
2017-01-01
Background Existing evidence on the effects of smoke-free policies on respiratory diseases is scarce and inconclusive. Spain enacted two consecutive smoke-free regulations: a partial ban in 2006 and a comprehensive ban in 2011. We estimated their impact on hospital admissions via emergency departments for chronic obstructive pulmonary disease (COPD) and asthma. Methods Data for COPD (ICD-9 490–492, 494–496) came from 2003–2012 hospital admission records from the fourteen largest provinces of Spain and from five provinces for asthma (ICD-9 493). We estimated changes in hospital admission rates within provinces using Poisson additive models adjusted for long-term linear trends and seasonality, day of the week, temperature, influenza, acute respiratory infections, and pollen counts (asthma models). We estimated immediate and gradual effects through segmented-linear models. The coefficients within each province were combined through random-effects multivariate meta-analytic models. Results The partial ban was associated with a strong significant pooled immediate decline in COPD-related admission rates (14.7%, 95%CI: 5.0, 23.4), sustained over time with a one-year decrease of 13.6% (95%CI: 2.9, 23.1). The association was consistent across age and sex groups but stronger in less economically developed Spanish provinces. Asthma-related admission rates decreased by 7.4% (95%CI: 0.2, 14.2) immediately after the comprehensive ban was implemented, although the one-year decrease was sustained only among men (9.9%, 95%CI: 3.9, 15.6). Conclusions The partial ban was associated with an immediate and sustained strong decline in COPD-related admissions, especially in less economically developed provinces. The comprehensive ban was related to an immediate decrease in asthma, sustained for the medium-term only among men. PMID:28542337
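The final pooling step this abstract describes (combining per-province coefficients through random-effects meta-analysis) can be sketched with the standard DerSimonian-Laird estimator. The province effects and variances below are hypothetical, not the study's data; the conversion 100·(e^β − 1) turns a pooled log rate ratio into the percent change in admission rates the authors report:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) of study-level
    effects (here, log rate ratios) with within-study variances."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    k = len(effects)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, se

# Hypothetical immediate log rate ratios from four provinces.
log_rr = [-0.18, -0.12, -0.20, -0.09]
var = [0.004, 0.006, 0.005, 0.007]
pooled, se = dersimonian_laird(log_rr, var)
pct = 100 * (math.exp(pooled) - 1)  # percent change in admission rate
print(round(pct, 1))
```

With these made-up inputs the pooled estimate corresponds to roughly a 14% decline, the same order as the COPD result reported above.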
Applications of reversible covalent chemistry in analytical sample preparation.
Siegel, David
2012-12-07
Reversible covalent chemistry (RCC) adds another dimension to commonly used sample preparation techniques like solid-phase extraction (SPE), solid-phase microextraction (SPME), molecular imprinted polymers (MIPs) or immuno-affinity cleanup (IAC): chemical selectivity. By selecting analytes according to their covalent reactivity, sample complexity can be reduced significantly, resulting in enhanced analytical performance for low-abundance target analytes. This review gives a comprehensive overview of the applications of RCC in analytical sample preparation. The major reactions covered include reversible boronic ester formation, thiol-disulfide exchange and reversible hydrazone formation, targeting analyte groups like diols (sugars, glycoproteins and glycopeptides, catechols), thiols (cysteinyl-proteins and cysteinyl-peptides) and carbonyls (carbonylated proteins, mycotoxins). Their applications range from low abundance proteomics to reversible protein/peptide labelling to antibody chromatography to quantitative and qualitative food analysis. In discussing the potential of RCC, a special focus is on the conditions and restrictions of the utilized reaction chemistry.
Light-emitting diodes for analytical chemistry.
Macka, Mirek; Piasecki, Tomasz; Dasgupta, Purnendu K
2014-01-01
Light-emitting diodes (LEDs) are playing increasingly important roles in analytical chemistry, from the final analysis stage to photoreactors for analyte conversion to actual fabrication of and incorporation in microdevices for analytical use. The extremely fast turn-on/off rates of LEDs have made possible simple approaches to fluorescence lifetime measurement. Although they are increasingly being used as detectors, their wavelength selectivity as detectors has rarely been exploited. From their first proposed use for absorbance measurement in 1970, LEDs have been used in analytical chemistry in too many ways to make a comprehensive review possible. Hence, we critically review here the more recent literature on their use in optical detection and measurement systems. Cloudy as our crystal ball may be, we express our views on the future applications of LEDs in analytical chemistry: The horizon will certainly become wider as LEDs in the deep UV with sufficient intensity become available.
Cost-Utility Analysis of Bariatric Surgery in Italy: Results of Decision-Analytic Modelling
Lucchese, Marcello; Borisenko, Oleg; Mantovani, Lorenzo Giovanni; Cortesi, Paolo Angelo; Cesana, Giancarlo; Adam, Daniel; Burdukova, Elisabeth; Lukyanov, Vasily; Di Lorenzo, Nicola
2017-01-01
Objective To evaluate the cost-effectiveness of bariatric surgery in Italy from a third-party payer perspective over a medium-term (10 years) and a long-term (lifetime) horizon. Methods A state-transition Markov model was developed, in which patients may experience surgery, post-surgery complications, diabetes mellitus type 2, cardiovascular diseases or die. Transition probabilities, costs, and utilities were obtained from the Italian and international literature. Three types of surgeries were considered: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. A base-case analysis was performed for the population, the characteristics of which were obtained from surgery candidates in Italy. Results In the base-case analysis, over 10 years, bariatric surgery led to cost increment of EUR 2,661 and generated additional 1.1 quality-adjusted life years (QALYs). Over a lifetime, surgery led to savings of EUR 8,649, additional 0.5 life years and 3.2 QALYs. Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of EUR 2,412/QALY and dominant over conservative management over a lifetime. Conclusion In a comprehensive decision analytic model, a current mix of surgical methods for bariatric surgery was cost-effective at 10 years and cost-saving over the lifetime of the Italian patient cohort considered in this analysis. PMID:28601866
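The headline 10-year figure can be reproduced directly from the abstract's own numbers: the incremental cost-effectiveness ratio is simply incremental cost divided by incremental QALYs (the abstract's EUR 2,412/QALY comes from unrounded model outputs, so the rounded inputs give a slightly different value):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

# 10-year figures from the abstract: +EUR 2,661 cost, +1.1 QALYs.
print(round(icer(2661, 1.1)))  # ~EUR 2,419/QALY (abstract reports 2,412
                               # from unrounded model outputs)
```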
NASA Astrophysics Data System (ADS)
Han, Shenchao; Yang, Yanchun; Liu, Yude; Zhang, Peng; Li, Siwei
2018-01-01
Changing to a distributed heat supply system is an effective way to reduce winter haze; thus, studies on a comprehensive index system and a scientific evaluation method for distributed heat supply projects are essential. First, the factors influencing heating modes were investigated, and a multi-dimensional index system covering economic, environmental, risk, and flexibility criteria was built, with all indexes quantified. Second, a comprehensive evaluation method based on the Analytic Hierarchy Process (AHP) was put forward to analyze the proposed index system. Finally, a case study suggested that supplying heat with electricity has great advantages and promotional value. The comprehensive index system and evaluation method presented in this paper can evaluate distributed heat supply projects effectively and provide scientific support for choosing among distributed heating projects.
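The AHP step can be sketched as follows. The 3×3 pairwise-comparison matrix below (economy vs. environment vs. risk) is hypothetical, not the paper's data; weights are computed with the common row geometric-mean approximation of the principal eigenvector, and a consistency ratio checks that the comparisons are coherent:

```python
import math

def ahp_weights(A):
    """Approximate AHP priority weights from a pairwise-comparison
    matrix using the row geometric-mean method."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(A, w):
    """CR = CI / RI; CR < 0.10 is conventionally acceptable."""
    n = len(A)
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(Aw[i] / w[i] for i in range(n)) / n  # est. principal eigenvalue
    ci = (lam_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random-consistency index
    return ci / ri

# Hypothetical judgments: economy moderately outweighs environment,
# strongly outweighs risk.
A = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
w = ahp_weights(A)
print([round(x, 3) for x in w], round(consistency_ratio(A, w), 4))
```

The weights then score each heating alternative as a weighted sum over the quantified indexes, which is how the paper's "comprehensive evaluation" ranks the candidate projects.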
Comprehensive two-dimensional liquid chromatography for polyphenol analysis in foodstuffs.
Cacciola, Francesco; Farnetti, Sara; Dugo, Paola; Marriott, Philip John; Mondello, Luigi
2017-01-01
Polyphenols are a class of plant secondary metabolites that are recently drawing a special interest because of their broad spectrum of pharmacological effects. As they are characterized by an enormous structural variability, the identification of these molecules in food samples is a difficult task, and sometimes having only a limited number of commercially available reference materials is not of great help. One-dimensional liquid chromatography is the most widely applied analytical approach for their analysis. In particular, the hyphenation of liquid chromatography to mass spectrometry has come to play an influential role by allowing relatively fast tentative identification and accurate quantification of polyphenolic compounds at trace levels in vegetable media. However, when dealing with very complex real-world food samples, a single separation system often does not provide sufficient resolving power for attaining rewarding results. Comprehensive two-dimensional liquid chromatography is a technique of great analytical impact, since it offers much higher peak capacities than separations in a single dimension. In the present review, we describe applications in the field of comprehensive two-dimensional liquid chromatography for polyphenol analysis in real-world food samples. Comprehensive two-dimensional liquid chromatography applications to nonfood matrices fall outside the scope of the current report and will not be discussed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Large-scale retrieval for medical image analytics: A comprehensive review.
Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting
2018-01-01
Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of handling such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning, and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
A comprehensive one-dimensional numerical model for solute transport in rivers
NASA Astrophysics Data System (ADS)
Barati Moghaddam, Maryam; Mazaheri, Mehdi; MohammadVali Samani, Jamal
2017-01-01
One of the mechanisms that greatly affects pollutant transport in rivers, especially in mountain streams, is transient storage zones. Their main effect is to retain pollutants temporarily and then release them gradually; transient storage zones thus indirectly influence all phenomena related to mass transport in rivers. This paper presents the TOASTS (third-order accuracy simulation of transient storage) model, which simulates 1-D pollutant transport in rivers with irregular cross-sections under unsteady flow with transient storage zones. The proposed model was verified against several analytical solutions and a 2-D hydrodynamic model. In addition, to demonstrate the model's applicability, two hypothetical examples were designed and four sets of well-established, frequently cited tracer study data were used. These cases cover different governing transport processes, cross-section types, and flow regimes. The results of the TOASTS model, in comparison with two common contaminant transport models, show better accuracy and numerical stability.
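The "retain then gradually release" behaviour of a storage zone comes from the standard transient-storage (OTIS-type) exchange terms, not from TOASTS specifically; the sketch below integrates only those exchange terms for a single closed cell (no advection or dispersion), showing that the formulation conserves mass while the channel and storage concentrations relax to a common equilibrium:

```python
def storage_exchange(c, cs, alpha, area, area_s, dt, steps):
    """Explicit-Euler integration of the transient-storage exchange terms
        dC/dt  = alpha * (Cs - C)          (main channel)
        dCs/dt = alpha * (A / As) * (C - Cs)  (storage zone)
    for one closed cell.  A*C + As*Cs is conserved exactly."""
    for _ in range(steps):
        ex = alpha * (cs - c) * dt
        c += ex
        cs -= ex * area / area_s
    return c, cs

# Tracer initially only in the main channel (C=1), storage empty.
c, cs = storage_exchange(c=1.0, cs=0.0, alpha=0.05,
                         area=2.0, area_s=0.5, dt=0.1, steps=5000)
mass = 2.0 * c + 0.5 * cs   # A*C + As*Cs, conserved total
print(round(c, 4), round(cs, 4), round(mass, 4))
```

With A = 2.0 and As = 0.5 the conserved mass is 2.0, so both concentrations relax to 2.0 / (A + As) = 0.8; in a full river model this slow release is what produces the long tails seen in tracer breakthrough curves.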
The forensic validity of visual analytics
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.
2008-01-01
The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under the rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues arise with digital data concerning admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics and to identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process.
The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and countries.
Wieser, Stefan; Axmann, Markus; Schütz, Gerhard J.
2008-01-01
We propose here an approach for the analysis of single-molecule trajectories which is based on a comprehensive comparison of an experimental data set with multiple Monte Carlo simulations of the diffusion process. It allows quantitative data analysis, particularly whenever analytical treatment of a model is infeasible. Simulations are performed on a discrete parameter space and compared with the experimental results by a nonparametric statistical test. The method provides a matrix of p-values that assess the probability for having observed the experimental data at each setting of the model parameters. We show the testing approach for three typical situations observed in the cellular plasma membrane: i), free Brownian motion of the tracer, ii), hop diffusion of the tracer in a periodic meshwork of squares, and iii), transient binding of the tracer to slowly diffusing structures. By plotting the p-value as a function of the model parameters, one can easily identify the most consistent parameter settings but also recover mutual dependencies and ambiguities which are difficult to determine by standard fitting routines. Finally, we used the test to reanalyze previous data obtained on the diffusion of the glycosylphosphatidylinositol-protein CD59 in the plasma membrane of the human T24 cell line. PMID:18805933
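The paper's core idea, scoring every point of a parameter grid with a nonparametric test against Monte Carlo simulations, can be sketched for its simplest case (free Brownian motion with one parameter, the diffusion coefficient D). The two-sample Kolmogorov-Smirnov statistic with its asymptotic p-value used below is a standard stand-in for whatever test the authors used, and the "experimental" displacements are themselves synthetic:

```python
import math, random

def ks_p_value(sample_a, sample_b):
    """Two-sample KS statistic with the asymptotic p-value approximation."""
    a, b = sorted(sample_a), sorted(sample_b)
    n, m = len(a), len(b)
    i = j = 0
    d = 0.0
    while i < n and j < m:
        if a[i] <= b[j]:
            i += 1
        else:
            j += 1
        d = max(d, abs(i / n - j / m))
    lam = (n * m / (n + m)) ** 0.5 * d
    q = 2 * sum((-1) ** (k - 1) * math.exp(-2 * k * k * lam * lam)
                for k in range(1, 101))
    return max(0.0, min(1.0, q))

def brownian_steps(d_coeff, dt, n, rng):
    """1-D displacements of free Brownian motion: Normal(0, sqrt(2 D dt))."""
    s = math.sqrt(2 * d_coeff * dt)
    return [rng.gauss(0.0, s) for _ in range(n)]

rng = random.Random(7)
experiment = brownian_steps(d_coeff=0.5, dt=0.01, n=2000, rng=rng)  # synthetic

# One Monte Carlo simulation and one p-value per candidate D on the grid;
# a high p-value marks parameter settings consistent with the data.
grid = [0.1, 0.25, 0.5, 1.0, 2.0]
p_values = {d: ks_p_value(experiment, brownian_steps(d, 0.01, 2000, rng))
            for d in grid}
best = max(p_values, key=p_values.get)
print(best)
```

Plotting `p_values` over the grid is exactly the paper's "matrix of p-values" reduced to one dimension: flat high-p plateaus reveal the parameter ambiguities that point-estimate fitting would hide.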
NASA Astrophysics Data System (ADS)
Nguyen, Duc Anh; Cat Vu, Minh; Willems, Patrick; Monbaliu, Jaak
2017-04-01
Salt intrusion is the most acute problem for irrigation water quality in coastal regions during dry seasons. The use of numerical hydrodynamic models is widespread and has become the prevailing approach to simulating the salinity distribution in an estuary. Despite its power to estimate both spatial and temporal salinity variations along the estuary, this approach also has its drawbacks: the high computational cost and the need for detailed hydrological, bathymetric, and tidal datasets limit its usability in particular case studies. In poor data environments, analytical salt intrusion models are more widely used, as they require less data and further reduce the computational effort. There are few studies, however, in which a more comprehensive comparison is made between the performance of a numerical hydrodynamic model and an analytical model. In this research, the multi-channel Ma Estuary in Vietnam is considered as a case study. Both the analytical and the hydrodynamic simulation approaches have been applied and were found capable of mimicking the longitudinal salt distribution along the estuary. The data to construct the MIKE11 model include observations provided by a network of fixed hydrological stations and cross-section measurements along the estuary. The analytical model was developed in parallel but based on information obtained from the hydrological network only (typical of a poor data environment). Note that the two convergence length parameters of this simplified model are usually extracted from topography data, including cross-sectional area and width along the estuary. Furthermore, freshwater discharge data are needed, but these are gauged further upstream, outside of the tidal region, and are unable to reflect the individual flows entering the multi-channel estuary. In order to tackle these poor data environment limitations, a new approach was needed to calibrate the two estuary geometry parameters of the parsimonious salt intrusion model.
Compared to values based on a field survey of the estuary, the calibrated cross-sectional convergence length values are in very close agreement. By assuming a linear relation between the inverses of the individual flows entering the estuary and the inverses of the sum of flows gauged further upstream, the individual flows can be assessed. Evaluation of the modeling approaches at high water slack shows that the two approaches give similar results. They explain the salinity distribution along the Ma Estuary reasonably well, with Nash-Sutcliffe efficiency values at gauging stations along the estuary of 0.50 or higher. These performances demonstrate the predictive power of the simplified salt intrusion model and of the proposed parameter/input estimation approach, even with the poorer data.
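The Nash-Sutcliffe efficiency used to score both models is simple to state; a minimal implementation follows, with a tiny made-up salinity series rather than the Ma Estuary data:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect fit; 0.0 means no better than predicting the
    observed mean; the study's threshold for 'reasonable' is 0.50."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

obs = [12.0, 10.5, 8.2, 6.9, 5.1]   # hypothetical salinity observations
sim = [11.4, 10.9, 8.0, 7.3, 4.8]   # hypothetical model output
print(round(nash_sutcliffe(obs, sim), 3))  # → 0.973
```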
Analytical tools for the analysis of β-carotene and its degradation products
Stutz, H.; Bresgen, N.; Eckl, P. M.
2015-01-01
Abstract β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered the attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Beside other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. 
The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation. PMID:25867077
Simulations of thermionic suppression during tungsten transient melting experiments
NASA Astrophysics Data System (ADS)
Komm, M.; Tolias, P.; Ratynskaia, S.; Dejarnac, R.; Gunn, J. P.; Krieger, K.; Podolnik, A.; Pitts, R. A.; Panek, R.
2017-12-01
Plasma-facing components receive enormous heat fluxes under steady state and especially during transient conditions, which can even lead to tungsten (W) melting. Under these conditions, the unimpeded thermionic current density emitted from W surfaces can exceed the incident plasma current densities by several orders of magnitude, triggering a replacement current that drives melt layer motion via the J × B force. However, in tokamaks, the thermionic current is suppressed by space-charge effects and prompt re-deposition due to gyro-rotation. We present comprehensive results of particle-in-cell modelling using the 2D3V code SPICE2 for the thermionic emissive sheath of tungsten. Simulations have been performed for various surface temperatures and selected inclinations of the magnetic field corresponding to leading-edge and sloped exposures. The surface temperature dependence of the escaping thermionic current and its limiting value are determined for various plasma parameters; for the leading-edge geometry, the results agree remarkably well with the Takamura analytical model. For the sloped geometry, the limiting value is observed to be proportional to the thermal electron current, and a simple analytical expression is proposed that accurately reproduces the numerical results.
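The "unimpeded" thermionic current density referred to above is given by the Richardson-Dushman law; a quick sketch with textbook constants follows (the space-charge and gyro-rotation suppression that the paper actually studies is not modelled here):

```python
import math

def richardson_dushman(temp_k, work_function_ev):
    """Unimpeded thermionic emission current density
    J = A T^2 exp(-W / kT), in A/m^2."""
    A = 1.20173e6        # Richardson constant, A m^-2 K^-2
    k_ev = 8.617333e-5   # Boltzmann constant, eV/K
    return A * temp_k ** 2 * math.exp(-work_function_ev / (k_ev * temp_k))

# Tungsten (work function ~4.55 eV) near its melting point vs. a cooler surface.
j_hot = richardson_dushman(3695.0, 4.55)   # ~10^7 A/m^2
j_warm = richardson_dushman(3000.0, 4.55)
print(f"{j_hot:.3e} {j_warm:.3e}")
```

The steep exponential temperature dependence is why transiently melted spots, not the bulk surface, dominate the emitted current and hence the replacement-current forces on the melt layer.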
Vargas-Rodriguez, Everardo; Guzman-Chavez, Ana D.; Cano-Contreras, Martin; Gallegos-Arellano, Eloisa; Jauregui-Vazquez, Daniel; Hernández-García, Juan C.; Estudillo-Ayala, Julian M.; Rojas-Laguna, Roberto
2015-01-01
In this work a refractive index sensor based on a combination of the non-dispersive sensing (NDS) and the Tunable Laser Spectroscopy (TLS) principles is presented. Here, in order to have one reference and one measurement channel a single-beam dual-path configuration is used for implementing the NDS principle. These channels are monitored with a couple of identical optical detectors which are correlated to calculate the overall sensor response, called here the depth of modulation. It is shown that this is useful to minimize drifting errors due to source power variations. Furthermore, a comprehensive analysis of a refractive index sensing setup, based on an intrinsic micro Fabry-Perot Interferometer (FPI) is described. Here, the changes over the FPI pattern as the exit refractive index is varied are analytically modelled by using the characteristic matrix method. Additionally, our simulated results are supported by experimental measurements which are also provided. Finally it is shown that by using this principle a simple refractive index sensor with a resolution in the order of 2.15 × 10−4 RIU can be implemented by using a couple of standard and low cost photodetectors. PMID:26501277
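The characteristic (transfer) matrix modelling mentioned in this abstract can be sketched for the simplest relevant case: the reflectance of a single-layer Fabry-Perot cavity at normal incidence as the exit refractive index varies. The cavity index, thickness, and wavelength below are illustrative values, not the authors' device parameters:

```python
import cmath

def fp_reflectance(n_cavity, thickness_um, wavelength_um, n_in, n_exit):
    """Reflectance of a single-layer Fabry-Perot cavity via the 2x2
    characteristic-matrix method (normal incidence, lossless media)."""
    delta = 2 * cmath.pi * n_cavity * thickness_um / wavelength_um
    m11 = cmath.cos(delta)
    m12 = 1j * cmath.sin(delta) / n_cavity
    m21 = 1j * n_cavity * cmath.sin(delta)
    m22 = cmath.cos(delta)
    num = n_in * m11 + n_in * n_exit * m12 - m21 - n_exit * m22
    den = n_in * m11 + n_in * n_exit * m12 + m21 + n_exit * m22
    return abs(num / den) ** 2

# Illustrative silica-like cavity probed from fiber (n_in = 1.0 assumed),
# exit medium changing from water-like 1.33 to 1.36.
r_water = fp_reflectance(1.45, 50.0, 1.55, 1.0, 1.33)
r_other = fp_reflectance(1.45, 50.0, 1.55, 1.0, 1.36)
print(r_water, r_other)  # the fringe pattern shifts with the exit index
```

As a sanity check, zero thickness reduces the matrix to the identity and the formula to the plain Fresnel reflectance between the entrance and exit media; the dependence of the fringe pattern on `n_exit` is what the sensor's depth of modulation tracks.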
Emergent Neutrality in Adaptive Asexual Evolution
Schiffels, Stephan; Szöllősi, Gergely J.; Mustonen, Ville; Lässig, Michael
2011-01-01
In nonrecombining genomes, genetic linkage can be an important evolutionary force. Linkage generates interference interactions, by which simultaneously occurring mutations affect each other’s chance of fixation. Here, we develop a comprehensive model of adaptive evolution in linked genomes, which integrates interference interactions between multiple beneficial and deleterious mutations into a unified framework. By an approximate analytical solution, we predict the fixation rates of these mutations, as well as the probabilities of beneficial and deleterious alleles at fixed genomic sites. We find that interference interactions generate a regime of emergent neutrality: all genomic sites with selection coefficients smaller in magnitude than a characteristic threshold have nearly random fixed alleles, and both beneficial and deleterious mutations at these sites have nearly neutral fixation rates. We show that this dynamic limits not only the speed of adaptation, but also a population’s degree of adaptation in its current environment. We apply the model to different scenarios: stationary adaptation in a time-dependent environment and approach to equilibrium in a fixed environment. In both cases, the analytical predictions are in good agreement with numerical simulations. Our results suggest that interference can severely compromise biological functions in an adapting population, which sets viability limits on adaptive evolution under linkage. PMID:21926305
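For background on the fixation rates discussed above, the single-locus baseline (without the linkage and interference that this paper models) is Kimura's diffusion approximation; the sketch below shows why mutations with |Ns| ≪ 1 fix at nearly the neutral rate, the regime the authors find emerging at many linked sites:

```python
import math

def fixation_probability(s, n_pop):
    """Kimura's diffusion approximation for the fixation probability of a
    single new mutant (initial frequency 1/N) with selection coefficient s
    in a haploid population of size N:
        u(s) = (1 - exp(-2s)) / (1 - exp(-2Ns)),  with u(0) = 1/N."""
    if s == 0:
        return 1.0 / n_pop
    return (1.0 - math.exp(-2.0 * s)) / (1.0 - math.exp(-2.0 * n_pop * s))

N = 1000
u_beneficial = fixation_probability(0.01, N)   # |Ns| >> 1: strongly selected
u_weak_plus = fixation_probability(1e-4, N)    # |Ns| = 0.1: effectively neutral
u_weak_minus = fixation_probability(-1e-4, N)
print(u_beneficial, u_weak_plus, u_weak_minus)
```

Both weakly selected variants fix with probability within ~10% of the neutral 1/N, whereas the strongly beneficial one fixes roughly 2s of the time; the paper's contribution is showing that interference in linked genomes pushes the effective neutrality threshold far above 1/N.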
Developing Best Practices for Detecting Change at Marine Renewable Energy Sites
NASA Astrophysics Data System (ADS)
Linder, H. L.; Horne, J. K.
2016-02-01
In compliance with the National Environmental Policy Act (NEPA), an evaluation of environmental effects is mandatory for obtaining permits for any Marine Renewable Energy (MRE) project in the US. Evaluation includes an assessment of baseline conditions and ongoing monitoring during operation to determine whether biological conditions change relative to the baseline. Currently, there are no best practices for the analysis of MRE monitoring data. We have developed an approach to evaluate and recommend analytic models used to characterize and detect change in biological monitoring data. The approach includes six steps: review current MRE monitoring practices, identify candidate models to analyze data, fit models to a baseline dataset, develop simulated scenarios of change, evaluate model fit to simulated data, and produce recommendations on the choice of analytic model for monitoring data. An empirical dataset from a proposed tidal turbine site at Admiralty Inlet, Puget Sound, Washington was used to conduct the model evaluation. Candidate models evaluated included linear regression, time series, and nonparametric models. The model-fit diagnostics root-mean-square error (RMSE) and mean absolute scaled error (MASE) were used to measure the accuracy of predicted values from each model. A power analysis was used to evaluate the ability of each model to measure and detect change from baseline conditions. As many of these models have yet to be applied in MRE monitoring studies, the results of this evaluation will generate comprehensive guidelines on the choice of model to detect change in environmental monitoring data from MRE sites. The creation of standardized guidelines for model selection enables accurate comparison of change between life stages of an MRE project, within life stages to meet real-time regulatory requirements, and comparison of environmental changes among MRE sites.
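The two model-fit diagnostics named above are straightforward to compute; a minimal sketch follows, using a tiny hypothetical monitoring series rather than the Admiralty Inlet data:

```python
def rmse(actual, predicted):
    """Root-mean-square error of model predictions."""
    n = len(actual)
    return (sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n) ** 0.5

def mase(actual, predicted):
    """Mean absolute scaled error: the model's MAE divided by the MAE of
    the in-sample naive (persistence) forecast.  MASE < 1 means the model
    beats 'tomorrow equals today'; being scale-free, it is comparable
    across different monitoring variables."""
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)
    naive = sum(abs(actual[t] - actual[t - 1])
                for t in range(1, len(actual))) / (len(actual) - 1)
    return mae / naive

obs = [4.0, 5.0, 7.0, 6.0, 8.0]    # hypothetical baseline observations
pred = [4.5, 5.2, 6.5, 6.4, 7.6]   # hypothetical model predictions
print(round(rmse(obs, pred), 3), round(mase(obs, pred), 3))  # → 0.415 0.267
```

MASE's scale-free property is what makes it suitable for ranking candidate models across heterogeneous environmental variables at a single site.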
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik
Scientists working in a particular domain often adhere to conventional data analysis and presentation methods, and this leads to familiarity with these methods over time. But does high familiarity always lead to better analytical judgment? This question is especially relevant when visualizations are used in scientific tasks, as there can be discrepancies between visualization best practices and domain conventions. However, there is little empirical evidence of the relationships between scientists' subjective impressions about familiar and unfamiliar visualizations and objective measures of their effect on scientific judgment. To address this gap and to study these factors, we focus on the climate science domain, specifically on visualizations used for comparison of model performance. We present a comprehensive user study with 47 climate scientists in which we explored the following factors: i) relationships between scientists' familiarity, their perceived levels of comfort, confidence, and accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.
Analytical evaluation of current starch methods used in the international sugar industry: Part I.
Cole, Marsha; Eggleston, Gillian; Triplett, Alexa
2017-08-01
Several analytical starch methods exist in the international sugar industry to mitigate starch-related processing challenges and assess the quality of traded end-products. These methods use iodometric chemistry, mostly potato starch standards, and similar solubilization strategies, but they had not been comprehensively compared. In this study, industrial starch methods were compared to the USDA Starch Research method using simulated raw sugars. Type of starch standard, solubilization approach, iodometric reagents, and wavelength detection all affected total starch determination in simulated raw sugars. Simulated sugars containing potato starch were more accurately detected by the industrial methods, whereas those containing corn starch, a better model for sugarcane starch, were only accurately measured by the USDA Starch Research method. Use of a potato starch standard curve over-estimated starch concentrations. Among the variables studied, the starch standard, solubilization approach, and wavelength detection most affected the sensitivity, accuracy, and precision, and limited the detection/quantification capability, of the current industry starch methods. Published by Elsevier Ltd.
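The bias mechanism described here, calibrating against one starch type and measuring another with a different iodine response, can be illustrated with a toy inverse calibration. All response factors and concentrations below are hypothetical, not values from the study:

```python
import numpy as np

# Hypothetical iodometric calibration: absorbance vs. concentration (mg/L).
potato_conc = np.array([0.0, 50.0, 100.0, 200.0])
potato_abs = potato_conc * 0.0040          # assumed potato-starch response factor

# Inverse calibration: predict concentration from absorbance.
slope, intercept = np.polyfit(potato_abs, potato_conc, 1)

# A corn-starch sample at a true 100 mg/L; if the corn starch-iodine complex
# responds more strongly (assumed factor 0.0056), the potato curve over-reads.
corn_absorbance = 100.0 * 0.0056
estimated = slope * corn_absorbance + intercept
print(f"true 100 mg/L read as {estimated:.0f} mg/L")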
Book review: Physics of tsunamis
Geist, Eric L.
2017-01-01
“Physics of Tsunamis”, second edition, provides a comprehensive analytical treatment of the hydrodynamics associated with the tsunami generation process. The book consists of seven chapters covering 388 pages. Because the subject matter within each chapter is distinct, an abstract appears at the beginning and references appear at the end of each chapter, rather than at the end of the book. Various topics of tsunami physics are examined largely from a theoretical perspective, although there is little information on how the physical descriptions are applied in numerical models. “Physics of Tsunamis”, by B. W. Levin and M. A. Nosov, Second Edition, Springer, 2016; ISBN-10: 331933106X, ISBN-13: 978-3319331065
ERIC Educational Resources Information Center
Breyer, F. Jay; Rupp, André A.; Bridgeman, Brent
2017-01-01
In this research report, we present an empirical argument for the use of a contributory scoring approach for the 2-essay writing assessment of the analytical writing section of the "GRE"® test in which human and machine scores are combined for score creation at the task and section levels. The approach was designed to replace a currently…
ERIC Educational Resources Information Center
Johnston, Rhona S.; McGeown, Sarah; Watson, Joyce E.
2012-01-01
A comparison was made of 10-year-old boys and girls who had learnt to read by analytic or synthetic phonics methods as part of their early literacy programmes. The boys taught by the synthetic phonics method had better word reading than the girls in their classes, and their spelling and reading comprehension was as good. In contrast, with analytic…
Army Digital Test Requirements Analytic Report.
1983-07-01
Research and Development Technical Report CECOM800520-1, “Army Digital Test Requirements Analytic Report.” Table of contents (excerpt): 6.0 Data Review; 6.1 Comprehensive Review; 6.2 Review Conclusions; 7.0 Special Research; 8.0 ... Study activities: identification of information sources, data collection, data organization, data review, special research, technology analysis, and test ...
Speciated Elemental and Isotopic Characterization of Atmospheric Aerosols - Recent Advances
NASA Astrophysics Data System (ADS)
Shafer, M.; Majestic, B.; Schauer, J.
2007-12-01
Detailed elemental, isotopic, and chemical speciation analysis of aerosol particulate matter (PM) can provide valuable information on PM sources, atmospheric processing, and climate forcing. Certain PM sources may best be resolved using trace metal signatures, and elemental and isotopic fingerprints can supplement and enhance molecular marker analysis of PM for source apportionment modeling. In the search for toxicologically relevant components of PM, health studies are increasingly demanding more comprehensive characterization schemes. It is also clear that total metal analysis is at best a poor surrogate for the bioavailable component, and analytical techniques that address the labile component or specific chemical species are needed. Recent sampling and analytical developments advanced by the project team have facilitated comprehensive characterization of even very small masses of atmospheric PM. Historically, this level of detail was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. These advances have enabled the coupling of advanced chemical characterization to vital field sampling approaches that typically supply only very limited PM mass, e.g., (1) particle size-resolved sampling; (2) personal sampler collections; and (3) fine temporal scale sampling. The analytical tools that our research group is applying include: (1) sector field (high-resolution, HR) ICP-MS, (2) liquid waveguide long-path spectrophotometry (LWG-LPS), and (3) synchrotron x-ray absorption spectroscopy (sXAS). When coupled with an efficient and validated solubilization method, the HR-ICP-MS can provide quantitative elemental information on over 50 elements in microgram quantities of PM. The high mass resolution and enhanced signal-to-noise of HR-ICP-MS significantly advance data quality and quantity over that possible with traditional quadrupole ICP-MS.
The LWG-LPS system enables an assessment of the soluble/labile components of PM, while simultaneously providing critical oxidation state speciation data. Importantly, the LWG- LPS can be deployed in a semi-real-time configuration to probe fine temporal scale variations in atmospheric processing or sources of PM. The sXAS is providing complementary oxidation state speciation of bulk PM. Using examples from our research; we will illustrate the capabilities and applications of these new methods.
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions.
The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
NASA Technical Reports Server (NTRS)
Elrod, David; Christensen, Eric; Brown, Andrew
2011-01-01
At NASA/MSFC, Structural Dynamics personnel continue to perform advanced analysis for the turbomachinery in the J2X Rocket Engine, which is under consideration for the new Space Launch System. One of the most challenging analyses in the program is predicting turbine blade structural capability. Resonance was predicted by modal analysis, so comprehensive forced response analyses using high fidelity cyclic symmetric finite element models were initiated as required. Analysis methodologies up to this point have assumed the flow field could be fully described by a sector, so the loading on every blade would be identical as it travelled through it. However, in the J2X the CFD flow field varied over the 360 deg of a revolution because of the flow speeds and tortuous axial path. MSFC therefore developed a complex procedure using Nastran DMAPs and Matlab scripts to apply this circumferentially varying loading onto the cyclically symmetric structural models to produce accurate dynamic stresses for every blade on the disk. This procedure is coupled with static, spin, and thermal loading to produce high cycle fatigue safety factors, resulting in much more accurate analytical assessments of the blades.
Environmental Sampling & Analytical Methods (ESAM) Program - Home
ESAM is a comprehensive program to facilitate a coordinated response to a chemical, radiochemical, biotoxin or pathogen contamination incident focusing on sample collection, processing, and analysis to provide quality results to the field.
[Comprehensive evaluation of eco-tourism resources in Yichun forest region of Northeast China].
Huang, Maozhu; Hu, Haiqing; Zhang, Jie; Chen, Lijun
2006-11-01
By using the analytic hierarchy process (AHP) and the Delphi method, a total of 30 representative evaluation factors in the aspects of tourism resource quantity, environmental quality, tourism conditions, and tourism functions were chosen to build a comprehensive quantitative evaluation model for the eco-tourism resources of the Yichun forest region in Northeast China. The results showed that in the Yichun forest region, the natural eco-tourism resources were superior to the humanity resources. On the regional distribution of favorable-level eco-tourism resource quantity, 4 sites were very prominent, i.e., north (Jiayin) -center (Yichun) -east (Jinshantun) -south (Tieli). As for the distribution of eco-tourism resource types, it was basically in the sequence of north (Jiayin, Tangwang River, Wuying) -center (Yichun, Shangganling) -east (Jinshantun, Meixi) -south (Tieli, Dailing). Based on the above analyses, the Yichun forest region could be divided into four tourism areas, i.e., the south, the east, the central, and the north. Aimed at the special features of each area, the initial development directions were introduced.
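The AHP step described above reduces, at each level of the hierarchy, to extracting priority weights from a pairwise-comparison matrix and checking its consistency. A sketch with an assumed 3x3 Saaty-scale matrix (the study's actual 30-factor matrices are not reproduced here):

```python
import numpy as np

# Hypothetical pairwise-comparison matrix (Saaty 1-9 scale) for three
# criteria, e.g. resource quantity, environmental quality, tourism conditions.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue lambda_max
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
cr = ci / 0.58                           # random index RI = 0.58 for n = 3
print(np.round(w, 3), round(cr, 3))      # CR < 0.1 means acceptable consistency
```

For this matrix the weights come out roughly (0.64, 0.26, 0.10) with CR well under 0.1, so the judgments would be accepted; an inconsistent matrix would be sent back to the experts for revision.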
Risk evaluation of highway engineering project based on the fuzzy-AHP
NASA Astrophysics Data System (ADS)
Yang, Qian; Wei, Yajun
2011-10-01
Engineering projects are social activities that integrate technology, economy, management, and organization. There are uncertainties in each aspect of an engineering project, and risk management urgently needs strengthening. Based on an analysis of the characteristics of highway engineering and a study of the basic theory of risk evaluation, the paper built an index system for highway project risk evaluation. In addition, based on fuzzy mathematics principles, the analytic hierarchy process (AHP) was applied, and a comprehensive fuzzy-AHP appraisal model was established for the risk evaluation of an expressway concessionary project. The validity and practicability of the risk evaluation were verified by applying the model to an actual project.
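The fuzzy-AHP synthesis this abstract describes combines AHP-derived factor weights with a fuzzy membership matrix from expert scoring. A minimal sketch with assumed weights and memberships (none of the numbers come from the paper):

```python
import numpy as np

# Three risk factors, four appraisal grades (low ... high risk).
W = np.array([0.5, 0.3, 0.2])          # AHP-derived factor weights (assumed)
R = np.array([[0.1, 0.4, 0.4, 0.1],    # membership of factor 1 in each grade
              [0.3, 0.4, 0.2, 0.1],    # rows from expert scoring (assumed)
              [0.2, 0.5, 0.2, 0.1]])

B = W @ R                              # weighted fuzzy synthesis B = W * R
B /= B.sum()                           # normalize memberships
grade = int(np.argmax(B))              # maximum-membership principle
print(np.round(B, 3), grade)
```

Here the synthesized vector peaks at the second grade, so the project would be rated in that risk class; the weighted-average operator shown is one of several standard fuzzy composition choices.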
Theoretical studies of the physics of the solar atmosphere
NASA Technical Reports Server (NTRS)
Hollweg, Joseph V.
1992-01-01
Significant advances in our theoretical basis for understanding several physical processes related to dynamical phenomena on the sun were achieved. We have advanced a new model for spicules and fibrils. We have provided a simple physical view of resonance absorption of MHD surface waves; this allowed an approximate mathematical procedure for obtaining a wealth of new analytical results which we applied to coronal heating and p-mode absorption at magnetic regions. We provided the first comprehensive models for the heating and acceleration of the transition region, corona, and solar wind. We provided a new view of viscosity under coronal conditions. We provided new insights into Alfven wave propagation in the solar atmosphere. And recently we have begun work in a new direction: parametric instabilities of Alfven waves.
Research on Collection System Optimal Design of Wind Farm with Obstacles
NASA Astrophysics Data System (ADS)
Huang, W.; Yan, B. Y.; Tan, R. S.; Liu, L. F.
2017-05-01
In the optimal design of an offshore wind farm collection system, the factors to consider include not only the reasonable configuration of cables and switches but also the influence of obstacles on the topology design. This paper presents a concrete topology optimization algorithm that accounts for obstacles. The minimum-area bounding rectangle of each obstacle is obtained using a minimum-area bounding-box method. An optimization algorithm combining the advantages of the Dijkstra and Prim algorithms is then used to obtain an obstacle-avoiding path plan. Finally, a fuzzy comprehensive evaluation model based on the analytic hierarchy process is constructed to compare the performance of the different topologies. Case studies demonstrate the feasibility of the proposed algorithm and model.
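The Prim/Dijkstra combination can be sketched as a minimum spanning tree over cable costs, where in practice each edge weight would be the obstacle-avoiding path length returned by a Dijkstra search rather than the straight-line distance. The graph below is a toy example, not the paper's case study:

```python
import heapq

def prim_mst(n, edges):
    """Minimum spanning tree by Prim's algorithm. `edges` maps (u, v) to a
    cable cost; in the wind-farm setting each cost would come from an
    obstacle-avoiding Dijkstra path, not Euclidean distance."""
    adj = {u: [] for u in range(n)}
    for (u, v), w in edges.items():
        adj[u].append((w, u, v))
        adj[v].append((w, v, u))
    seen, tree, total = {0}, [], 0.0    # grow the tree from node 0
    heap = list(adj[0])
    heapq.heapify(heap)
    while heap and len(seen) < n:
        w, u, v = heapq.heappop(heap)   # cheapest edge leaving the tree
        if v in seen:
            continue
        seen.add(v)
        tree.append((u, v))
        total += w
        for e in adj[v]:
            if e[2] not in seen:
                heapq.heappush(heap, e)
    return tree, total

# Toy 4-node layout: node 0 is the substation, costs are assumed.
tree, total = prim_mst(4, {(0, 1): 4, (0, 2): 3, (1, 2): 2, (1, 3): 5, (2, 3): 7})
print(tree, total)
```

With these costs the tree connects 0-2, 2-1, 1-3 for a total cable cost of 10, skipping the expensive direct runs.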
Thermal shock fracture in cross-ply fibre-reinforced ceramic-matrix composites
NASA Astrophysics Data System (ADS)
Kastritseas, C.; Smith, P. A.; Yeomans, J. A.
2010-11-01
The onset of matrix cracking due to thermal shock in a range of simple and multi-layer cross-ply laminates comprising a calcium aluminosilicate (CAS) matrix reinforced with Nicalon® fibres is investigated analytically. A comprehensive stress analysis under conditions of thermal shock, ignoring transient effects, is performed and fracture criteria based on either a recently derived model for the thermal shock resistance of unidirectional Nicalon®/glass ceramic-matrix composites or fracture mechanics considerations are formulated. The effect of material thickness on the apparent thermal shock resistance is also modelled. Comparison with experimental results reveals that the accuracy of the predictions is satisfactory and the reasons for some discrepancies are discussed. In addition, a theoretical argument based on thermal shock theory is formulated to explain the observed cracking patterns.
Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette
2018-05-10
Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.
Coupled-channel model for K ¯ N scattering in the resonant region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernández-Ramírez, Cesar; Danilkin, Igor V.; Manley, D. Mark
2016-02-18
Here, we present a unitary multichannel model for $\bar{K}N$ scattering in the resonance region. It has the correct analytical properties for the amplitudes once they are extended to the complex-$s$ plane, and the partial waves have the right threshold behavior. In order to determine the parameters of the model, we have fitted single-energy partial waves up to J = 7/2 and up to 2.15 GeV of energy in the center-of-mass reference frame, obtaining the poles of the Λ* and Σ* resonances, which are compared to previous analyses. Furthermore, we provide the most comprehensive picture of the S = -1 hyperon spectrum to date. Important differences are found between the available analyses, making the gathering of further experimental information on $\bar{K}N$ scattering mandatory to make progress in the assessment of the hyperon spectrum.
Dynamical processes and epidemic threshold on nonlinear coupled multiplex networks
NASA Astrophysics Data System (ADS)
Gao, Chao; Tang, Shaoting; Li, Weihua; Yang, Yaqian; Zheng, Zhiming
2018-04-01
Recently, the interplay between epidemic spreading and awareness diffusion has aroused the interest of many researchers, who have studied models mainly based on linear coupling relations between the information and epidemic layers. However, in real-world networks the relation between the two layers may be closely correlated with the properties of individual nodes and exhibit nonlinear dynamical features. Here we propose a nonlinear coupled information-epidemic model (I-E model) and present a comprehensive analysis in a more generalized scenario where the upload rate differs from node to node, the deletion rate varies between susceptible and infected states, and the infection rate changes between unaware and aware states. In particular, we develop a theoretical framework of the intra- and inter-layer dynamical processes with a microscopic Markov chain approach (MMCA), and derive an analytic epidemic threshold. Our results suggest that the change of upload and deletion rate has little effect on the diffusion dynamics in the epidemic layer.
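For the single-layer SIS baseline underlying such MMCA analyses, the epidemic threshold takes the spectral form beta_c = mu / Lambda_max(A); in the coupled I-E model the adjacency entries are effectively modulated by node awareness, but the computation is analogous. A sketch on a toy contact network, with all rates assumed:

```python
import numpy as np

# Toy contact network (assumed): edges 0-1, 0-2, 1-2, 1-3, 2-3.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], float)
mu = 0.5                                    # recovery rate (assumed)
lam_max = float(np.linalg.eigvalsh(A)[-1])  # largest eigenvalue of symmetric A
beta_c = mu / lam_max                       # spectral epidemic threshold
print(round(beta_c, 3))
```

Above beta_c the infection persists endemically in this baseline; the MMCA machinery in the paper generalizes this by iterating node-level infection probabilities across both layers instead of using a single global adjacency.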
2010-04-01
analytical community. 5.1 Towards a Common Understanding of CD&E and CD&E Project Management. Recent developments within NATO have contributed to the ... For project management purposes it is useful to distinguish four phases [P 21]: (a) preparation, initiation and structuring; (b) concept development planning; ... examined in more detail below. While the NATO CD&E policy provides a benchmark for a comprehensive, disciplined management of CD&E projects, it may ...
NASA Orbital Debris Engineering Model ORDEM2008 (Beta Version)
NASA Technical Reports Server (NTRS)
Stansbery, Eugene G.; Krisko, Paula H.
2009-01-01
This is an interim document intended to accompany the beta-release of the ORDEM2008 model. As such it provides the user with a guide for its use, a list of its capabilities, a brief summary of model development, and appendices included to educate the user as to typical runtimes for different orbit configurations. More detailed documentation will be delivered with the final product. ORDEM2008 supersedes NASA's previous model - ORDEM2000. The availability of new sensor and in situ data, the re-analysis of older data, and the development of new analytical techniques, has enabled the construction of this more comprehensive and sophisticated model. Integrated with the software is an upgraded graphical user interface (GUI), which uses project-oriented organization and provides the user with graphical representations of numerous output data products. These range from the conventional average debris size vs. flux magnitude for chosen analysis orbits, to the more complex color-contoured two-dimensional (2-D) directional flux diagrams in terms of local spacecraft pitch and yaw.
Nonlocal integral elasticity in nanostructures, mixtures, boundary effects and limit behaviours
NASA Astrophysics Data System (ADS)
Romano, Giovanni; Luciano, Raimondo; Barretta, Raffaele; Diaco, Marina
2018-02-01
Nonlocal elasticity is addressed in terms of integral convolutions for structural models of any dimension, that is bars, beams, plates, shells and 3D continua. A characteristic feature of the treatment is the recourse to the theory of generalised functions (distributions) to provide a unified presentation of previous proposals. Local-nonlocal mixtures are also included in the analysis. Boundary effects of convolutions on bounded domains are investigated, and analytical evaluations are provided in the general case. Methods for compensation of boundary effects are compared and discussed with a comprehensive treatment. Estimates of limit behaviours for extreme values of the nonlocal parameter are shown to give helpful information on model properties, allowing for new comments on previous proposals. Strain-driven and stress-driven models are shown to emerge by swapping the mechanical role of input and output fields in the constitutive convolution, with stress-driven elastic model leading to well-posed problems. Computations of stress-driven nonlocal one-dimensional elastic models are performed to exemplify the theoretical results.
Stylized facts in social networks: Community-based static modeling
NASA Astrophysics Data System (ADS)
Jo, Hang-Hyun; Murase, Yohsuke; Török, János; Kertész, János; Kaski, Kimmo
2018-06-01
Past analyses of social-network datasets have yielded empirical findings about many aspects of human society, commonly featured as stylized facts of social networks, such as broad distributions of network quantities, existence of communities, assortative mixing, and intensity-topology correlations. Since the understanding of the structure of these complex social networks is far from complete, deeper insight into human society requires more comprehensive datasets and modeling of the stylized facts. Although the existing dynamical and static models can generate some stylized facts, here we take an alternative approach by devising a community-based static model with heterogeneous community sizes and larger communities having smaller link density and weight. With these few assumptions we are able to generate realistic social networks that show most stylized facts for a wide range of parameters, as demonstrated numerically and analytically. Since our community-based static model is simple to implement and easily scalable, it can be used as a reference system, benchmark, or testbed for further applications.
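The intra-community part of such a community-based static model can be sketched in a few lines: communities of heterogeneous sizes, with link probability decreasing in community size so that larger communities are sparser. The particular density scaling and the omission of inter-community links below are simplifications for illustration, not the authors' exact specification:

```python
import random

def community_network(sizes, seed=0):
    """Toy community-based static model: nodes are partitioned into
    communities of the given sizes, and within each community links are
    placed independently with probability p ~ 1/size, so larger
    communities have smaller link density. Inter-community links omitted."""
    rng = random.Random(seed)
    edges, start = set(), 0
    for s in sizes:
        p = min(1.0, 2.0 / s)          # assumed density scaling
        members = range(start, start + s)
        for i in members:
            for j in members:
                if i < j and rng.random() < p:
                    edges.add((i, j))
        start += s
    return edges

net = community_network([4, 8, 16])    # heterogeneous community sizes
print(len(net))
```

Because p falls off with size while the number of node pairs grows quadratically, the expected number of intra-community links grows only linearly in community size, reproducing the sparser-large-communities feature the abstract describes.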
NASA Astrophysics Data System (ADS)
Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.
2017-12-01
Increasing model resolution in the development of comprehensive Earth System Models is rapidly producing very large volumes of climate simulation output that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments.
At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
Liang, Ruoyu; Song, Shuai; Shi, Yajing; Shi, Yajuan; Lu, Yonglong; Zheng, Xiaoqi; Xu, Xiangbo; Wang, Yurong; Han, Xuesong
2017-12-15
The redundancy or deficiency of selenium in soils can cause adverse effects on crops and even threaten human health, so it is necessary to assess selenium resources with a rigorous scientific appraisal. Previous studies of selenium resource assessment were usually carried out using a single-index evaluation. A multi-index evaluation method (the analytic hierarchy process) was used in this study to establish a comprehensive assessment system based on consideration of selenium content, soil nutrients, and soil environmental quality. The criteria for the comprehensive assessment system were classified by summing critical values in the standards with weights, and a Geographical Information System was used to map the regional distribution of the assessment results. Boshan, a representative region for developing selenium-rich agriculture, was taken as a case area and classified into Zones I-V, which indicate priority areas for developing selenium-rich agriculture. Most parts of the north and midlands of Boshan were relatively suitable for development of selenium-rich agriculture. Soils in the southern parts were contaminated by Cd, PAHs, HCHs, and DDTs, and farming there was prohibited. This study was expected to provide a basis for developing selenium-rich agriculture and an example of comprehensive evaluation of relevant resources in a region. Copyright © 2017 Elsevier B.V. All rights reserved.
Van Ham, Rita; Van Vaeck, Luc; Adams, Freddy C; Adriaens, Annemie
2004-05-01
The analytical use of mass spectra from static secondary ion mass spectrometry for the molecular identification of inorganic analytes in real-life surface layers and microobjects requires empirical insight into the signals to be expected from a given compound. A comprehensive database comprising over 50 salts has been assembled to complement prior data on oxides. The present study allows the systematic trends in the relationship between the detected signals and the molecular composition of the analyte to be delineated. The mass spectra provide diagnostic information by means of atomic ions, structural fragments, molecular ions, and adduct ions of the analyte neutrals. The prediction of mass spectra from a given analyte must account for the charge state of the ions in the salt, the formation of oxide-type neutrals from oxy salts, and the occurrence of oxidation-reduction processes.
Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd
2015-01-01
The ability to quantify levels of target analytes in biological samples accurately and precisely, in biomonitoring, involves the use of highly sensitive and selective instrumentation such as tandem mass spectrometers and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585
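One common way such matrix effects are quantified is from the ratio of calibration slopes measured in matrix extract versus neat solvent; conventions vary between laboratories, so the formula below is one illustrative choice rather than the method of this review:

```python
def matrix_effect_percent(slope_matrix, slope_solvent):
    """Matrix effect estimated from calibration slopes: negative values
    indicate ion suppression, positive values signal enhancement.
    One common convention; exact definitions differ between labs."""
    return (slope_matrix / slope_solvent - 1.0) * 100.0

# A matrix-extract slope 20 % below the solvent slope -> 20 % suppression.
print(round(matrix_effect_percent(0.8, 1.0), 1))  # -20.0
```

Values near zero suggest negligible matrix effects, while large magnitudes call for the mitigation strategies the review discusses, such as matrix-matched calibration or isotope-labeled internal standards.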
An indoor air quality evaluation in a residential retrofit project using spray polyurethane foam.
Tian, Shen; Ecoff, Scott; Sebroski, John; Miller, Jason; Rickenbacker, Harold; Bilec, Melissa
2018-05-01
Understanding of indoor air quality (IAQ) during and after spray polyurethane foam (SPF) application is essential to protect the health of both workers and building occupants. Previous efforts such as field monitoring, micro-chamber/spray booth emission studies, and fate/transport modeling have been conducted to understand the chemical exposure of SPF and guide risk mitigation strategies. However, each type of research has its limitations and can only reveal partial information on the relationship between SPF and IAQ. A comprehensive study is needed to integrate the experimental design and analytical testing methods of the field/chamber studies with the mathematical tools employed in the modeling studies. This study aims to bridge this gap and provide a more comprehensive understanding of the impact of SPF on IAQ. The field sampling plan of this research evaluates the airborne concentrations of methylene diphenyl diisocyanate (MDI), formaldehyde, acetaldehyde, propionaldehyde, tris(1-chloro-2-propyl) phosphate (TCPP), trans-1-chloro-3,3,3-trifluoropropene (Solstice™), and airborne particles. Modifications to existing MDI sampling and analytical methods were made to improve the level of quantification. In addition, key fate and transport modeling input parameters such as air changes per hour and airborne particle size distribution were measured. More importantly, TCPP accumulation onto materials was evaluated, which is important for studying the fate and transport of semi-volatile organic compounds. The IAQ results showed that after spray application was completed in the entire building, airborne concentrations decreased for all chemicals monitored. However, it is our recommendation that during SPF application, no one should return to the application site without proper personal protection equipment as long as there are active spray activities in the building.
The comparison between this field study and a recent chamber study proved surface sorption and particle deposition is an important factor in determining the fate of airborne TCPP. The study also suggests the need for further evaluation by employing mathematical models, proving the data generated in this work as informative to industry and the broader scientific community.
Popplow, Marcus
2015-12-01
Recent critical approaches to what has conventionally been described as "scientific" and "technical" knowledge in early modern Europe have provided a wealth of new insights. So far, the various analytical concepts suggested by these studies have not yet been comprehensively discussed. The present essay argues that such comprehensive approaches might prove of special value for long-term and cross-cultural reflections on technology-related knowledge. As heuristic tools, the notions of "formalization" and "interaction" are proposed as part of alternative narratives to those highlighting the emergence of "science" as the most relevant development for technology-related knowledge in early modern Europe.
Challenges to Applying a Metamodel for Groundwater Flow Beyond Underlying Numerical Model Boundaries
NASA Astrophysics Data System (ADS)
Reeves, H. W.; Fienen, M. N.; Feinstein, D.
2015-12-01
Metamodels of environmental behavior offer opportunities for decision support, adaptive management, and increased stakeholder engagement through participatory modeling and model exploration. Metamodels are derived from calibrated, computationally demanding numerical models. They may potentially be applied to non-modeled areas to provide screening or preliminary analysis tools for areas that do not yet have the benefit of more comprehensive study. In this decision-support mode, they may fulfill a role often accomplished by application of analytical solutions. The major challenge in transferring a metamodel to a non-modeled area is how to quantify the spatial data in the new area of interest in such a way that they are consistent with the data used to derive the metamodel. Tests based on transferring a metamodel derived from a numerical groundwater-flow model of the Lake Michigan Basin to other glacial settings across the northern U.S. show that data from the new settings must be quantified at a spatial scale consistent with that of the numerical model to adequately represent different settings. Careful GIS analysis of the numerical model, metamodel, and new area of interest is required for successful transfer of results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fields, Laura; Genser, Krzysztof; Hatcher, Robert
Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g., how sensitive Geant4 predictions are to variations of the model parameters, what uncertainties are associated with a particular tune of a Geant4 physics model or a group of models, or how to consistently derive guidance for Geant4 model development and improvement from the wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform collective analysis of multiple variants of the resulting physics observables of interest and comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g., a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with Geant4 key hadronic models.
Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh
2014-01-01
Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective, balanced set of performance measures and key performance indicators (KPIs) is a major challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards using the analytic hierarchy process (AHP). The study was carried out in 2 phases: constructing ED performance measures based on balanced scorecard perspectives and incorporating them into an AHP framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as other top KPIs. Measures of care effectiveness and care safety were placed as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can serve as a reference model for developing KPIs in various performance-related areas based on a consistent and fair approach. Dashboards designed on such a balanced set of KPIs will help to establish comprehensive performance measurement and fair benchmarks and comparisons.
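The core AHP computation the abstract refers to, deriving criterion weights from a pairwise comparison matrix and checking judgment consistency, can be sketched in a few lines. This is an illustrative sketch only, not the study's implementation; the example matrix and the criterion ordering are hypothetical.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise comparison matrix,
    via the principal eigenvector (power iteration), plus Saaty's
    consistency ratio (CR < 0.1 is conventionally acceptable)."""
    M = np.asarray(pairwise, dtype=float)
    n = M.shape[0]
    w = np.ones(n) / n
    for _ in range(200):                   # power iteration
        w_next = M @ w
        w_next /= w_next.sum()
        if np.allclose(w, w_next, atol=1e-12):
            break
        w = w_next
    lam_max = float((M @ w / w).mean())    # estimate of the principal eigenvalue
    ci = (lam_max - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index for small n
    return w, ci / ri

# Hypothetical 3-criterion comparison (e.g., timeliness vs. cost vs. safety)
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w, cr = ahp_weights(M)
```

In a KPI-selection setting such as the one described, each candidate measure would be scored against each criterion the same way, and the weighted scores ranked to pick the final KPI set.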
Watson, Nathanial E; Prebihalo, Sarah E; Synovec, Robert E
2017-08-29
Comprehensive three-dimensional gas chromatography with time-of-flight mass spectrometry (GC3-TOFMS) creates an opportunity to explore a new paradigm in chemometric analysis. Using this newly described instrument and the well-understood parallel factor analysis (PARAFAC) model, we present one option for utilizing the novel GC3-TOFMS data structure. We present a method that builds upon previous work in both GC3 and targeted analysis using PARAFAC to simplify some of the implementation challenges previously discovered. Conceptualizing the GC3-TOFMS instead as a one-dimensional gas chromatograph with GC × GC-TOFMS detection, we allow the instrument to create the PARAFAC target window natively. Each first-dimension modulation thus creates a full GC × GC-TOFMS chromatogram fully amenable to PARAFAC. A simple mixture of 115 compounds and a diesel sample are interrogated through this methodology. All test analyte targets are successfully identified in both mixtures. In addition, mass spectral matching of the PARAFAC loadings to library spectra yielded match values greater than 900 in 40 of 42 test analyte cases. Twenty-nine of these cases produced match values greater than 950. Copyright © 2017 Elsevier B.V. All rights reserved.
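As background, PARAFAC fits a trilinear (CP) decomposition to a three-way data array, typically by alternating least squares. A minimal numpy-only sketch of that idea (not the chemometric software used in the study, and without the non-negativity constraints usually applied to chromatographic data) might look like:

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Khatri-Rao product of B (J x R) and C (K x R) -> (JK x R)."""
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, B.shape[1])

def unfold(X, mode):
    """Mode-n matricization of a 3-way array."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def parafac(X, rank, n_iter=100, seed=0):
    """Minimal CP-ALS: X[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r]."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in X.shape)
    for _ in range(n_iter):
        A = unfold(X, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(X, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(X, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

In the chromatographic setting, the three modes would correspond to, e.g., the two retention-time axes and m/z; dedicated packages (e.g., tensorly) add constraints and convergence diagnostics omitted here.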
Hoggard, Jamin C; Wahl, Jon H; Synovec, Robert E; Mong, Gary M; Fraga, Carlos G
2010-01-15
In this report we present the feasibility of using analytical and chemometric methodologies to reveal and exploit the chemical impurity profiles from commercial dimethyl methylphosphonate (DMMP) samples to illustrate the type of forensic information that may be obtained from chemical-attack evidence. Using DMMP as a model compound of a toxicant that may be used in a chemical attack, we used comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry (GC x GC/TOF-MS) to detect and identify trace organic impurities in six samples of commercially acquired DMMP. The GC x GC/TOF-MS data was analyzed to produce impurity profiles for all six DMMP samples using 29 analyte impurities. The use of PARAFAC for the mathematical resolution of overlapped GC x GC peaks ensured clean spectra for the identification of many of the detected analytes by spectral library matching. The use of statistical pairwise comparison revealed that there were trace impurities that were quantitatively similar and different among five of the six DMMP samples. Two of the DMMP samples were revealed to have identical impurity profiles by this approach. The use of nonnegative matrix factorization indicated that there were five distinct DMMP sample types as illustrated by the clustering of the multiple DMMP analyses into five distinct clusters in the scores plots. The two indistinguishable DMMP samples were confirmed by their chemical supplier to be from the same bulk source. Sample information from the other chemical suppliers supported the idea that the other four DMMP samples were likely from different bulk sources. These results demonstrate that the matching of synthesized products from the same source is possible using impurity profiling. In addition, the identified impurities common to all six DMMP samples provide strong evidence that basic route information can be obtained from impurity profiles. 
Finally, impurities that may be unique to the sole bulk manufacturer of DMMP were found in some of the DMMP samples.
NASA Astrophysics Data System (ADS)
Takeda, M.; Nakajima, H.; Zhang, M.; Hiratsuka, T.
2008-04-01
To obtain reliable diffusion parameters for diffusion testing, multiple experiments should not only be cross-checked but the internal consistency of each experiment should also be verified. In the through- and in-diffusion tests with solution reservoirs, test interpretation of different phases often makes use of simplified analytical solutions. This study explores the feasibility of steady, quasi-steady, equilibrium and transient-state analyses using simplified analytical solutions with respect to (i) valid conditions for each analytical solution, (ii) potential error, and (iii) experimental time. For increased generality, a series of numerical analyses are performed using unified dimensionless parameters and the results are all related to dimensionless reservoir volume (DRV) which includes only the sorptive parameter as an unknown. This means the above factors can be investigated on the basis of the sorption properties of the testing material and/or tracer. The main findings are that steady, quasi-steady and equilibrium-state analyses are applicable when the tracer is not highly sorptive. However, quasi-steady and equilibrium-state analyses become inefficient or impractical compared to steady state analysis when the tracer is non-sorbing and material porosity is significantly low. Systematic and comprehensive reformulation of analytical models enables the comparison of experimental times between different test methods. The applicability and potential error of each test interpretation can also be studied. These can be applied in designing, performing, and interpreting diffusion experiments by deducing DRV from the available information for the target material and tracer, combined with the results of this study.
Nutritional Lipidomics: Molecular Metabolism, Analytics, and Diagnostics
Smilowitz, Jennifer T.; Zivkovic, Angela M.; Wan, Yu-Jui Yvonne; Watkins, Steve M.; Nording, Malin L.; Hammock, Bruce D.; German, J. Bruce
2013-01-01
The field of lipidomics is providing nutritional science a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increased accuracy and sensitivity of mass detection in mass spectrometry, together with new bioinformatics toolsets, to characterize the structures and abundances of complex lipids. Yet translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete, and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex, and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform sensitive, high-throughput, quantitative, and comprehensive analysis of lipid metabolites across very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition to understand the changes in structures, compositions, and functions of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments: lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways. PMID:23818328
Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M
2018-03-05
The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterization of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigations of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast querying. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize responses to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and access to information related to interactions, mechanisms of action, functions, etc., which ultimately provides a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights into the biological activities of a complex botanical such as BDPP, based on data-driven characterization of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions.
Our integrative analytical approach provides a novel means for the systematic integrative analysis of heterogeneous data types in the development of complex botanicals, such as polyphenols, for eventual clinical and translational applications.
Sun, Yong-Guang; Zhao, Dong-Zhi; Zhang, Feng-Shou; Wei, Bao-Quan; Chu, Jia-Lan; Su, Xiu
2012-11-01
Based on aerial image data of the Dayang estuary in 2008, and by virtue of the Analytic Hierarchy Process (AHP), remote sensing technology, and GIS spatial analysis, a spatiotemporal evaluation was made of the comprehensive level of wetland environmental pollution risk in the Dayang estuary, and the impacts of typical human activities on the dynamic variation of this comprehensive level were discussed. From 1958 to 2008, the comprehensive level of environmental pollution risk in the study area presented an increasing trend. Spatially, this comprehensive level declined from land to ocean and showed a zonal distribution. Tourism development activities were unlikely to increase the comprehensive level, while human habitation, transportation, and aquaculture exacerbated the risk of environmental pollution. This study could provide a reference for sea area use planning, ecological function planning, and pollutant control in estuary regions.
A RECONNECTION-DRIVEN MODEL OF THE HARD X-RAY LOOP-TOP SOURCE FROM FLARE 2004 FEBRUARY 26
DOE Office of Scientific and Technical Information (OSTI.GOV)
Longcope, Dana; Qiu, Jiong; Brewer, Jasmine
A compact X-class flare on 2004 February 26 showed a concentrated source of hard X-rays at the tops of the flare’s loops. This was analyzed in previous work and interpreted as plasma heated and compressed by slow magnetosonic shocks (SMSs) generated during post-reconnection retraction of the flux. That work used analytic expressions from a thin flux tube (TFT) model, which neglected many potentially important factors such as thermal conduction and chromospheric evaporation. Here we use a numerical solution of the TFT equations to produce a more comprehensive and accurate model of the same flare, including those effects previously omitted. These simulations corroborate the prior hypothesis that slow-mode shocks persist well after the retraction has ended, thus producing a compact, loop-top source instead of an elongated jet, as steady reconnection models predict. Thermal conduction leads to densities higher than analytic estimates had predicted, and evaporation enhances the density still higher, but at lower temperatures. X-ray light curves and spectra are synthesized by convolving the results from a single TFT simulation with the rate at which flux is reconnected, as measured through motion of flare ribbons, for example. These agree well with light curves observed by RHESSI and GOES and with spectra from RHESSI. An image created from a superposition of TFT model runs resembles one produced from RHESSI observations. This suggests that the HXR loop-top source, at least the one observed in this flare, could be the result of SMSs produced in fast reconnection models like Petschek’s.
Cost-Utility Analysis of Bariatric Surgery in Italy: Results of Decision-Analytic Modelling.
Lucchese, Marcello; Borisenko, Oleg; Mantovani, Lorenzo Giovanni; Cortesi, Paolo Angelo; Cesana, Giancarlo; Adam, Daniel; Burdukova, Elisabeth; Lukyanov, Vasily; Di Lorenzo, Nicola
2017-01-01
To evaluate the cost-effectiveness of bariatric surgery in Italy from a third-party payer perspective over a medium-term (10 years) and a long-term (lifetime) horizon. A state-transition Markov model was developed, in which patients may experience surgery, post-surgery complications, type 2 diabetes mellitus, cardiovascular diseases, or death. Transition probabilities, costs, and utilities were obtained from the Italian and international literature. Three types of surgery were considered: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. A base-case analysis was performed for the population, the characteristics of which were obtained from surgery candidates in Italy. In the base-case analysis, over 10 years, bariatric surgery led to a cost increment of EUR 2,661 and generated an additional 1.1 quality-adjusted life years (QALYs). Over a lifetime, surgery led to savings of EUR 8,649, an additional 0.5 life years, and 3.2 additional QALYs. Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of EUR 2,412/QALY and dominant over conservative management over a lifetime. In a comprehensive decision-analytic model, a current mix of surgical methods for bariatric surgery was cost-effective at 10 years and cost-saving over the lifetime of the Italian patient cohort considered in this analysis. © 2017 The Author(s) Published by S. Karger GmbH, Freiburg.
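The mechanics of such a state-transition (Markov cohort) model can be sketched in a few lines. The states, transition probabilities, utilities, and costs below are simplified, hypothetical placeholders, not the values used in the study:

```python
import numpy as np

def cohort_run(P, utilities, costs, years, disc=0.03):
    """Discounted QALYs and costs for one strategy of a Markov cohort model."""
    dist = np.array([1.0, 0.0, 0.0])          # whole cohort starts in 'well'
    qalys = total_cost = 0.0
    for t in range(years):
        d = 1.0 / (1.0 + disc) ** t           # annual discount factor
        qalys += d * dist @ utilities
        total_cost += d * dist @ costs
        dist = dist @ P                       # advance one model cycle (1 year)
    return qalys, total_cost

# States: well, diabetes, dead (hypothetical 3-state simplification)
P_surgery = np.array([[0.96, 0.02, 0.02],
                      [0.00, 0.96, 0.04],
                      [0.00, 0.00, 1.00]])
P_conserv = np.array([[0.90, 0.07, 0.03],
                      [0.00, 0.94, 0.06],
                      [0.00, 0.00, 1.00]])
u = np.array([0.85, 0.70, 0.00])              # annual utilities (assumed)
c_surg = np.array([1200.0, 2500.0, 0.0])      # annual follow-up costs, EUR (assumed)
c_cons = np.array([800.0, 2500.0, 0.0])

q_s, cost_s = cohort_run(P_surgery, u, c_surg, years=10)
q_c, cost_c = cohort_run(P_conserv, u, c_cons, years=10)
cost_s += 9000.0                              # one-off procedure cost (assumed)
icer = (cost_s - cost_c) / (q_s - q_c)        # EUR per QALY gained
```

The published model adds more states (complications, cardiovascular events) and calibrated inputs, but the ICER arithmetic at the end is the same.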
Freye, Chris E; Moore, Nicholas R; Synovec, Robert E
2018-02-16
The complementary information provided by tandem ionization time-of-flight mass spectrometry (TI-TOFMS) is investigated for comparative discovery-based analysis, when coupled with comprehensive two-dimensional gas chromatography (GC × GC). The TI conditions implemented were a hard ionization energy (70 eV) collected concurrently with a soft ionization energy (14 eV). Tile-based Fisher ratio (F-ratio) analysis is used to analyze diesel fuel spiked with twelve analytes at a nominal concentration of 50 ppm. F-ratio analysis is a supervised discovery-based technique that compares two different sample classes, in this case spiked and unspiked diesel, to reduce the complex GC × GC-TI-TOFMS data into a hit list of class-distinguishing analyte features. Hit lists of the 70 eV and 14 eV data sets, and the single hit list produced when the two data sets are fused together, are all investigated. For the 70 eV hit list, eleven of the twelve analytes were found in the top thirteen hits. For the 14 eV hit list, nine of the twelve analytes were found in the top nine hits, with the other three analytes either not found or well down the hit list. As expected, the F-ratios per m/z used to calculate each average F-ratio per hit were generally from smaller fragment ions for the 70 eV data set, while the larger fragment ions were emphasized in the 14 eV data set, supporting the notion that complementary information was provided. The discovery rate was improved when F-ratio analysis was performed on the fused data sets, resulting in eleven of the twelve analytes being at the top of the single hit list. Using PARAFAC, analytes that were "discovered" were deconvoluted in order to obtain their identification via match values (MVs). The locations of the analytes and the "F-ratio spectra" obtained from F-ratio analysis were used to guide the deconvolution. Eight of the twelve analytes were successfully deconvoluted and identified using the in-house library for the 70 eV data set.
PARAFAC deconvolution of the two separate data sets provided increased confidence in identification of "discovered" analytes. Herein, we explore the limit of analyte discovery and limit of analyte identification, and demonstrate a general workflow for the investigation of key chemical features in complex samples. Copyright © 2018 Elsevier B.V. All rights reserved.
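The F-ratio idea at the heart of this workflow, ranking features by between-class versus within-class variance across spiked and unspiked runs, can be illustrated with a toy numpy sketch (the data here are synthetic, not from the study, and real tile-based analysis operates on chromatographic tiles rather than abstract features):

```python
import numpy as np

def fisher_ratios(class_a, class_b):
    """Per-feature F-ratio: between-class variance over pooled within-class variance."""
    a = np.asarray(class_a, float)
    b = np.asarray(class_b, float)
    grand = np.vstack([a, b]).mean(axis=0)
    n_a, n_b = len(a), len(b)
    between = n_a * (a.mean(0) - grand) ** 2 + n_b * (b.mean(0) - grand) ** 2
    within = (((a - a.mean(0)) ** 2).sum(0) + ((b - b.mean(0)) ** 2).sum(0)) \
             / (n_a + n_b - 2)
    return between / within          # df_between = 1 for two classes

rng = np.random.default_rng(0)
unspiked = rng.normal(10.0, 1.0, size=(6, 5))   # 6 replicate runs, 5 features
spiked = rng.normal(10.0, 1.0, size=(6, 5))
spiked[:, 2] += 8.0                              # spiked analyte raises feature 2
ratios = fisher_ratios(unspiked, spiked)
```

Sorting features by `ratios` in descending order yields the "hit list": class-distinguishing features rise to the top while features that merely fluctuate between replicates do not.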
Yan, Xia; Wang, Li-Juan; Wu, Zhen; Wu, Yun-Long; Liu, Xiu-Xiu; Chang, Fang-Rong; Fang, Mei-Juan; Qiu, Ying-Kun
2016-10-15
Microbial metabolites represent an important source of bioactive natural products, but they often exhibit diverse chemical structures or complicated chemical compositions with low contents of active ingredients. Traditional separation methods rely mainly on the off-line combination of open-column chromatography and preparative high-performance liquid chromatography (HPLC). However, the multi-step and prolonged separation procedure can lead to exposure to oxygen and structural transformation of metabolites. In the present work, a new two-dimensional separation workflow was developed for fast isolation and analysis of microbial metabolites from Chaetomium globosum SNSHI-5, a cytotoxic fungus derived from an extreme environment. The advantage of this analytical comprehensive two-dimensional liquid chromatography (2D-LC) lies in its ability to analyze the composition of the metabolites and to optimize the separation conditions for the preparative 2D-LC. Furthermore, gram-scale preparative 2D-LC separation of the crude fungus extract could be performed on a medium-pressure liquid chromatography × preparative high-performance liquid chromatography system under the optimized conditions. Interestingly, 12 cytochalasan derivatives, including two new compounds named cytoglobosin Ab (3) and isochaetoglobosin Db (8), were successfully obtained with high purity in a short period of time. The structures of the isolated metabolites were comprehensively characterized by HR-ESI-MS and NMR. Notably, this is the first report on the combination of analytical and preparative 2D-LC for the separation of microbial metabolites. The new workflow exhibited apparent advantages in separation efficiency and sample treatment capacity compared with conventional methods. Copyright © 2016 Elsevier B.V. All rights reserved.
Photogrammetry Applied to Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Liu, Tian-Shu; Cattafesta, L. N., III; Radeztsky, R. H.; Burner, A. W.
2000-01-01
In image-based measurements, quantitative image data must be mapped to three-dimensional object space. Analytical photogrammetric methods, which may be used to accomplish this task, are discussed from the viewpoint of experimental fluid dynamicists. The direct linear transformation (DLT) for camera calibration, used in pressure-sensitive paint measurements, is summarized. An optimization method for camera calibration is developed that can determine the camera calibration parameters, including those describing lens distortion, from a single image. Combined with the DLT method, this method allows rapid and comprehensive in-situ camera calibration and is therefore particularly useful for quantitative flow visualization and for other measurements such as model attitude and deformation in production wind tunnels. The paper also includes a brief description of typical photogrammetric applications to temperature- and pressure-sensitive paint measurements and model deformation measurements in wind tunnels.
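The linear DLT step mentioned above can be sketched as a homogeneous least-squares problem: each 3D-to-image correspondence contributes two linear equations in the 11 independent camera parameters, solved via SVD. A synthetic, numpy-only illustration (the camera matrix and target points are made up, and lens distortion is ignored):

```python
import numpy as np

def dlt_calibrate(obj_pts, img_pts):
    """Estimate the 3x4 camera matrix (up to scale) from >= 6 correspondences."""
    rows = []
    for (X, Y, Z), (u, v) in zip(obj_pts, img_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    return Vt[-1].reshape(3, 4)       # right singular vector of smallest singular value

def project(P, pts):
    """Apply a camera matrix to 3D points, returning image-plane coordinates."""
    h = np.c_[pts, np.ones(len(pts))] @ P.T
    return h[:, :2] / h[:, 2:]

# Synthetic check with a made-up camera and random targets
rng = np.random.default_rng(0)
P_true = np.array([[800.0, 0.0, 320.0, 10.0],
                   [0.0, 800.0, 240.0, 20.0],
                   [0.0, 0.0, 1.0, 5.0]])
obj = rng.uniform(-1.0, 1.0, size=(8, 3))
img = project(P_true, obj)
P_est = dlt_calibrate(obj, img)
reproj_err = np.abs(project(P_est, obj) - img).max()
```

In wind-tunnel practice the correspondences come from surveyed targets on the model or tunnel walls, and the nonlinear optimization described in the paper then refines this linear solution and adds lens-distortion terms.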
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of reliable fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.
Comprehensive dynamic analysis of a bladed disk-turborotor-bearing system
NASA Astrophysics Data System (ADS)
Kaushal, Ashok
The dynamic behavior of a bladed disk-turborotor-bearing system is studied employing analytical, numerical, and experimental methods. The system consists of several subsystems such as the turbine disk, blades, bearings, and support pedestals. In order to completely understand the dynamic behavior of the turborotor system, an appropriate model for each individual component of the system is first developed. The individual components are modeled to include various design parameters, and the effect of these parameters on the vibrational behavior is studied. The vibration studies on the individual components are carried out using the Rayleigh-Ritz method with boundary characteristic orthogonal polynomials as assumed shape functions. The individual components are then assembled using the finite element technique. The turborotor system is studied from a system point of view, and the natural frequencies and mode shapes are obtained for various rotational speeds. The results show that the natural frequencies of the system are different from those obtained by analyzing individual components, suggesting that a system approach must be adopted for proper design of a turborotor system. The amplitudes of vibration and stresses due to harmonic and centrifugal loading on the blades and the disk are also obtained. The results indicate that, at the turborotor's operating speed, centrifugal loading is the major factor determining the critical stresses, in comparison to the gas forces on the blade modeled as harmonic loading. Experimental validation of the analytical model is carried out, and suggestions for future work are given.
Sustainability Tools Inventory Initial Gap Analysis
This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...
Analysis of past NBI ratings to determine future bridge preservation needs.
DOT National Transportation Integrated Search
2004-01-01
Bridge Management System (BMS) needs an analytical tool that can predict bridge element deterioration and answer questions related to bridge preservation. PONTIS, a comprehensive BMS software, was developed to serve this purpose. However, the intensi...
Experimental and analytical research on the aerodynamics of wind driven turbines. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohrbach, C.; Wainauski, H.; Worobel, R.
1977-12-01
This aerodynamic research program was aimed at providing a reliable, comprehensive data base on a series of wind turbine models covering a broad range of the prime aerodynamic and geometric variables. Such data obtained under controlled laboratory conditions on turbines designed by the same method, of the same size, and tested in the same wind tunnel had not been available in the literature. Moreover, this research program was further aimed at providing a basis for evaluating the adequacy of existing wind turbine aerodynamic design and performance methodology, for assessing the potential of recent advanced theories, and for providing a basis for further method development and refinement.
Thermodynamically self-consistent theory for the Blume-Capel model.
Grollau, S; Kierlik, E; Rosinberg, M L; Tarjus, G
2001-04-01
We use a self-consistent Ornstein-Zernike approximation to study the Blume-Capel ferromagnet on three-dimensional lattices. The correlation functions and the thermodynamics are obtained from the solution of two coupled partial differential equations. The theory provides a comprehensive and accurate description of the phase diagram in all regions, including the wing boundaries in a nonzero magnetic field. In particular, the coordinates of the tricritical point are in very good agreement with the best estimates from simulation or series expansion. Numerical and analytical analysis strongly suggest that the theory predicts a universal Ising-like critical behavior along the lambda line and the wing critical lines, and a tricritical behavior governed by mean-field exponents.
Biosurveillance Using Clinical Diagnoses and Social Media Indicators in Military Populations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corley, Courtney D.; Volkova, Svitlana; Rounds, Jeremiah
U.S. military influenza surveillance uses electronic reporting of clinical diagnoses to monitor health of military personnel and detect naturally occurring and bioterrorism-related epidemics. While accurate, these systems lack timeliness. More recently, researchers have used novel data sources to detect influenza in real time and capture nontraditional populations. With data-mining techniques, military social media users are identified and influenza-related discourse is integrated along with medical data into a comprehensive disease model. By leveraging heterogeneous data streams and developing dashboard biosurveillance analytics, the researchers hope to increase the speed at which outbreaks are detected and provide accurate disease forecasting among military personnel.
Consistent Partial Least Squares Path Modeling via Regularization
Jung, Sunho; Park, JaeHong
2018-01-01
Partial least squares (PLS) path modeling is a component-based structural equation modeling approach that has been adopted in social and psychological research for its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it estimates path coefficients based on consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that regularized PLSc is recommended for use when serious multicollinearity is present. PMID:29515491
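The ridge-type fix described above, shrinking estimates when predictors (here, latent variable scores) are nearly collinear, can be illustrated with ordinary ridge regression. This is a generic sketch of the regularization principle, not the PLSc algorithm itself, and the data are synthetic:

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimate: (X'X + lam*I)^(-1) X'y (lam=0 gives OLS)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
n = 50
x1 = rng.standard_normal(n)
x2 = x1 + 1e-3 * rng.standard_normal(n)   # nearly collinear with x1
X = np.c_[x1, x2]
y = x1 + 0.1 * rng.standard_normal(n)     # true model uses only x1

b_ols = ridge(X, y, 0.0)   # ill-conditioned: coefficients can explode in magnitude
b_reg = ridge(X, y, 1.0)   # regularized: stable, near-equal split across x1 and x2
```

With λ = 0 the normal equations are nearly singular, so small noise in y can produce wildly varying coefficients; the ridge penalty trades a little bias for a large reduction in variance, mirroring the power and accuracy issues examined in the simulation study.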
Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan
2013-04-01
Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55 % of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6 % of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. 
Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.
Tighe, Elizabeth L.; Schatschneider, Christopher
2015-01-01
The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in Adult Basic Education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. PMID:25351773
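Quantile regression rests on the check (pinball) loss. The sketch below, a simplified stand-in for the study's multiple quantile regression (synthetic data, a constant-only model), shows that minimizing the check loss recovers a sample quantile:

```python
import numpy as np

def pinball(u, q):
    # Check (pinball) loss underlying quantile regression
    return np.where(u >= 0, q * u, (q - 1) * u)

rng = np.random.default_rng(1)
y = rng.normal(size=1001)  # stand-in for reading comprehension scores

def fit_const(y, q, grid):
    # Minimize the summed check loss over candidate constants;
    # the optimum lies at one of the data points.
    losses = [pinball(y - c, q).sum() for c in grid]
    return grid[int(np.argmin(losses))]

grid = np.sort(y)
c90 = fit_const(y, 0.9, grid)  # conditional model collapses to the 90th percentile
```

In the full method, the constant is replaced by a linear predictor (here, morphological awareness and vocabulary knowledge), so each quantile gets its own coefficient vector, which is what lets predictive utility differ across levels of reading comprehension.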
Hondow, Nicole; Brown, M Rowan; Starborg, Tobias; Monteith, Alexander G; Brydson, Rik; Summers, Huw D; Rees, Paul; Brown, Andy
2016-02-01
Semiconductor quantum dot nanoparticles are in demand as optical biomarkers yet the cellular uptake process is not fully understood; quantification of numbers and the fate of internalized particles are still to be achieved. We have focussed on the characterization of cellular uptake of quantum dots using a combination of analytical electron microscopies because of the spatial resolution available to examine uptake at the nanoparticle level, using both imaging to locate particles and spectroscopy to confirm identity. In this study, commercially available quantum dots, CdSe/ZnS core/shell particles coated in peptides to target cellular uptake by endocytosis, have been investigated in terms of the agglomeration state in typical cell culture media, the traverse of particle agglomerates across U-2 OS cell membranes during endocytosis, the merging of endosomal vesicles during incubation of cells and in the correlation of imaging flow cytometry and transmission electron microscopy to measure the final nanoparticle dose internalized by the U-2 OS cells. We show that a combination of analytical transmission electron microscopy and serial block face scanning electron microscopy can provide a comprehensive description of the internalization of an initial exposure dose of nanoparticles by an endocytically active cell population and how the internalized, membrane bound nanoparticle load is processed by the cells. We present a stochastic model of an endosome merging process and show that this provides a data-driven modelling framework for the prediction of cellular uptake of engineered nanoparticles in general. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; DeLessio, Jennifer L.; Jacobs, Preston W.
2018-01-01
Many structures in the launch vehicle industry operate in liquid hydrogen (LH2), from the hydrogen fuel tanks through the ducts and valves and into the pump sides of the turbopumps. Calculating the structural dynamic response of these structures is critical for successful qualification of this hardware, but accurate knowledge of the natural frequencies is based entirely on numerical or analytical predictions of frequency reduction due to the added-fluid-mass effect because testing in LH2 has always been considered too difficult and dangerous. This fluid effect is predicted to be approximately 4-5% using analytical formulations for simple cantilever beams. As part of a comprehensive test/analysis program to more accurately assess pump inducers operating in LH2, a series of frequency tests in LH2 were performed at NASA/Marshall Space Flight Center's unique cryogenic test facility. These frequency tests are coupled with modal tests in air and water to provide critical information not only on the mass effect of LH2, but also the cryogenic temperature effect on Young's Modulus for which the data is not extensive. The authors are unaware of any other reported natural frequency testing in this media. In addition to the inducer, a simple cantilever beam was also tested in the tank to provide a more easily modeled geometry as well as one that has an analytical solution for the mass effect. This data will prove critical for accurate structural dynamic analysis of these structures, which operate in a highly-dynamic environment.
Video game training does not enhance cognitive ability: A comprehensive meta-analytic investigation.
Sala, Giovanni; Tatlidil, K Semir; Gobet, Fernand
2018-02-01
Because of its considerable potential scientific and societal implications, the possibility of enhancing cognitive ability by training has been one of the most influential topics of cognitive psychology in the last two decades. However, substantial research into the psychology of expertise and a recent series of meta-analytic reviews have suggested that various types of cognitive training (e.g., working memory training) benefit performance only in the trained tasks. The lack of skill generalization from one domain to different ones (that is, far transfer) has been documented in various fields of research such as working memory training, music, brain training, and chess. Video game training is another activity that has been claimed by many researchers to foster a broad range of cognitive abilities such as visual processing, attention, spatial ability, and cognitive control. We tested these claims with three random-effects meta-analytic models. The first meta-analysis (k = 310) examined the correlation between video game skill and cognitive ability. The second meta-analysis (k = 315) dealt with the differences between video game players and nonplayers in cognitive ability. The third meta-analysis (k = 359) investigated the effects of video game training on participants' cognitive ability. Small or null overall effect sizes were found in all three models. These outcomes show that overall cognitive ability and video game skill are only weakly related. Importantly, we found no evidence of a causal relationship between playing video games and enhanced cognitive ability. Video game training thus represents no exception to the general difficulty of obtaining far transfer. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Novel predictive models for metabolic syndrome risk: a "big data" analytic approach.
Steinberg, Gregory B; Church, Bruce W; McCall, Carol J; Scott, Adam B; Kalis, Brian P
2014-06-01
We applied a proprietary "big data" analytic platform--Reverse Engineering and Forward Simulation (REFS)--to dimensions of metabolic syndrome extracted from a large data set compiled from Aetna's databases for 1 large national customer. Our goals were to accurately predict subsequent risk of metabolic syndrome and its various factors on both a population and individual level. The study data set included demographic, medical claim, pharmacy claim, laboratory test, and biometric screening results for 36,944 individuals. The platform reverse-engineered functional models of systems from diverse and large data sources and provided a simulation framework for insight generation. The platform interrogated data sets from the results of 2 Comprehensive Metabolic Syndrome Screenings (CMSSs) as well as complete coverage records; complete data from medical claims, pharmacy claims, and lab results for 2010 and 2011; and responses to health risk assessment questions. The platform predicted subsequent risk of metabolic syndrome, both overall and by risk factor, on population and individual levels, with ROC/AUC varying from 0.80 to 0.88. We demonstrated that improving waist circumference and blood glucose yielded the largest benefits on subsequent risk and medical costs. We also showed that adherence to prescribed medications and, particularly, adherence to routine scheduled outpatient doctor visits, reduced subsequent risk. The platform generated individualized insights using available heterogeneous data within 3 months. The accuracy and short speed to insight with this type of analytic platform allowed Aetna to develop targeted cost-effective care management programs for individuals with or at risk for metabolic syndrome.
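The reported ROC/AUC values (0.80 to 0.88) can be computed from any model's risk scores via the rank-pair (Mann-Whitney) identity. The function below is a generic sketch of that computation, not part of the proprietary REFS platform:

```python
import numpy as np

def auc(scores, labels):
    # AUC = fraction of (positive, negative) pairs where the positive
    # case receives the higher risk score, counting ties as half.
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))
```

For example, `auc([0.9, 0.8, 0.4, 0.3], [1, 1, 0, 0])` scores a perfectly ranked set at 1.0, while random scores hover near 0.5; the 0.80-0.88 range reported above means the platform ranks a randomly chosen at-risk member above a not-at-risk member 80-88% of the time.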
Forced degradation and impurity profiling: recent trends in analytical perspectives.
Jain, Deepti; Basniwal, Pawan Kumar
2013-12-01
This review describes an epigrammatic impression of the recent trends in analytical perspectives of degradation and impurities profiling of pharmaceuticals including active pharmaceutical ingredient (API) as well as drug products during 2008-2012. These recent trends in forced degradation and impurity profiling were discussed on the head of year of publication; columns, matrix (API and dosage forms) and type of elution in chromatography (isocratic and gradient); therapeutic categories of the drug which were used for analysis. It focuses distinctly on comprehensive update of various analytical methods including hyphenated techniques for the identification and quantification of thresholds of impurities and degradants in different pharmaceutical matrices. © 2013 Elsevier B.V. All rights reserved.
Selection of reference standard during method development using the analytical hierarchy process.
Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun
2015-03-25
Reference standards are critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of reference standards during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by using ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility to obtain, abundance in samples, chemical stability, accuracy, precision, and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of its priority, as the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to weigh the benefits and risks of the alternatives comprehensively. It was an effective and practical tool for the optimization of reference standards during method development. Copyright © 2015 Elsevier B.V. All rights reserved.
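The core AHP computation underlying both this study and the athlete-selection study below, priority weights extracted from a pairwise comparison matrix via its principal eigenvector, can be sketched as follows. The 3x3 matrix of judgments on Saaty's 1-9 scale is hypothetical, not the paper's six-criterion data:

```python
import numpy as np

# Hypothetical pairwise comparisons of three criteria: entry (i, j)
# states how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Priority weights = normalized principal eigenvector of the matrix
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = vecs[:, k].real
w = w / w.sum()

# Saaty's consistency index CI = (lambda_max - n) / (n - 1);
# small values indicate the judgments are close to internally consistent.
n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)
```

Alternatives (here, candidate reference standards) are scored the same way under each criterion, and the criterion weights `w` then aggregate those scores into the overall priorities that ranked protocatechuic aldehyde first.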
NASA Astrophysics Data System (ADS)
Luo, Lin
2017-08-01
In the practical selection of Wushu athletes, the objective evaluation of athlete level lacks sufficient technical indicators and often relies on the coach's subjective judgment. Without a fully quantified indicator system it is difficult to reflect the overall quality of the athletes accurately and objectively, which in turn limits the level of Wushu competition. The analytic hierarchy process (AHP) is a systematic analysis method combining quantitative and qualitative analysis. This paper structures, hierarchizes, and quantifies the decision-making process of evaluating broadsword, rod, sword, and spear athletes with the AHP. Drawing on the characteristics of the athletes, the analysis covers three aspects, i.e., the athlete's body shape, physical function, and sports quality, and establishes 18 specific evaluation indicators. Then, combining expert advice and practical experience, a pairwise comparison matrix is determined, and the indicator weights and a comprehensive evaluation coefficient are obtained to establish the evaluation model for the athletes, thus providing a scientific theoretical basis for the selection of Wushu athletes. The evaluation model proposed in this paper realizes an evaluation system for broadsword, rod, sword, and spear athletes and has effectively improved the scientific rigor of Wushu athlete selection in practical application.
Lean Mixture Engines Testing and Evaluation Program : Volume 2. Comprehensive Discussion.
DOT National Transportation Integrated Search
1975-11-01
This report is aimed at defining analytically and demonstrating experimentally the potential of the 'lean-burn concept.' Fuel consumption and emissions data are obtained on the engine dynamometer for the baseline engine, and two lean-burn configurati...
NASA Technical Reports Server (NTRS)
Rana, D. S.
1980-01-01
The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.
TRANSIENT DUPUIT INTERFACE FLOW WITH PARTIALLY PENETRATING FEATURES
A comprehensive potential is presented for Dupuit interface flow in coastal aquifers where both the fresh water and salt water are moving. The resulting potential flow problem may be solved, for incompressible confined aquifers, using analytic functions. The vertical velocity of ...
WATER ANALYSIS: EMERGING CONTAMINANTS AND CURRENT ISSUES
This review covers developments in Water Analysis over the period of 2001-2002. A few significant references that appeared between January and February 2003 are also included. Previous Water Analysis reviews have been very comprehensive; however, in 2001, Analytical Chemistry c...
NASA Technical Reports Server (NTRS)
Bogan, Sam
2001-01-01
The first year included a study of the non-visible damage of composite overwrapped pressure vessels (COPVs) with B. Poe of the Materials Branch of NASA Langley. Early determinations showed a clear reduction in non-visible damage for thin COPVs when partially pressurized rather than unpressurized. Literature searches on thicker-walled COPVs revealed surface damage that was clearly visible. Analysis of current analytic modeling indicated that existing COPV models lacked sufficient thickness corrections to predict impact damage. After a comprehensive study of available published data and numerous numerical studies based on observed data from Langley, the analytic framework for modeling the behavior was found lacking, and both Poe and Bogan suggested that any short-term (3 yr) result for JOVE would be overly ambitious and that emphasis should be placed on transverse shear moduli studies. Transverse shear moduli determination is relevant to the study of fatigue, fracture, and aging effects in composite structures. Based on the techniques developed by Daniel & Tsai, Bogan and Gates set out to verify the results for K3B and 8320. A detailed analytic and experimental plan was established and carried out that included variations in layup, width, thickness, and length, as well as loading-rate variations to determine rate effects and relaxation moduli. The additional axial loads during the torsion testing were studied, as was the placement of gages along the composite specimen. Of the proposed tasks, all of tasks 1 and 2 were completed, with presentations given at Langley, SEM conferences, and ASME/AIAA conferences. Sensitivity issues associated with the use of servohydraulic test systems for applying the torsional load to the composite specimen limited the torsion range for predictable and repeatable transverse shear properties.
Bogan and Gates then decided to divide their research efforts, with Gates continuing the experimental testing at Langley and Bogan modeling the apparently nonlinear behavior at the low torques and angles evident in the tests.
Kler, Pablo A; Huhn, Carolin
2014-11-01
Isotachophoresis (ITP) has long been used alone but also as a preconcentration technique for capillary electrophoresis (CE). Unfortunately, up to now, its application is restricted to relatively strong acids and bases as either the degree of (de)protonation is too low or the water dissociation is too high, evoking zone electrophoresis. With the comprehensive ITP analysis of all 20 proteinogenic amino acids as model analytes, we, here, show that non-aqueous ITP using dimethylsulfoxide as a solvent solves this ITP shortcoming. Dimethylsulfoxide changes the pH regime of analytes and electrolytes but, more importantly, strongly reduces the proton mobility by prohibiting hydrogen bonds and thus, the so-called Zundel-Eigen-Zundel electrical conduction mechanism of flipping hydrogen bonds. The effects are demonstrated in an electrolyte system with taurine or H(+) as terminator, and imidazole as leader together with strong acids such as oxalic and even trifluoroacetic acid as counterions, both impossible to use in aqueous solution. Mass spectrometric as well as capacitively coupled contactless conductivity detection (C(4)D) are used to follow the ITP processes. To demonstrate the preconcentration capabilities of ITP in a two-dimensional set-up, we, here, also demonstrate that our non-aqueous ITP method can be combined with capillary electrophoresis-mass spectrometry in a column-coupling system using a hybrid approach of capillaries coupled to a microfluidic interface. For this, C(4)D was optimized for on-chip detection with the electrodes aligned on top of a thin glass lid of the microfluidic chip.
Stefanuto, Pierre-Hugues; Perrault, Katelynn A; Stadler, Sonja; Pesesse, Romain; LeBlanc, Helene N; Forbes, Shari L; Focant, Jean-François
2015-06-01
In forensic thanato-chemistry, the understanding of the process of soft tissue decomposition is still limited. A better understanding of the decomposition process and the characterization of the associated volatile organic compounds (VOC) can help to improve the training of victim recovery (VR) canines, which are used to search for trapped victims in natural disasters or to locate corpses during criminal investigations. The complexity of matrices and the dynamic nature of this process require the use of comprehensive analytical methods for investigation. Moreover, the variability of the environment and between individuals creates additional difficulties in terms of normalization. The resolution of the complex mixture of VOCs emitted by a decaying corpse can be improved using comprehensive two-dimensional gas chromatography (GC × GC), compared to classical single-dimensional gas chromatography (1DGC). This study combines the analytical advantages of GC × GC coupled to time-of-flight mass spectrometry (TOFMS) with the data handling robustness of supervised multivariate statistics to investigate the VOC profile of human remains during early stages of decomposition. Various supervised multivariate approaches are compared to interpret the large data set. Moreover, early decomposition stages of pig carcasses (typically used as human surrogates in field studies) are also monitored to obtain a direct comparison of the two VOC profiles and estimate the robustness of this human decomposition analog model. In this research, we demonstrate that pig and human decomposition processes can be described by the same trends for the major compounds produced during the early stages of soft tissue decomposition.
Agapiou, A; Zorba, E; Mikedi, K; McGregor, L; Spiliopoulou, C; Statheropoulos, M
2015-07-09
Field experiments were devised to mimic the entrapment conditions under the rubble of collapsed buildings aiming to investigate the evolution of volatile organic compounds (VOCs) during the early dead body decomposition stage. Three pig carcasses were placed inside concrete tunnels of a search and rescue (SAR) operational field terrain for simulating the entrapment environment after a building collapse. The experimental campaign employed both laboratory and on-site analytical methods running in parallel. The current work focuses only on the results of the laboratory method using thermal desorption coupled to comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry (TD-GC×GC-TOF MS). The flow-modulated TD-GC×GC-TOF MS provided enhanced separation of the VOC profile and served as a reference method for the evaluation of the on-site analytical methods in the current experimental campaign. Bespoke software was used to deconvolve the VOC profile to extract as much information as possible into peak lists. In total, 288 unique VOCs were identified (i.e., not found in blank samples). The majority were aliphatics (172), aromatics (25) and nitrogen compounds (19), followed by ketones (17), esters (13), alcohols (12), aldehydes (11), sulfur (9), miscellaneous (8) and acid compounds (2). The TD-GC×GC-TOF MS proved to be a sensitive and powerful system for resolving the chemical puzzle of above-ground "scent of death". Copyright © 2015 Elsevier B.V. All rights reserved.
Kim, Jihyun; Yum, Hyesun; Jang, Moonhee; Shin, Ilchung; Yang, Wonkyung; Baeck, Seungkyung; Suh, Joon Hyuk; Lee, Sooyeun; Han, Sang Beom
2016-01-01
Hair is a highly relevant specimen used to verify drug exposure in victims of drug-facilitated crime (DFC) cases. In the present study, a new analytical method involving ultrahigh-performance liquid chromatography-tandem mass spectrometry was developed for determining the presence of model drugs, including zolazepam, tiletamine, and their metabolites, in hair specimens from DFCs. The incorporation of zolazepam and tiletamine into hair after a single exposure was investigated in Long-Evans rats using the ratio of the hair concentration to the area under the curve. For rapid and simple sample preparation, methanol extraction and protein precipitation were performed for hair and plasma, respectively. No interference was observed in drug-free hair or plasma, except for hair-derived diphenhydramine in blank hair. The coefficients of variation of the matrix effects were below 12%, and the recoveries of the analytes exceeded 70% in all of the matrices. The precision and accuracy results were satisfactory. The limits of quantification ranged from 20 to 50 pg in 10 mg of hair. The drug incorporation rates were 0.03 ± 0.01% for zolazepam and 2.09 ± 0.51% for tiletamine in pigmented hair. We applied the present method to real hair samples to determine the drug used in seven cases. These results suggest that this comprehensive and sensitive hair analysis method can successfully verify a drug after a single exposure and can be applied in forensic and clinical toxicology laboratories.
Recent Progress in Understanding the Shock Response of Ferroelectric Ceramics
NASA Astrophysics Data System (ADS)
Setchell, R. E.
2002-07-01
Ferroelectric ceramics exhibit a permanent remanent polarization, and shock depoling of these materials to achieve pulsed sources of electrical power was proposed in the late 1950s. During the following twenty years, extensive studies were conducted to examine the shock response of ferroelectric ceramics primarily based on lead zirconate titanate (PZT). Under limited conditions, relatively simple analytical models were found to adequately describe the observed electrical behavior. A more complex behavior was indicated over broader conditions, however, resulting in the incorporation of shock-induced conductivity and dielectric relaxation into analytical models. Unfortunately, few experimental studies were undertaken over the next twenty years, and the development of more comprehensive models was inhibited. In recent years, a strong interest in advancing numerical simulation capabilities has motivated new experimental studies and corresponding model development. More than seventy gas gun experiments have examined several ferroelectric ceramics, with most experiments on lead zirconate titanate having a Zr:Ti ratio of 95:5 and modified with 2% niobium (PZT 95/5). This material is nominally ferroelectric but is near an antiferroelectric phase boundary, and depoling results from a shock-driven phase transition. Experiments have examined unpoled, normally poled, and axially poled PZT 95/5 over broad ranges of shock pressure and peak electric field. The extensive base of new data provides quantitative insights into both the stress and field dependencies of depoling kinetics, and the significance of pore collapse at higher stresses. The results are being actively utilized to develop and refine material response models used in numerical simulations of pulsed power devices.
Experimental validation of a sub-surface model of solar power for distributed marine sensor systems
NASA Astrophysics Data System (ADS)
Hahn, Gregory G.; Cantin, Heather P.; Shafer, Michael W.
2016-04-01
The capabilities of distributed sensor systems such as marine wildlife telemetry tags could be significantly enhanced through the integration of photovoltaic modules. Photovoltaic cells could be used to supplement the primary batteries for wildlife telemetry tags to allow for extended tag deployments, wherein larger amounts of data could be collected and transmitted in near real time. In this article, we present experimental results used to validate and improve key aspects of our original model for sub-surface solar power. We discuss the test methods and results, comparing analytic predictions to experimental results. In a previous work, we introduced a model for sub-surface solar power that used analytic models and empirical data to predict the solar irradiance available for harvest at any depth under the ocean's surface over the course of a year. This model presented underwater photovoltaic transduction as a viable means of supplementing energy for marine wildlife telemetry tags. The additional data provided by improvements in daily energy budgets would enhance the temporal and spatial comprehension of the host's activities and/or environments. Photovoltaic transduction is one method that has not been widely deployed in the sub-surface marine environments despite widespread use on terrestrial and avian species wildlife tag systems. Until now, the use of photovoltaic cells for underwater energy harvesting has generally been disregarded as a viable energy source in this arena. In addition to marine telemetry systems, photovoltaic energy harvesting systems could also serve as a means of energy supply for autonomous underwater vehicles (AUVs), as well as submersible buoys for oceanographic data collection.
Development of dynamic Bayesian models for web application test management
NASA Astrophysics Data System (ADS)
Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.
2018-03-01
The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool for modeling complex stochastic dynamic processes. According to the results of the research, the mathematical models and methods of dynamic Bayesian networks provide high coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. A formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connections between individual test assets across multiple time slices. This approach makes it possible to present testing as a discrete process with defined structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine, in one management area, individual units and testing components that have different functionalities and directly influence each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides a consistent approach to formalizing test principles and procedures, the methods used to treat situational error signs, and the methods used to produce analytical conclusions based on test results.
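Inference across the time slices of a dynamic Bayesian network can be sketched with its simplest special case, a two-state hidden Markov model updated by the forward (filtering) pass. The states, probabilities, and "bug present" interpretation below are illustrative assumptions, not the authors' model:

```python
import numpy as np

# Minimal two-slice dynamic Bayesian network: a hidden "bug present"
# state evolving across test cycles, observed via noisy test outcomes.
T = np.array([[0.9, 0.1],    # P(state_t | state_{t-1}): rows = no-bug, bug
              [0.3, 0.7]])
E = np.array([[0.8, 0.2],    # P(observation | state): columns = pass, fail
              [0.1, 0.9]])
prior = np.array([0.5, 0.5])

def forward(obs):
    # Forward pass: belief over the hidden state after each time slice
    b = prior * E[:, obs[0]]
    b /= b.sum()
    for o in obs[1:]:
        b = (T.T @ b) * E[:, o]
        b /= b.sum()
    return b

belief = forward([1, 1, 1])  # three consecutive failing test runs
```

Each loop iteration is one time slice of the dynamic network: the transition matrix propagates the belief forward, and the emission column conditions it on the new test outcome, so repeated failures progressively raise the posterior probability that a bug is present.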
Fluid dynamics of aortic valve stenosis
NASA Astrophysics Data System (ADS)
Keshavarz-Motamed, Zahra; Maftoon, Nima
2009-11-01
Aortic valve stenosis, which causes considerable constriction of the flow passage, is one of the most frequent cardiovascular diseases and is the most common cause of valvular replacements, of which around 100,000 are performed per year in North America. Furthermore, it is considered the most frequent cardiac disease after arterial hypertension and coronary artery disease. The objective of this study is to develop an analytical model, considering the coupling effect between fluid flow and elastic deformation with reasonable boundary conditions, to describe the effect of aortic stenosis on the left ventricle and the aorta. The pulsatile and Newtonian blood flow through aortic stenosis with vascular wall deformability is analyzed and its effects are discussed in terms of flow parameters such as velocity, resistance to flow, shear stress distribution, and pressure loss. We also developed analytical expressions to improve comprehension of transvalvular and aortic stenosis hemodynamics, which is of great interest for one main reason: to medical scientists, an accurate knowledge of the mechanical properties of whole blood flow in the aorta can suggest a new diagnostic tool.
Taste clusters of music and drugs: evidence from three analytic levels.
Vuolo, Mike; Uggen, Christopher; Lageson, Sarah
2014-09-01
This article examines taste clusters of musical preferences and substance use among adolescents and young adults. Three analytic levels are considered: fixed effects analyses of aggregate listening patterns and substance use in US radio markets, logistic regressions of individual genre preferences and drug use from a nationally representative survey of US youth, and arrest and seizure data from a large American concert venue. A consistent picture emerges from all three levels: rock music is positively associated with substance use, with some substance-specific variability across rock sub-genres. Hip hop music is also associated with higher use, while pop and religious music are associated with lower use. These results are robust to fixed effects models that account for changes over time in radio markets, a comprehensive battery of controls in the individual-level survey, and concert data establishing the co-occurrence of substance use and music listening in the same place and time. The results affirm a rich tradition of qualitative and experimental studies, demonstrating how symbolic boundaries are simultaneously drawn around music and drugs.
A practical guide to big data research in psychology.
Chen, Eric Evan; Wojcik, Sean P
2016-12-01
The massive volume of data that now covers a wide variety of human behaviors offers researchers in psychology an unprecedented opportunity to conduct innovative theory- and data-driven field research. This article is a practical guide to conducting big data research, covering data management, acquisition, processing, and analytics (including key supervised and unsupervised learning data mining methods). It is accompanied by walkthrough tutorials on data acquisition, text analysis with latent Dirichlet allocation topic modeling, and classification with support vector machines. Big data practitioners in academia, industry, and the community have built a comprehensive base of tools and knowledge that makes big data research accessible to researchers in a broad range of fields. However, big data research does require knowledge of software programming and a different analytical mindset. For those willing to acquire the requisite skills, innovative analyses of unexpected or previously untapped data sources can offer fresh ways to develop, test, and extend theories. When conducted with care and respect, big data research can become an essential complement to traditional research.
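The two tutorial techniques the article names, latent Dirichlet allocation topic modeling and support vector machine classification, can be sketched on a toy corpus. The documents and labels below are invented, and scikit-learn is assumed as the toolkit (the article's own tutorials may use different tools).

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.svm import LinearSVC

# Invented four-document corpus with two obvious themes.
docs = [
    "stock market prices rise on trading volume",
    "team wins game with late goal in the match",
    "market investors watch trading and prices",
    "coach praises team after the match victory",
]
labels = ["finance", "sports", "finance", "sports"]

counts = CountVectorizer().fit_transform(docs)           # bag-of-words features

# Unsupervised: LDA recovers a per-document mixture over 2 latent topics.
topics = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = topics.fit_transform(counts)

# Supervised: a linear SVM classifies documents from the same word counts.
clf = LinearSVC().fit(counts, labels)
pred = clf.predict(counts)
```

On real data the corpus would come from the acquisition step the guide covers, and held-out evaluation would replace this in-sample prediction.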
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparatively lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practices for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
Cargnin, Sarah; Jommi, Claudio; Canonico, Pier Luigi; Genazzani, Armando A; Terrazzino, Salvatore
2014-05-01
To determine diagnostic accuracy of HLA-B*57:01 testing for prediction of abacavir-induced hypersensitivity and to quantify the clinical benefit of pretreatment screening through a meta-analytic review of published studies. A comprehensive search was performed up to June 2013. The methodological quality of relevant studies was assessed by the QUADAS-2 tool. The pooled diagnostic estimates were calculated using a random effect model. Despite the presence of heterogeneity in sensitivity or specificity estimates, the pooled diagnostic odds ratio to detect abacavir-induced hypersensitivity on the basis of clinical criteria was 33.07 (95% CI: 22.33-48.97, I²: 13.9%), while the diagnostic odds ratio for detection of immunologically confirmed abacavir hypersensitivity was 1141 (95% CI: 409-3181, I²: 0%). Pooled analysis of risk ratio showed that prospective HLA-B*57:01 testing significantly reduced the incidence of abacavir-induced hypersensitivity. This meta-analysis demonstrates an excellent diagnostic accuracy of HLA-B*57:01 testing to detect immunologically confirmed abacavir hypersensitivity and corroborates existing recommendations.
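The diagnostic odds ratio the abstract pools is computed per study from a 2x2 table of test result versus outcome. A minimal sketch with invented counts (not the meta-analysis data) shows the single-study calculation with its 95% confidence interval:

```python
import math

# Hypothetical screening 2x2 table: test positive/negative vs. outcome.
tp, fn, fp, tn = 45, 5, 20, 180

dor = (tp * tn) / (fn * fp)                    # DOR = (TP/FN) / (FP/TN)
se_log = math.sqrt(1/tp + 1/fn + 1/fp + 1/tn)  # SE of log(DOR)
lo = math.exp(math.log(dor) - 1.96 * se_log)   # 95% CI lower bound
hi = math.exp(math.log(dor) + 1.96 * se_log)   # 95% CI upper bound

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
```

A meta-analysis such as this one would then combine the per-study log(DOR) values under a random-effects model rather than report single-study figures.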
The Viking X ray fluorescence experiment - Analytical methods and early results
NASA Technical Reports Server (NTRS)
Clark, B. C., III; Castro, A. J.; Rowe, C. D.; Baird, A. K.; Rose, H. J., Jr.; Toulmin, P., III; Christian, R. P.; Kelliher, W. C.; Keil, K.; Huss, G. R.
1977-01-01
Ten samples of the Martian regolith have been analyzed by the Viking lander X ray fluorescence spectrometers. Because of high-stability electronics, inclusion of calibration targets, and special data encoding within the instruments, the quality of the analyses performed on Mars is closely equivalent to that attainable with the same instruments operated in the laboratory. Determination of absolute elemental concentrations requires gain drift adjustments, subtraction of background components, and use of a mathematical response model with adjustable parameters set by prelaunch measurements on selected rock standards. Bulk fines at both Viking landing sites are quite similar in composition, implying that a chemically and mineralogically homogeneous regolith covers much of the surface of the planet. Important differences between samples include a higher sulfur content in what appear to be duricrust fragments than in fines, and a lower iron content in fines taken from beneath large rocks than in those taken from unprotected surface material. Further extensive reduction of these data will allow more precise and more accurate analytical numbers to be determined and thus a more comprehensive understanding of elemental trends between samples.
Conn, Vicki S; Ruppar, Todd M; Chase, Jo-Ana D; Enriquez, Maithe; Cooper, Pamela S
2015-12-01
This systematic review applied meta-analytic procedures to synthesize medication adherence interventions that focus on adults with hypertension. Comprehensive searching located trials with medication adherence behavior outcomes. Study sample, design, intervention characteristics, and outcomes were coded. Random-effects models were used in calculating standardized mean difference effect sizes. Moderator analyses were conducted using meta-analytic analogues of ANOVA and regression to explore associations between effect sizes and sample, design, and intervention characteristics. Effect sizes were calculated for 112 eligible treatment-vs.-control group outcome comparisons of 34,272 subjects. The overall standardized mean difference effect size between treatment and control subjects was 0.300. Exploratory moderator analyses revealed interventions were most effective among female, older, and moderate- or high-income participants. The most promising intervention components were those linking adherence behavior with habits, giving adherence feedback to patients, self-monitoring of blood pressure, using pill boxes and other special packaging, and motivational interviewing. The most effective interventions employed multiple components and were delivered over many days. Future research should strive for minimizing risks of bias common in this literature, especially avoiding self-report adherence measures.
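The random-effects pooling of standardized mean differences described above can be sketched with the DerSimonian-Laird estimator. The three (effect size, variance) pairs below are invented for illustration, not drawn from the 112 comparisons in the review:

```python
import math

# Invented per-study standardized mean differences and their variances.
effects = [(0.10, 0.010), (0.60, 0.020), (0.30, 0.015)]

w = [1 / v for _, v in effects]                 # fixed-effect (inverse-variance) weights
d_fixed = sum(wi * di for (di, _), wi in zip(effects, w)) / sum(w)

# DerSimonian-Laird between-study variance from Cochran's Q.
q = sum(wi * (di - d_fixed) ** 2 for (di, _), wi in zip(effects, w))
df = len(effects) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights add tau^2 to each study's variance.
w_re = [1 / (v + tau2) for _, v in effects]
d_pooled = sum(wi * di for (di, _), wi in zip(effects, w_re)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
ci = (d_pooled - 1.96 * se, d_pooled + 1.96 * se)
```

The moderator analyses the review describes would then regress such effect sizes on sample, design, and intervention characteristics.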
Over forty years of bladder cancer glycobiology: Where do glycans stand facing precision oncology?
Azevedo, Rita; Peixoto, Andreia; Gaiteiro, Cristiana; Fernandes, Elisabete; Neves, Manuel; Lima, Luís; Santos, Lúcio Lara; Ferreira, José Alexandre
2017-01-01
The high molecular heterogeneity of bladder tumours is responsible for significant variations in disease course, as well as elevated recurrence and progression rates, thereby hampering the introduction of more effective targeted therapeutics. The implementation of precision oncology settings supported by robust molecular models for individualization of patient management is warranted. This effort requires a comprehensive integration of large sets of panomics data that is yet to be fully achieved. Contributing to this goal, over 40 years of bladder cancer glycobiology have disclosed a plethora of cancer-specific glycans and glycoconjugates (glycoproteins, glycolipids, proteoglycans) accompanying disease progression and dissemination. This review comprehensively addresses the main structural findings in the field and their consequent biological and clinical implications. Given the cell surface and secreted nature of these molecules, we further discuss their potential for non-invasive detection and therapeutic development. Moreover, we highlight novel mass-spectrometry-based high-throughput analytical and bioinformatics tools to interrogate the glycome in the postgenomic era. Ultimately, we outline a roadmap to guide future developments in glycomics envisaging clinical implementation. PMID:29207682
Patel, Chirag J
2017-01-01
Mixtures, or combinations and interactions between multiple environmental exposures, are hypothesized to be causally linked with disease and health-related phenotypes. Established and emerging molecular measurement technologies to assay the exposome, the comprehensive battery of exposures encountered from birth to death, promise a new way of identifying mixtures in disease in the epidemiological setting. In this opinion, we describe the analytic complexity and challenges in identifying mixtures associated with phenotype and disease. Existing and emerging machine-learning methods and data analytic approaches (e.g., "environment-wide association studies" [EWASs]), as well as large cohorts may enhance possibilities to identify mixtures of correlated exposures associated with phenotypes; however, the analytic complexity of identifying mixtures is immense. If the exposome concept is realized, new analytical methods and large sample sizes will be required to ascertain how mixtures are associated with disease. The author recommends documenting prevalent correlated exposures and replicated main effects prior to identifying mixtures.
Bend-Twist Coupled Carbon-Fiber Laminate Beams: Fundamental Behavior and Applications
NASA Astrophysics Data System (ADS)
Babuska, Pavel
Material-induced bend-twist coupling in laminated composite beams has seen applications in engineered structures for decades, ranging from airplane wings to turbine blades. Symmetric, unbalanced carbon fiber laminates which exhibit bend-twist coupling can be difficult to characterize and exhibit unintuitive deformation states which may pose challenges to the engineer. In this thesis, bend-twist coupled beams are investigated comprehensively by experimentation, numerical modeling, and analytical methods. Beams of varying fiber angle and amount of coupling were manufactured and physically tested in both linear and nonlinear static and dynamic settings. Analytical mass and stiffness matrices were derived for the development of a beam element to use in the stiffness matrix analysis method. Additionally, an ABAQUS finite element model was used in conjunction with the analytical methods to predict and further characterize the behavior of the beams. The three regimes, experimental, analytical, and numerical, represent a full-field characterization of bend-twist coupling in composite beams. A notable application of bend-twist coupled composites is passively adaptive turbine blades, whereby the deformation coupling can be built into the blade structure to simultaneously bend and twist, thus pitching the blade into or away from the fluid flow and changing the blade angle of attack. Passive pitch adaptation has been implemented successfully in wind turbine blades; for marine turbine blades, however, the technology is still in the development phase. Bend-twist coupling has been shown numerically to be beneficial to tidal turbine performance, but little validation has been conducted in the experimental regime. In this thesis, passively adaptive experiment-scale tidal turbine blades were designed, analyzed, manufactured, and physically tested, validating the foundational numerical work.
It was shown that blade forces and root moments as well as turbine thrust and power coefficients can be manipulated by inclusion of passive pitch adaption by bend-twist coupling.
Visual analytics for semantic queries of TerraSAR-X image content
NASA Astrophysics Data System (ADS)
Espinoza-Molina, Daniela; Alonso, Kevin; Datcu, Mihai
2015-10-01
With the continuous image product acquisition of satellite missions, the size of the image archives is considerably increasing every day, as well as the variety and complexity of their content, surpassing the end-user capacity to analyse and exploit them. Advances in the image retrieval field have contributed to the development of tools for interactive exploration and extraction of the images from huge archives using different parameters like metadata, key-words, and basic image descriptors. Even though we count on more powerful tools for automated image retrieval and data analysis, we still face the problem of understanding and analyzing the results. Thus, a systematic computational analysis of these results is required in order to provide the end-user with a summary of the archive content in comprehensible terms. In this context, visual analytics combines automated analysis with interactive visualization techniques for effective understanding, reasoning and decision making on the basis of very large and complex datasets. Moreover, several current research efforts focus on associating the content of the images with semantic definitions for describing the data in a format to be easily understood by the end-user. In this paper, we present our approach for computing visual analytics and semantically querying the TerraSAR-X archive. Our approach is mainly composed of four steps: 1) the generation of a data model that explains the information contained in a TerraSAR-X product. The model is formed by primitive descriptors and metadata entries, 2) the storage of this model in a database system, 3) the semantic definition of the image content based on machine learning algorithms and relevance feedback, and 4) querying the image archive using semantic descriptors as query parameters and computing the statistical analysis of the query results.
The experimental results show that, with the help of visual analytics and semantic definitions, we are able to explain the image content using semantic terms and the relations between them, answering questions such as "What is the percentage of urban area in a region?" or "What is the distribution of water bodies in a city?"
Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W
2016-01-01
A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. 
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson's disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson's disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer's, Huntington's, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications.
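Two of the steps the study reports, rebalancing an imbalanced cohort and n-fold cross-validated classification, can be sketched on synthetic data. The two-feature "cohort" below is invented, and scikit-learn with simple random oversampling stands in for the study's richer pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Invented imbalanced cohort: 90 controls vs. 10 cases, two features each.
x_maj = rng.normal(0.0, 1.0, size=(90, 2))   # majority class (controls)
x_min = rng.normal(2.5, 1.0, size=(10, 2))   # minority class (cases)
X = np.vstack([x_maj, x_min])
y = np.array([0] * 90 + [1] * 10)

# Rebalancing by random oversampling: resample minority rows to parity.
idx = rng.choice(np.where(y == 1)[0], size=80, replace=True)
X_bal = np.vstack([X, X[idx]])
y_bal = np.concatenate([y, np.ones(80, dtype=int)])

# n-fold cross-validated classification on the rebalanced cohort.
scores = cross_val_score(LogisticRegression(), X_bal, y_bal, cv=5)
```

Note that oversampling before splitting lets duplicated cases leak across folds, so in practice resampling would be done inside each training fold; this sketch only illustrates the rebalancing idea.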
Comparison between Two Methods for agricultural drought disaster risk in southwestern China
NASA Astrophysics Data System (ADS)
han, lanying; zhang, qiang
2016-04-01
Drought is a natural disaster that causes huge losses in agricultural yields worldwide. Drought risk has become increasingly prominent with the climatic warming of the past century, and it is one of the main meteorological disasters and a serious problem in southwestern China, where drought risk exceeds the national average. Climate change is likely to exacerbate the problem, thereby endangering China's food security. In this paper, drought disasters in southwestern China (a region with serious drought risk, whose comprehensive losses account for 3.9% of the national drought-affected area) were selected to show how drought changes under climate change, and two methods were used to assess drought disaster risk: a drought risk assessment model and a comprehensive drought risk index. First, we used the analytic hierarchy process together with meteorological, geographic, soil, and remote-sensing data to develop a drought risk assessment model (defined using a comprehensive drought disaster risk index, R) based on the drought hazard, environmental vulnerability, the sensitivity and exposure of the values at risk, and the capacity to prevent or mitigate the problem. Second, we built the comprehensive drought risk index (defined using a comprehensive drought disaster loss, L) from statistical drought disaster data, including crop yields, drought-induced areas, drought-occurred areas, no-harvest areas caused by drought, and planting areas. Using the model, we assessed the drought risk. The results showed that the spatial distributions of the two drought disaster risks were coherent and revealed complex zonality in southwestern China. The results also showed that drought risk is becoming more serious and more frequent in the region against the background of global climatic warming. The eastern part of the study area had an extremely high risk; risk was generally greater in the north than in the south and increased from southwest to northeast.
The drought disaster risk or loss was highest in Sichuan Province and Chongqing Municipality and lowest in Yunnan Province. The comprehensive drought disaster loss trended upward over nearly 60 years, and the trend of drought occurrence over the same period was upward overall in every province of the Xinan region. The drought risk of each province is related to regional climate change factors such as temperature and precipitation, soil moisture, and vegetation coverage. The contribution of the risk factors to R was highest for the capacity for prevention and mitigation, followed by the drought hazard, sensitivity and exposure, and environmental vulnerability.
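The analytic hierarchy process step used to weight the risk factors can be sketched as deriving priority weights from a pairwise comparison matrix via its principal eigenvector, with a consistency check. The 3x3 comparison values below are invented for illustration, not the study's actual judgments:

```python
import numpy as np

# Hypothetical pairwise comparisons of three risk factors, e.g.
# hazard vs. vulnerability vs. exposure, on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue (lambda_max)
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                    # normalised priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
cr = ci / 0.58                           # Saaty's random index for n=3 is 0.58
```

A consistency ratio below 0.1 is the conventional threshold for accepting the pairwise judgments; the weighted factors would then combine into the composite risk index R.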
NASA Astrophysics Data System (ADS)
Black, S.; Hynek, B. M.; Kierein-Young, K. S.; Avard, G.; Alvarado-Induni, G.
2015-12-01
Proper characterization of mineralogy is an essential part of geologic interpretation. This process becomes even more critical when attempting to interpret the history of a region remotely, via satellites and/or landed spacecraft. Orbiters and landed missions to Mars carry with them a wide range of analytical tools to aid in the interpretation of Mars' geologic history. However, many instruments make a single type of measurement (e.g., APXS: elemental chemistry; XRD: mineralogy), and multiple data sets must be utilized to develop a comprehensive understanding of a sample. Hydrothermal alteration products often exist in intimate mixtures, and vary widely across a site due to changing pH, temperature, and fluid/gas chemistries. These characteristics require that we develop a detailed understanding regarding the possible mineral mixtures that may exist, and their detectability in different instrument data sets. This comparative analysis study utilized several analytical methods on existing or planned Mars rovers (XRD, Raman, LIBS, Mössbauer, and APXS) combined with additional characterization (thin section, VNIR, XRF, SEM-EMP) to develop a comprehensive suite of data for hydrothermal alteration products collected from Poás and Turrialba volcanoes in Costa Rica. Analyzing the same samples across a wide range of instruments allows for direct comparisons of results, and identification of instrumentation "blind spots." This provides insight into the ability of in-situ analyses to comprehensively characterize sites on Mars exhibiting putative hydrothermal characteristics, such as the silica and sulfate deposits at Gusev crater [e.g., Squyres et al., 2008], as well as valuable information for future mission planning and data interpretation. References: Squyres et al. (2008), Detection of Silica-Rich Deposits on Mars, Science, 320, 1063-1067, doi:10.1126/science.1155429.
Silva, Raquel V S; Tessarolo, Nathalia S; Pereira, Vinícius B; Ximenes, Vitor L; Mendes, Fábio L; de Almeida, Marlon B B; Azevedo, Débora A
2017-03-01
The elucidation of bio-oil composition is important to evaluate the processes of biomass conversion and its upgrading, and to suggest the proper use for each sample. Comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry (GC×GC-TOFMS) is a widely applied analytical approach for bio-oil investigation due to the higher separation and resolution capacity of this technique. This work addresses the issue of analytical performance to assess the comprehensive characterization of real bio-oil samples via GC×GC-TOFMS. The approach was applied to the individual quantification of compounds of real thermal (PWT), catalytic process (CPO), and hydrodeoxygenation process (HDO) bio-oils. Quantification was performed with reliability using the analytical curves of oxygenated and hydrocarbon standards as well as the deuterated internal standards. The limit of quantification was set at 1 ng µL⁻¹ for major standards, except for hexanoic acid, which was set at 5 ng µL⁻¹. The GC×GC-TOFMS method provided good precision (<10%) and excellent accuracy (recovery range of 70-130%) for the quantification of individual hydrocarbons and oxygenated compounds in real bio-oil samples. Sugars, furans, and alcohols appear as the major constituents of the PWT, CPO, and HDO samples, respectively. In order to obtain bio-oils with better quality, the catalytic pyrolysis process may be a better option than hydrogenation due to the effective reduction of oxygenated compound concentrations and the lower cost of the process, when hydrogen is not required to promote deoxygenation in the catalytic pyrolysis process.
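Quantification against an analytical curve with an internal standard, as described for the GC×GC-TOFMS method, reduces to fitting a calibration line and back-calculating concentrations. The concentrations and response ratios below are invented for illustration:

```python
import numpy as np

# Hypothetical calibration standards: concentration (ng/uL) vs.
# analyte/internal-standard peak-area ratio.
conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0])
resp = np.array([0.11, 0.52, 1.04, 2.55, 5.02])

slope, intercept = np.polyfit(conc, resp, 1)   # linear calibration curve

def quantify(area_ratio):
    """Back-calculate concentration from the calibration line."""
    return (area_ratio - intercept) / slope

# Accuracy check as a recovery experiment: spike at a known level and
# compare the measured value (with a small invented bias) to it.
spiked = 20.0
measured = quantify(slope * spiked + intercept + 0.02)
recovery = 100.0 * measured / spiked           # % recovery
```

The 70-130% recovery window reported in the abstract is the acceptance band such a check would be judged against.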
Crock, J.G.; Smith, D.B.; Yager, T.J.B.; Berry, C.J.; Adams, M.G.
2009-01-01
Since late 1993, Metro Wastewater Reclamation District of Denver (Metro District), a large wastewater treatment plant in Denver, Colo., has applied Grade I, Class B biosolids to about 52,000 acres of nonirrigated farmland and rangeland near Deer Trail, Colo. (U.S.A.). In cooperation with the Metro District in 1993, the U.S. Geological Survey (USGS) began monitoring groundwater at part of this site. In 1999, the USGS began a more comprehensive monitoring study of the entire site to address stakeholder concerns about the potential chemical effects of biosolids applications to water, soil, and vegetation. This more comprehensive monitoring program has recently been extended through 2010. Monitoring components of the more comprehensive study include biosolids collected at the wastewater treatment plant, soil, crops, dust, alluvial and bedrock groundwater, and stream-bed sediment. Streams at the site are dry most of the year, so samples of stream-bed sediment deposited after rain were used to indicate surface-water effects. This report will present only analytical results for the biosolids samples collected at the Metro District wastewater treatment plant in Denver and analyzed during 2008. Crock and others have presented earlier a compilation of analytical results for the biosolids samples collected and analyzed for 1999 thru 2006, and in a separate report, data for the 2007 biosolids are reported. More information about the other monitoring components is presented elsewhere in the literature. Priority parameters for biosolids identified by the stakeholders and also regulated by Colorado when used as an agricultural soil amendment include the total concentrations of nine trace elements (arsenic, cadmium, copper, lead, mercury, molybdenum, nickel, selenium, and zinc), plutonium isotopes, and gross alpha and beta activity. Nitrogen and chromium also were priority parameters for groundwater and sediment components.
Di, Xin; Shellie, Robert A; Marriott, Philip J; Huie, Carmen W
2004-04-01
The coupling of headspace solid-phase microextraction (HS-SPME) with comprehensive two-dimensional gas chromatography (GC×GC) was shown to be a powerful technique for the rapid sampling and analysis of volatile oils in complex herbal materials. When compared to one-dimensional (1-D) GC, the improved analytical capabilities of GC×GC in terms of increased detection sensitivity and separation power were demonstrated by using HS-SPME/GC×GC for the chemical profiling (fingerprinting) of essential/volatile oils contained in herbal materials of increasing analytical complexity. More than 20 marker compounds belonging to Panax quinquefolius (American ginseng) can be observed within the 2-D contour plots of ginseng itself, a mixture of ginseng and another important herb (P. quinquefolius/Radix angelicae sinensis), as well as a mixture of ginseng and three other herbs (P. quinquefolius/R. angelicae sinensis/R. astragali/R. rehmanniae preparata). Such analytical capabilities should be important towards the authentication and quality control of herbal products, which are receiving increasing attention as alternative medicines worldwide. In particular, the presence of Panax in the herb formulation could be readily identified through its specific peak pattern in the 2-D GC×GC plot.
DEVELOPING THE TRANSDISCIPLINARY AGING RESEARCH AGENDA: NEW DEVELOPMENTS IN BIG DATA.
Callaghan, Christian William
2017-07-19
In light of dramatic advances in big data analytics and the application of these advances in certain scientific fields, new potentialities exist for breakthroughs in aging research. Translating these new potentialities to research outcomes for aging populations, however, remains a challenge, as underlying technologies which have enabled exponential increases in 'big data' have not yet enabled a commensurate era of 'big knowledge,' or similarly exponential increases in biomedical breakthroughs. Debates also reveal differences in the literature, with some arguing big data analytics heralds a new era associated with the 'end of theory' or which makes the scientific method obsolete, where correlation supersedes causation, whereby science can advance without theory and hypothesis testing. On the other hand, others argue theory cannot be subordinate to data, no matter how comprehensive data coverage can ultimately become. Given these two tensions, namely between exponential increases in data absent exponential increases in biomedical research outputs, and between the promise of comprehensive data coverage and data-driven inductive versus theory-driven deductive modes of enquiry, this paper seeks to provide a critical review of certain theory and literature that offers useful perspectives on certain developments in big data analytics and their theoretical implications for aging research.
Tebani, Abdellah; Afonso, Carlos; Bekri, Soumeya
2018-05-01
Metabolites are small molecules produced by enzymatic reactions in a given organism. Metabolomics or metabolic phenotyping is a well-established omics aimed at comprehensively assessing metabolites in biological systems. These comprehensive analyses use analytical platforms, mainly nuclear magnetic resonance spectroscopy and mass spectrometry, along with associated separation methods to gather qualitative and quantitative data. Metabolomics holistically evaluates biological systems in an unbiased, data-driven approach that may ultimately support generation of hypotheses. The approach inherently allows the molecular characterization of a biological sample with regard to both internal (genetics) and environmental (exosome, microbiome) influences. Metabolomics workflows are based on whether the investigator knows a priori what kind of metabolites to assess. Thus, a targeted metabolomics approach is defined as a quantitative analysis (absolute concentrations are determined) or a semiquantitative analysis (relative intensities are determined) of a set of metabolites that are possibly linked to common chemical classes or a selected metabolic pathway. An untargeted metabolomics approach is a semiquantitative analysis of the largest possible number of metabolites contained in a biological sample. This is part I of a review intending to give an overview of the state of the art of major metabolic phenotyping technologies. Furthermore, their inherent analytical advantages and limits regarding experimental design, sample handling, standardization and workflow challenges are discussed.
Recent advances in analytical satellite theory
NASA Technical Reports Server (NTRS)
Gaposchkin, E. M.
1978-01-01
Recent work on analytical satellite perturbation theory has involved the completion of a revision to 4th order for zonal harmonics, the addition of a treatment for ocean tides, an extension of the treatment for the noninertial reference system, and the completion of a theory for direct solar-radiation pressure and earth-albedo pressure. Combined with a theory for tesseral-harmonics, lunisolar, and body-tide perturbations, these formulations provide a comprehensive orbit-computation program. Detailed comparisons with numerical integration and observations are presented to assess the accuracy of each theoretical development.
NASA Technical Reports Server (NTRS)
Meyer, Tom; Zubrin, Robert
1997-01-01
The first phase of the research includes a comprehensive analytical study examining the potential applications for engineering subsystems and mission strategies made possible by such RWGS-based subsystems, as well as an experimental demonstration and performance characterization of a full-scale brassboard RWGS working unit. The laboratory demonstration unit will not yet be operational by the time of this presentation, but we will present the results of our analytical studies to date and plans for the ongoing work.
MetaMetaDB: a database and analytic system for investigating microbial habitability.
Yang, Ching-chia; Iwasaki, Wataru
2014-01-01
MetaMetaDB (http://mmdb.aori.u-tokyo.ac.jp/) is a database and analytic system for investigating microbial habitability, i.e., how a prokaryotic group can inhabit different environments. The interaction between prokaryotes and the environment is a key issue in microbiology because distinct prokaryotic communities maintain distinct ecosystems. Because 16S ribosomal RNA (rRNA) sequences play pivotal roles in identifying prokaryotic species, a system that comprehensively links diverse environments to the 16S rRNA sequences of their inhabitant prokaryotes is necessary for a systematic understanding of microbial habitability. However, existing databases are biased toward culturable prokaryotes and are limited in comprehensiveness because most prokaryotes are unculturable. Recently, metagenomic and 16S rRNA amplicon sequencing approaches have generated abundant 16S rRNA sequence data that encompass unculturable prokaryotes across diverse environments; however, these data are usually buried in large databases and are difficult to access. In this study, we developed MetaMetaDB (Meta-Metagenomic DataBase), which comprehensively and compactly covers 16S rRNA sequences retrieved from public datasets. Using MetaMetaDB, users can quickly generate hypotheses regarding the types of environments a prokaryotic group may be adapted to. We anticipate that MetaMetaDB will improve our understanding of the diversity and evolution of prokaryotes.
Comprehensive genetic testing for female and male infertility using next-generation sequencing.
Patel, Bonny; Parets, Sasha; Akana, Matthew; Kellogg, Gregory; Jansen, Michael; Chang, Chihyu; Cai, Ying; Fox, Rebecca; Niknazar, Mohammad; Shraga, Roman; Hunter, Colby; Pollock, Andrew; Wisotzkey, Robert; Jaremko, Malgorzata; Bisignano, Alex; Puig, Oscar
2018-05-19
To develop a comprehensive genetic test for female and male infertility in support of medical decisions during assisted reproductive technology (ART) protocols. We developed a next-generation sequencing (NGS) gene panel consisting of 87 genes including promoters, 5' and 3' untranslated regions, exons, and selected introns. In addition, sex chromosome aneuploidies and Y chromosome microdeletions were analyzed concomitantly using the same panel. The NGS panel was analytically validated by retrospective analysis of 118 genomic DNA samples with known variants in loci representative of female and male infertility. Our results showed analytical accuracy of > 99%, with > 98% sensitivity for single-nucleotide variants (SNVs) and > 91% sensitivity for insertions/deletions (indels). Clinical sensitivity was assessed with samples containing variants representative of male and female infertility, and it was 100% for SNVs/indels, CFTR IVS8-5T variants, sex chromosome aneuploidies, and copy number variants (CNVs) and > 93% for Y chromosome microdeletions. Cost analysis shows potential savings when comparing this single NGS assay with the standard approach, which includes multiple assays. A single, comprehensive, NGS panel can simplify the ordering process for healthcare providers, reduce turnaround time, and lower the overall cost of testing for genetic assessment of infertility in females and males, while maintaining accuracy.
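As a rough illustration of the validation metrics quoted above (sensitivity against a truth set, analytical accuracy), the following sketch uses hypothetical counts, not the study's data:

```python
def sensitivity(tp, fn):
    """True-positive rate: fraction of known variants the assay detects."""
    return tp / (tp + fn)

def analytical_accuracy(correct_calls, total_calls):
    """Fraction of all calls (positive and negative) matching the truth set."""
    return correct_calls / total_calls

# Hypothetical tallies from a truth-set comparison:
snv_sens = sensitivity(tp=490, fn=8)     # illustrates "> 98%" SNV sensitivity
indel_sens = sensitivity(tp=92, fn=8)    # illustrates "> 91%" indel sensitivity
accuracy = analytical_accuracy(9950, 10000)

print(f"SNV sensitivity:   {snv_sens:.3f}")
print(f"Indel sensitivity: {indel_sens:.3f}")
print(f"Accuracy:          {accuracy:.3f}")
```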
Nicolotti, Luca; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Sgorbini, Barbara; Rubiolo, Patrizia; Bicchi, Carlo
2013-10-10
The study proposes an investigation strategy that simultaneously provides detailed profiling and quantitative fingerprinting of food volatiles, through a "comprehensive" analytical platform that includes sample preparation by Headspace Solid Phase Microextraction (HS-SPME), separation by two-dimensional comprehensive gas chromatography coupled with mass spectrometry detection (GC×GC-MS) and data processing using advanced fingerprinting approaches. Experiments were carried out on roasted hazelnuts and on Gianduja pastes (sugar, vegetable oil, hazelnuts, cocoa, nonfat dried milk, vanilla flavorings) and demonstrated that the information potential of each analysis can better be exploited if suitable quantitation methods are applied. Quantitation approaches through Multiple Headspace Extraction and Standard Addition were compared in terms of performance parameters (linearity, precision, accuracy, Limit of Detection and Limit of Quantitation) under headspace linearity conditions. The results on 19 key analytes, potent odorants, and technological markers, and more than 300 fingerprint components, were used for further processing to obtain information concerning the effect of the matrix on volatile release, and to produce an informative chemical blueprint for use in sensomics and flavoromics. The importance of quantitation approaches in headspace analysis of solid matrices of complex composition, and the advantages of MHE, are also critically discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
Elbadawi, Abdulateef; Mirghani, Hyder
2016-01-01
Comprehensive correct HIV/AIDS knowledge (CCAK) is defined as correctly identifying the two major ways of preventing the sexual transmission of HIV and rejecting the most common misconceptions about HIV transmission. There are limited studies on this topic in Sudan. In this study we investigated comprehensive correct HIV/AIDS knowledge among university students. A cross-sectional analytic study was conducted among 556 students from two universities in 2014. Data were collected using a self-administered, pre-tested, structured questionnaire. The chi-square test was used to assess significance, and a P value of ≤ 0.05 was considered statistically significant. The majority (97.1%) of study subjects had heard of a disease called HIV/AIDS, while only 28.6% of them knew anyone infected with AIDS in the local community. Only a minority (13.8%) of students had CCAK; however, males showed a better level of CCAK than females (OR = 2.77), a highly significant difference (P = 0.001). A poor rate of CCAK among university students was noted, especially among females. Almost half of the students did not know the preventive measures for HIV, nearly two thirds held misconceptions, and about one third did not know the mode of transmission of HIV.
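The sex comparison reported above (OR = 2.77, chi-square testing) comes from a 2×2 contingency table. A minimal sketch of both statistics follows; the counts are hypothetical, chosen only to illustrate the computation:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table [[a, b], [c, d]]
    (rows: males/females; columns: with CCAK / without CCAK)."""
    return (a * d) / (b * c)

def chi_square(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table (1 degree of freedom)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts (not the study's raw data):
a, b = 50, 250   # males:   with CCAK, without
c, d = 18, 250   # females: with CCAK, without
or_mf = odds_ratio(a, b, c, d)
chi2 = chi_square(a, b, c, d)
# chi2 is compared against 3.84, the 1-df critical value at alpha = 0.05.
```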
NASA Astrophysics Data System (ADS)
Bagni, T.; Duchateau, J. L.; Breschi, M.; Devred, A.; Nijhuis, A.
2017-09-01
Cable-in-conduit conductors (CICCs) for ITER magnets are subjected to fast changing magnetic fields during the plasma-operating scenario. In order to anticipate the limitations of conductors under the foreseen operating conditions, it is essential to have a better understanding of the stability margin of magnets. In the last decade ITER has launched a campaign for characterization of several types of NbTi and Nb3Sn CICCs, comprising quench tests with a single sine-wave fast magnetic field pulse of relatively small amplitude. The stability tests, performed in the SULTAN facility, were reproduced and analyzed using two codes: JackPot-AC/DC, an electromagnetic-thermal numerical model for CICCs developed at the University of Twente (van Lanen and Nijhuis 2010 Cryogenics 50 139-148), and the multi-constant-model (MCM) (Turck and Zani 2010 Cryogenics 50 443-9), an analytical model for CICC coupling losses. The outputs of both codes were combined with thermal, hydraulic and electric analysis of superconducting cables to predict the minimum quench energy (MQE) (Bottura et al 2000 Cryogenics 40 617-26). The experimental AC loss results were used to calibrate the JackPot and MCM models and to reproduce the energy deposited in the cable during an MQE test. The agreement between experiments and models confirms a good comprehension of the various thermal and electromagnetic phenomena in CICCs. The differences between the analytical MCM and numerical JackPot approaches are discussed. The results provide a good basis for further investigation of CICC stability under plasma scenario conditions using magnetic field pulses with lower ramp rate and higher amplitude.
Eco-analytical Methodology in Environmental Problems Monitoring
NASA Astrophysics Data System (ADS)
Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.
2017-01-01
Among the problems common to all mankind, whose solutions influence the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on an eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists develops quite spontaneously, as a complement to productive activities. Therefore, the challenge posed by environmental problems for science is the formation of eco-analytical reasoning and the monitoring of global problems common to the whole of humanity. The aim is thus to find an optimal trajectory of industrial development that prevents irreversible problems in the biosphere that could halt the progress of civilization.
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward
2016-01-01
In 2014, a team of researchers, engineers, and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the goal of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid-, and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.
Achieving Cost Reduction Through Data Analytics.
Rocchio, Betty Jo
2016-10-01
The reimbursement structure of the US health care system is shifting from a volume-based system to a value-based system. Adopting a comprehensive data analytics platform has become important to health care facilities, in part to navigate this shift. Hospitals generate plenty of data, but actionable analytics are necessary to help personnel interpret and apply data to improve practice. Perioperative services is an important revenue-generating department for hospitals, and each perioperative service line requires a tailored approach to be successful in managing outcomes and controlling costs. Perioperative leaders need to prepare to use data analytics to reduce variation in supplies, labor, and overhead. Mercy, based in Chesterfield, Missouri, adopted a perioperative dashboard that helped perioperative leaders collaborate with surgeons and perioperative staff members to organize and analyze health care data, which ultimately resulted in significant cost savings. Copyright © 2016 AORN, Inc. Published by Elsevier Inc. All rights reserved.
Little, Callie W; Haughbrook, Rasheda; Hart, Sara A
2017-01-01
Numerous twin studies have examined the genetic and environmental etiology of reading comprehension, though it is likely that etiological estimates are influenced by unidentified sample conditions (e.g. Tucker-Drob and Bates, Psychol Sci:0956797615612727, 2015). The purpose of this meta-analysis was to average the etiological influences on reading comprehension and to explore the potential moderators influencing these estimates. Results revealed an average heritability estimate of h² = 0.59, with significant variation in estimates across studies, suggesting potential moderation. Moderation results indicated publication year, grade level, project, zygosity methods, and response type moderated heritability estimates. The average shared environmental estimate was c² = 0.16, with publication year, grade and zygosity methods acting as significant moderators. These findings support the role of genetics in reading comprehension, and a small significant role of shared environmental influences. The results suggest that our interpretation of how genes and environments influence reading comprehension should reflect aspects of study and sample.
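Meta-analytic averaging of estimates such as heritability across studies is commonly done with inverse-variance weights. A minimal fixed-effect sketch, with hypothetical per-study values (not the meta-analysis's actual inputs):

```python
def inverse_variance_mean(estimates, variances):
    """Fixed-effect meta-analytic average: weight each study's estimate
    by the inverse of its sampling variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(e * w for e, w in zip(estimates, weights)) / total

# Hypothetical per-study heritability estimates and sampling variances:
h2_estimates = [0.52, 0.61, 0.65, 0.58]
h2_variances = [0.004, 0.009, 0.016, 0.006]
pooled_h2 = inverse_variance_mean(h2_estimates, h2_variances)
# More precise studies (smaller variance) pull the pooled estimate harder.
```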
Impact of comprehensive two-dimensional gas chromatography with mass spectrometry on food analysis.
Tranchida, Peter Q; Purcaro, Giorgia; Maimone, Mariarosa; Mondello, Luigi
2016-01-01
Comprehensive two-dimensional gas chromatography with mass spectrometry has been on the separation-science scene for about 15 years. This three-dimensional method has made a great positive impact on various fields of research, and among these that related to food analysis is certainly at the forefront. The present critical review is based on the use of comprehensive two-dimensional gas chromatography with mass spectrometry in the untargeted (general qualitative profiling and fingerprinting) and targeted analysis of food volatiles; attention is focused not only on its potential in such applications, but also on how recent advances in comprehensive two-dimensional gas chromatography with mass spectrometry will potentially be important for food analysis. Additionally, emphasis is devoted to the many instances in which straightforward gas chromatography with mass spectrometry is a sufficiently-powerful analytical tool. Finally, possible future scenarios in the comprehensive two-dimensional gas chromatography with mass spectrometry food analysis field are discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The Urban Intensive Land-use Evaluation in Xi’an, Based on Fuzzy Comprehensive Evaluation
NASA Astrophysics Data System (ADS)
Shi, Ru; Kang, Zhiyuan
2018-01-01
Intensive land-use is the basis of urban “stock optimization”, and scientific, reasonable evaluation is an important part of intensive land utilization. In this paper, based on a survey of land-use conditions in Xi’an, we construct a suitable evaluation index system for Xi’an’s intensive land-use using a combination of the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). Through analysis of the factors influencing intensive land utilization, we provide a reference for the future development direction.
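A minimal sketch of the AHP/FCE combination described above, using a hypothetical three-indicator index system and made-up expert judgments; the row geometric-mean approximation stands in for the full principal-eigenvector method:

```python
from math import prod

def ahp_weights(pairwise):
    """Approximate AHP priority vector via normalized row geometric means
    (a common stand-in for the principal-eigenvector method)."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def fuzzy_composite(weights, membership):
    """Weighted aggregation of a fuzzy membership matrix:
    membership[i][j] = degree to which indicator i belongs to grade j."""
    grades = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(grades)]

# Hypothetical 3-indicator index system (e.g. plot ratio, output per hectare,
# infrastructure density); pairwise judgments on Saaty's 1-9 scale:
pairwise = [[1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0]]
w = ahp_weights(pairwise)

# Expert ratings: each indicator's membership in the grades
# (high, medium, low intensity); rows sum to 1:
membership = [[0.6, 0.3, 0.1],
              [0.4, 0.4, 0.2],
              [0.2, 0.5, 0.3]]
scores = fuzzy_composite(w, membership)  # city-level grade memberships
```

The grade with the largest composite membership would be taken as the overall evaluation result.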
Li, Yuanyuan; Xie, Yanming; Fu, Yingkun
2011-10-01
Many studies have been launched on the safety, efficacy, and economics of post-marketing Chinese patent medicine (CPM), but a comprehensive interpretation is lacking. Establishing a risk evaluation index system and risk assessment model for CPM is key to solving drug safety problems and protecting people's health. The clinical risk factors of CPM share similarities with those of Western medicine, so foreign experience can be drawn upon, but they also have their own multi-factor, multivariate, multi-level complex features. Given the uncertainty and complexity of drug safety risk assessment, using the analytic hierarchy process (AHP) to assign index weights and an AHP-based fuzzy neural network to build a post-marketing CPM risk evaluation index system and risk assessment model, while continuously improving the application of traditional Chinese medicine characteristics, is a feasible and beneficial line of exploration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rim, Jung H.; Kuhn, Kevin J.; Tandon, Lav
Nuclear forensics techniques, including micro-XRF, gamma spectrometry, trace elemental analysis and isotopic/chronometric characterization, were used to interrogate two potentially related plutonium metal foils. These samples were submitted for analysis with only limited production information, and a comprehensive suite of forensic analyses was performed. The resulting analytical data was paired with available reactor model and historical information to provide insight into the materials’ properties, origins, and likely intended uses. Both were super-grade plutonium, containing less than 3% 240Pu, and age-dating suggested that the most recent chemical purification occurred in 1948 and 1955 for the respective metals. Additional consideration of reactor modelling feedback and trace elemental observables indicates plausible U.S. reactor origin associated with the Hanford site production efforts. In conclusion, based on this investigation, the most likely intended use for these plutonium foils was as 239Pu fission foil targets for physics experiments, such as cross-section measurements.
Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.
Sakamoto, Takuto
2016-01-01
Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level.
Nature of motor control: perspectives and issues.
Turvey, Michael T; Fonseca, Sergio
2009-01-01
Four perspectives on motor control provide the framework for developing a comprehensive theory of motor control in biological systems. The four perspectives, of decreasing orthodoxy, are distinguished by their sources of inspiration: neuroanatomy, robotics, self-organization, and ecological realities. Twelve major issues that commonly constrain (either explicitly or implicitly) the understanding of the control and coordination of movement are identified and evaluated within the framework of the four perspectives. The issues are as follows: (1) Is control strictly neural? (2) Is there a divide between planning and execution? (3) Does control entail a frequently involved knowledgeable executive? (4) Do analytical internal models mediate control? (5) Is anticipation necessarily model dependent? (6) Are movements preassembled? (7) Are the participating components context independent? (8) Is force transmission strictly myotendinous? (9) Is afference a matter of local linear signaling? (10) Is neural noise an impediment? (11) Do standard variables (of mechanics and physiology) suffice? (12) Is the organization of control hierarchical?
Identification of Upper and Lower Level Yield Strength in Materials
Valíček, Jan; Harničárová, Marta; Kopal, Ivan; Palková, Zuzana; Kušnerová, Milena; Panda, Anton; Šepelák, Vladimír
2017-01-01
This work evaluates the possibility of identifying mechanical parameters, especially upper and lower yield points, by the analytical processing of specific elements of the topography of surfaces generated with abrasive waterjet technology. We developed a new system of equations, which are connected with each other in such a way that the result of a calculation is a comprehensive mathematical–physical model, which describes numerically as well as graphically the deformation process of material cutting using an abrasive waterjet. The results of our model have been successfully checked against those obtained by means of a tensile test. The main prospect for future applications of the method presented in this article concerns the identification of mechanical parameters associated with the prediction of material behavior. The findings of this study can contribute to a more detailed understanding of the relationships: material properties—tool properties—deformation properties. PMID:28832526
Performance optimization of helicopter rotor blades
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.
1991-01-01
As part of a center-wide activity at NASA Langley Research Center to develop multidisciplinary design procedures by accounting for discipline interactions, a performance design optimization procedure is developed. The procedure optimizes the aerodynamic performance of rotor blades by selecting the point of taper initiation, root chord, taper ratio, and maximum twist which minimize hover horsepower while not degrading forward flight performance. The procedure uses HOVT (a strip theory momentum analysis) to compute the horsepower required for hover and the comprehensive helicopter analysis program CAMRAD to compute the horsepower required for forward flight and maneuver. The optimization algorithm consists of the general purpose optimization program CONMIN and approximate analyses. Sensitivity analyses consisting of derivatives of the objective function and constraints are carried out by forward finite differences. The procedure is applied to a test problem which is an analytical model of a wind tunnel model of a utility rotor blade.
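The forward-finite-difference sensitivities mentioned above can be sketched as follows; the "hover power" function below is a toy stand-in, not the HOVT/CAMRAD analysis:

```python
def forward_difference_gradient(f, x, h=1e-6):
    """Approximate df/dx_i at x using forward finite differences, the same
    scheme used for the objective/constraint sensitivities fed to CONMIN."""
    fx = f(x)
    grad = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h                      # perturb one design variable
        grad.append((f(xp) - fx) / h)   # one extra function evaluation per variable
    return grad

# Toy stand-in for hover horsepower as a function of two design variables
# (taper ratio, root chord); purely illustrative:
def hover_power(x):
    taper, chord = x
    return (taper - 0.5) ** 2 + 3.0 * (chord - 1.2) ** 2

g = forward_difference_gradient(hover_power, [0.8, 1.0])
# The analytic gradient at (0.8, 1.0) is (0.6, -1.2); the forward-difference
# estimate matches to within O(h).
```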
Characteristic analysis on UAV-MIMO channel based on normalized correlation matrix.
Gao, Xi jun; Chen, Zi li; Hu, Yong Jiang
2014-01-01
Based on the three-dimensional GBSBCM (geometrically based double bounce cylinder model) channel model of MIMO for unmanned aerial vehicle (UAV), the simple form of UAV space-time-frequency channel correlation function which includes the LOS, SPE, and DIF components is presented. By the methods of channel matrix decomposition and coefficient normalization, the analytic formula of UAV-MIMO normalized correlation matrix is deduced. This formula can be used directly to analyze the condition number of UAV-MIMO channel matrix, the channel capacity, and other characteristic parameters. The simulation results show that this channel correlation matrix can be applied to describe the changes of UAV-MIMO channel characteristics under different parameter settings comprehensively. This analysis method provides a theoretical basis for improving the transmission performance of UAV-MIMO channel. The development of MIMO technology shows practical application value in the field of UAV communication.
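For a real 2×2 channel matrix, the condition number and equal-power capacity referred to above can be computed in closed form from the Gram matrix H^T H. A minimal sketch with hypothetical matrices (not the GBSBCM model's output):

```python
from math import log2, sqrt

def gram_eigs(h):
    """Closed-form eigenvalues of the Gram matrix G = H^T H for real 2x2 H."""
    (a, b), (c, d) = h
    g11, g22, g12 = a*a + c*c, b*b + d*d, a*b + c*d
    tr, det = g11 + g22, g11 * g22 - g12 * g12
    disc = sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def condition_number(h):
    """Ratio of largest to smallest singular value of H."""
    lmax, lmin = gram_eigs(h)
    return sqrt(lmax / lmin)

def capacity(h, snr, nt=2):
    """MIMO capacity (bit/s/Hz) with equal power split over nt transmit antennas."""
    return sum(log2(1 + snr / nt * lam) for lam in gram_eigs(h))

# Hypothetical normalized 2x2 channels: identity (uncorrelated branches)
# versus a strongly correlated channel:
h_ideal = [[1.0, 0.0], [0.0, 1.0]]
h_corr = [[1.0, 0.7], [0.7, 1.0]]
# A larger condition number signals a poorly conditioned channel matrix
# and, at the same SNR, a lower capacity.
```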
Reddy, Sunita; Mary, Immaculate
2013-01-01
The Rajiv Aarogyasri Community Health Insurance (RACHI) scheme in Andhra Pradesh (AP) has been a very popular social insurance scheme, built on a public-private partnership model, for dealing with the problem of catastrophic medical expenditure on tertiary-level care for poor households. A brief analysis of the RACHI scheme based on officially available data and media reports has been undertaken from a public health perspective to understand the nature and financing of the partnership and the lessons it provides. Analysis of the annual budget spent on surgeries in private hospitals compared to tertiary public hospitals shows that the current scheme is not sustainable and poses a huge burden on the state exchequer. The private hospital associations in AP, furthermore, act as pressure groups to increase the budget or threaten to withdraw services. Thus, profits are privatized and losses are socialized.
Reading comprehension of deaf students in regular education.
Luccas, Marcia Regina Zemella; Chiari, Brasília Maria; Goulart, Bárbara Niegia Garcia de
2012-01-01
To evaluate and compare the reading comprehension of deaf students included in regular classrooms of public schools with and without specialized educational support. Observational analytic study with 35 students with sensorineural hearing loss, with and without educational support. All subjects were assessed with the Word Reading Competence Test (WRCT), the Picture-Print Matching Test by Choice (PPMT-C), and the Sentence Reading Comprehension Test (SRCT). In the tests regarding comprehension of words (WRCT and PPMT-C), the results showed no difference in the performance of deaf students who attend and do not attend educational support. Regarding reading comprehension of sentences, the application of the SRCT also did not show differences between the groups of deaf students. A significant correlation was found between age and grade, indicating that the older the students and the higher their educational level, the better their performance in reading sentences. The results indicate that deaf students, regardless of attending educational support, read words better than sentences. There is no difference in reading comprehension between deaf students who receive and do not receive specialized pedagogical monitoring.
Hellmuth, Christian; Weber, Martina; Koletzko, Berthold; Peissner, Wolfgang
2012-02-07
Despite their central importance for lipid metabolism, straightforward quantitative methods for determination of nonesterified fatty acid (NEFA) species are still missing. The protocol presented here provides unbiased quantitation of plasma NEFA species by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simple deproteination of plasma in organic solvent solution yields high accuracy, including both the unbound and initially protein-bound fractions, while avoiding interferences from hydrolysis of esterified fatty acids from other lipid classes. Sample preparation is fast and nonexpensive, hence well suited for automation and high-throughput applications. Separation of isotopologic NEFA is achieved using ultrahigh-performance liquid chromatography (UPLC) coupled to triple quadrupole LC-MS/MS detection. In combination with automated liquid handling, total assay time per sample is less than 15 min. The analytical spectrum extends beyond readily available NEFA standard compounds by a regression model predicting all the relevant analytical parameters (retention time, ion path settings, and response factor) of NEFA species based on chain length and number of double bonds. Detection of 50 NEFA species and accurate quantification of 36 NEFA species in human plasma is described, the highest numbers ever reported for a LC-MS application. Accuracy and precision are within widely accepted limits. The use of qualifier ions supports unequivocal analyte verification. © 2012 American Chemical Society
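A regression model predicting analytical parameters (e.g. retention time) from chain length and number of double bonds, as described above, can be sketched as an ordinary least-squares fit; all calibration numbers below are hypothetical:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = b0 + b1*x1 + b2*x2 via normal equations,
    solved with Gaussian elimination (partial pivoting)."""
    rows = [[1.0, x1, x2] for x1, x2 in xs]
    m = len(rows[0])
    # Build X^T X and X^T y
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(m)] for i in range(m)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(m)]
    # Forward elimination
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, m):
            f = xtx[r][col] / xtx[col][col]
            xtx[r] = [a - f * b for a, b in zip(xtx[r], xtx[col])]
            xty[r] -= f * xty[col]
    # Back substitution
    beta = [0.0] * m
    for i in reversed(range(m)):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j] for j in range(i + 1, m))) / xtx[i][i]
    return beta

def predict(beta, chain_len, double_bonds):
    return beta[0] + beta[1] * chain_len + beta[2] * double_bonds

# Hypothetical calibration set: (chain length, double bonds) -> retention time (min)
train_x = [(14, 0), (16, 0), (16, 1), (18, 0), (18, 1), (18, 2), (20, 4)]
train_y = [6.2, 7.8, 7.1, 9.4, 8.7, 8.0, 8.2]
beta = fit_linear(train_x, train_y)
rt_22_6 = predict(beta, 22, 6)  # extrapolate to a species outside the standards set
```

The same idea extends to the other predicted parameters (ion path settings, response factors) by fitting one model per parameter.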
The Politics of American Education
ERIC Educational Resources Information Center
Spring, Joel
2010-01-01
Turning his distinctive analytical lens to the politics of American education, the author looks at contemporary educational policy issues from theoretical, practical, and historical perspectives. This comprehensive overview documents and explains who influences educational policy and how, bringing to life the realities of schooling in the 21st…
Critical Thinking for the New Millennium: A Pedagogical Imperative.
ERIC Educational Resources Information Center
Lee, Andrew Ann Dinkins
The pedagogical imperative to prepare students to become critical thinkers, critical readers, and critical writers for the coming millennium necessitates a comprehensive college discourse on critical thinking. The paper cites seminars and workshops that incorporate theoretical and practical dimensions of teaching critical-analytical thinking…
Nonlinear Response Of MSSS Bridges Under Earthquake Ground Motions: Case Studies
DOT National Transportation Integrated Search
1999-10-01
This report presents the results of the second phase of a comprehensive analytical study on the seismic response of highway bridges in New Jersey. The overall objective of this phase of the study was to evaluate the nonlinear seismic response of actu...
Thinking graphically: Connecting vision and cognition during graph comprehension.
Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A
2008-03-01
Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved
Chang, Chia-Ling; Chao, Yu-Chi
2012-05-01
Every year, Taiwan endures typhoons and earthquakes; these natural hazards often induce landslides and debris flows. Therefore, watershed management strategies must consider the environmental vulnerabilities of local basins. Because many factors affect basin ecosystems, this study applied multiple criteria analysis and the analytical hierarchy process (AHP) to evaluate seven criteria in three phases (geographic phase, hydrologic phase, and societal phase). This study focused on five major basins in Taiwan: the Tan-Shui River Basin, the Ta-Chia River Basin, the Cho-Shui River Basin, the Tseng-Wen River Basin, and the Kao-Ping River Basin. The objectives were a comprehensive examination of the environmental characteristics of these basins and a comprehensive assessment of their environmental vulnerabilities. The results of a survey and AHP analysis showed that landslide area is the most important factor for basin environmental vulnerability. Of all these basins, the Cho-Shui River Basin in central Taiwan has the greatest environmental vulnerability.
Wang, Bing; Shen, Hao; Fang, Aiqin; Huang, De-Shuang; Jiang, Changjun; Zhang, Jun; Chen, Peng
2016-06-17
Comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC/TOF-MS) has become a key analytical technology in high-throughput analysis. Retention indices have proven helpful for compound identification in one-dimensional gas chromatography, and the same holds for two-dimensional gas chromatography. In this work, a novel regression model was proposed for calculating the second-dimension retention index of target components, with n-alkanes used as reference compounds. This model depicts the relationship among adjusted second-dimension retention time, temperature of the second-dimension column, and carbon number of n-alkanes by an exponential nonlinear function with only five parameters. Three different criteria were introduced to find the optimal parameter values. The performance of this model was evaluated using experimental data for n-alkanes (C7-C31) at 24 temperatures, covering the whole 0-6 s adjusted retention time range. The experimental results show that the mean relative error between predicted adjusted retention times and experimental data for n-alkanes was only 2%. Furthermore, the proposed model demonstrates good extrapolation capability for predicting adjusted retention times of target compounds that lie outside the range of the reference compounds in the second-dimension adjusted retention time space. The deviation was less than 9 retention index units (iu) even when as few as 5 alkanes were used as references. The performance of the model has also been demonstrated by analyzing a mixture of compounds in temperature-programmed experiments. Copyright © 2016 Elsevier B.V. All rights reserved.
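The abstract does not give the paper's exact five-parameter function, but the role of the n-alkane references in producing a retention index can be sketched with the standard linear-interpolation formula used for temperature-programmed chromatography (the function name and numbers below are illustrative, not from the paper):

```python
def retention_index(t_x, alkane_times):
    """Linear retention index of an analyte with adjusted retention
    time t_x, interpolated between the bracketing n-alkane references.

    alkane_times: dict mapping carbon number -> adjusted retention time.
    """
    carbons = sorted(alkane_times)
    for n, n_next in zip(carbons, carbons[1:]):
        t_n, t_next = alkane_times[n], alkane_times[n_next]
        if t_n <= t_x <= t_next:
            # Interpolate between C_n (index 100*n) and the next alkane.
            return 100 * (n + (n_next - n) * (t_x - t_n) / (t_next - t_n))
    raise ValueError("t_x lies outside the reference alkane range")

# An analyte eluting midway between C7 and C8 gets index 750.
ri = retention_index(1.5, {7: 1.0, 8: 2.0, 9: 4.0})
```

The paper's contribution is effectively to replace the measured `alkane_times` table with a fitted five-parameter function of column temperature and carbon number, so that indices can be computed at temperatures where no alkane run exists.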
Chen, Shih-Chih; Liu, Shih-Chi; Li, Shing-Han; Yen, David C
2013-12-01
This study extends the Technology Acceptance Model (TAM) by incorporating relationship quality as a mediator to construct a comprehensive framework for understanding continuance intention toward a hospital e-appointment system. Data from a survey of 334 Taiwanese citizens, contacted via phone or the Internet, were analyzed with Structural Equation Modeling (SEM) for path analysis and hypothesis testing. The study shows that perceived ease of use (PEOU) and perceived usefulness (PU) have a significant influence on continuance intention through the mediation of relationship quality, consisting of satisfaction and trust. The direct impact of relationship quality on continuance intention is also significant. The analytical results reveal that the relationship between the hospital, patients, and e-appointment users can be improved by enhancing continued usage of the e-appointment system. This paper also proposes a general model that synthesizes PEOU, PU, and relationship quality to explain users' continuance intention toward e-appointments.
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
Harikrishnan, A R; Dhar, Purbarun; Gedupudi, Sateesh; Das, Sarit K
2018-04-12
We propose a comprehensive analysis and a quasi-analytical mathematical formalism to predict the surface tension and contact angles of complex surfactant-infused nanocolloids. The model rests on the foundations of the interaction potentials for the interfacial adsorption-desorption dynamics in complex multicomponent colloids. Surfactant-infused nanoparticle-laden interface problems are difficult to deal with because of the many-body interactions and interfaces involved at the meso-nanoscales. The model is based on the governing role of thermodynamic and chemical equilibrium parameters in modulating the interfacial energies. The influence of parameters such as the presence of surfactants, nanoparticles, and surfactant-capped nanoparticles on interfacial dynamics is revealed by the analysis. Solely based on the knowledge of interfacial properties of independent surfactant solutions and nanocolloids, the same can be deduced for complex surfactant-based nanocolloids through the proposed approach. The model accurately predicts the equilibrium surface tension and contact angle of complex nanocolloids available in the existing literature and present experimental findings.
Study on Influencing Factor Analysis and Application of Consumer Mobile Commerce Acceptance
NASA Astrophysics Data System (ADS)
Li, Gaoguang; Lv, Tingjie
Mobile commerce (MC) refers to e-commerce activities carried out using a mobile device such as a phone or PDA. With new technology, MC will grow rapidly in the near future. At present, which factors lead consumers to accept MC and which MC applications consumers find acceptable are two hot issues both for MC providers and for MC researchers. This study presents a proposed MC acceptance model that integrates perceived playfulness, perceived risk, and cost into the TAM to study which factors affect consumer MC acceptance. The proposed model includes five variables: perceived risk, cost, perceived usefulness, perceived playfulness, and perceived ease of use. The analytic hierarchy process (AHP) is then used to calculate the weights of the criteria in the proposed model. Finally, the study applies the fuzzy comprehensive evaluation method to assess the likelihood that MC applications will be accepted, and an MC application is empirically tested using data collected from a survey of MC consumers.
NASA Astrophysics Data System (ADS)
Chen, Xinzhong; Lo, Chiu Fan Bowen; Zheng, William; Hu, Hai; Dai, Qing; Liu, Mengkun
2017-11-01
Over the last decade, scattering-type scanning near-field optical microscopy and spectroscopy have been widely used in nano-photonics and material research due to their fine spatial resolution and broad spectral range. A number of simplified analytical models have been proposed to quantitatively understand the tip-scattered near-field signal. However, a rigorous interpretation of the experimental results is still lacking at this stage. Numerical modeling, on the other hand, is mostly done by simulating the local electric field slightly above the sample surface, which only qualitatively represents the near-field signal rendered by the tip-sample interaction. In this work, we performed a more comprehensive numerical simulation based on realistic experimental parameters and signal extraction procedures. By directly comparing to experiments as well as other simulation efforts, our method offers a more accurate quantitative description of the near-field signal, paving the way for future studies of complex systems at the nanoscale.
Melby-Lervåg, Monica; Lervåg, Arne
2014-03-01
We report a systematic meta-analytic review of studies comparing reading comprehension and its underlying components (language comprehension, decoding, and phonological awareness) in first- and second-language learners. The review included 82 studies, and 576 effect sizes were calculated for reading comprehension and underlying components. Key findings were that, compared to first-language learners, second-language learners display a medium-sized deficit in reading comprehension (pooled effect size d = -0.62), a large deficit in language comprehension (pooled effect size d = -1.12), but only small differences in phonological awareness (pooled effect size d = -0.08) and decoding (pooled effect size d = -0.12). A moderator analysis showed that characteristics related to the type of reading comprehension test reliably explained the variation in the differences in reading comprehension between first- and second-language learners. For language comprehension, studies of samples from low socioeconomic backgrounds and samples where only the first language was used at home generated the largest group differences in favor of first-language learners. Test characteristics and study origin reliably contributed to the variations between the studies of language comprehension. For decoding, Canadian studies showed group differences in favor of second-language learners, whereas the opposite was the case for U.S. studies. Regarding implications, unless specific decoding problems are detected, interventions that aim to ameliorate reading comprehension problems among second-language learners should focus on language comprehension skills.
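The pooled effect sizes reported above combine per-study standardized mean differences; a minimal sketch of the idea is a fixed-effect inverse-variance average of Cohen's d (a simplification, since the review itself may have used a random-effects model; the study numbers below are invented):

```python
def d_variance(d, n1, n2):
    # Approximate sampling variance of Cohen's d for two groups
    # of sizes n1 and n2 (standard large-sample formula).
    return (n1 + n2) / (n1 * n2) + d * d / (2 * (n1 + n2))

def pooled_d(studies):
    """Fixed-effect inverse-variance pooled effect size.

    studies: list of (d, n1, n2) tuples, one per study.
    """
    weights = [1.0 / d_variance(d, n1, n2) for d, n1, n2 in studies]
    return sum(w * d for w, (d, _, _) in zip(weights, studies)) / sum(weights)

# Two identical studies pool to their common effect; unequal studies
# pool to a weighted compromise favoring the larger one.
est = pooled_d([(-0.62, 40, 40), (-0.62, 40, 40)])  # -0.62
```

Larger studies get larger weights because their variance estimate is smaller, which is why a meta-analytic pool is not a simple mean of the 576 effect sizes.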
Tighe, Elizabeth L; Schatschneider, Christopher
2016-07-01
The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in adult basic education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82%-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. © Hammill Institute on Disabilities 2014.
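Quantile regression differs from OLS by minimizing an asymmetric "pinball" (check) loss rather than squared error. A toy illustration (not the authors' code) shows why: minimizing the pinball loss for a constant predictor recovers an empirical quantile of the outcome, which is what lets the technique probe different points of the reading-comprehension distribution:

```python
def pinball_loss(tau, y, pred):
    # Check loss: under-predictions are weighted tau,
    # over-predictions are weighted (1 - tau).
    return sum(tau * (v - pred) if v >= pred else (1 - tau) * (pred - v)
               for v in y)

def constant_quantile_fit(tau, y):
    # Brute-force minimizer over observed values; the optimum of the
    # pinball loss for a constant model is an empirical tau-quantile.
    return min(y, key=lambda c: pinball_loss(tau, y, c))

data = [1, 2, 3, 4, 100]
median_fit = constant_quantile_fit(0.5, data)  # the median, robust to the outlier
upper_fit = constant_quantile_fit(0.9, data)   # tracks the upper tail
```

In the full method, `pred` is a linear function of predictors such as morphological awareness and vocabulary knowledge, and the fit is repeated at several values of tau (the quantiles), yielding coefficient estimates that can differ across the distribution.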
Lloret, Juan; Morthier, Geert; Ramos, Francisco; Sales, Salvador; Van Thourhout, Dries; Spuesens, Thijs; Olivier, Nicolas; Fédéli, Jean-Marc; Capmany, José
2012-05-07
A broadband microwave photonic phase shifter based on a single III-V microdisk resonator heterogeneously integrated on and coupled to a nanophotonic silicon-on-insulator waveguide is reported. The phase shift tunability is accomplished by modifying the effective index through carrier injection. A comprehensive semi-analytical model aiming at predicting its behavior is formulated and confirmed by measurements. Quasi-linear and continuously tunable 2π phase shifts at radiofrequencies greater than 18 GHz are experimentally demonstrated. The phase shifter performance is also evaluated when used as a key element in tunable filtering schemes. Distortion-free and wideband filtering responses with a tuning range of ~100% over the free spectral range are obtained.
Research study on high energy radiation effect and environment solar cell degradation methods
NASA Technical Reports Server (NTRS)
Horne, W. E.; Wilkinson, M. C.
1974-01-01
The most detailed and comprehensively verified analytical model was used to evaluate the effects of simplifying assumptions on the accuracy of predictions made by the external damage coefficient method. It was found that the most serious discrepancies were present in heavily damaged cells, particularly proton-damaged cells, in which a gradient in damage across the cell existed. In general, it was found that the current damage coefficient method tends to underestimate damage at high fluences. An exception to this rule was thick cover-slipped cells experiencing heavy degradation due to omnidirectional electrons; in such cases, the damage coefficient method overestimates the damage. Comparisons of degradation predictions made by the two methods with measured flight data confirmed the above findings.
Damage Arresting Composites for Shaped Vehicles
NASA Technical Reports Server (NTRS)
Velicki, Alex
2009-01-01
This report describes the development of a novel structural solution that addresses the demanding fuselage loading requirements for the Hybrid Wing or Blended Wing Body configurations that are described in NASA NRA subtopic A2A.3, "Materials and Structures for Wing Components and Non-Circular Fuselage." The phase I portion of this task includes a comprehensive finite element model-based structural sizing exercise performed using the BWB airplane configuration to generate internal loads and fuselage panel weights for an advanced Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) structural concept. An accompanying element-level test program is also described which substantiates the analytical results and calculation methods used in the trade study. The phase II plan for the continuation of this research is also included herein.
Geometric quantification of features in large flow fields.
Kendall, Wesley; Huang, Jian; Peterka, Tom
2012-01-01
Interactive exploration of flow features in large-scale 3D unsteady-flow data is one of the most challenging visualization problems today. To comprehensively explore the complex feature spaces in these datasets, a proposed system employs a scalable framework for investigating a multitude of characteristics from traced field lines. This capability supports the examination of various neighborhood-based geometric attributes in concert with other scalar quantities. Such an analysis wasn't previously possible because of the large computational overhead and I/O requirements. The system integrates visual analytics methods by letting users procedurally and interactively describe and extract high-level flow features. An exploration of various phenomena in a large global ocean-modeling simulation demonstrates the approach's generality and expressiveness as well as its efficacy.
MIPSPlantsDB—plant database resource for integrative and comparative plant genome research
Spannagl, Manuel; Noubibou, Octave; Haase, Dirk; Yang, Li; Gundlach, Heidrun; Hindemitt, Tobias; Klee, Kathrin; Haberer, Georg; Schoof, Heiko; Mayer, Klaus F. X.
2007-01-01
Genome-oriented plant research delivers a rapidly increasing amount of plant genome data. Comprehensive and structured information resources are required to organize and communicate genome and associated analytical data for model organisms as well as for crops. The increase in available plant genomic data enables powerful comparative analyses and integrative approaches. PlantsDB aims to provide data and information resources for individual plant species and, in addition, to build a platform for integrative and comparative plant genome research. PlantsDB is constituted from genome databases for Arabidopsis, Medicago, Lotus, rice, maize and tomato. Complementary data resources for cis elements, repetitive elements and extensive cross-species comparisons are implemented. The PlantsDB portal can be reached at . PMID:17202173
Kline, Kimberly N
2007-01-01
This study discusses the implications for cultural sensitivity of the rhetorical choices in breast cancer education materials developed specifically for African American audiences by national organizations. Using the PEN-3 model of cultural sensitivity as an analytic framework for a generative rhetorical criticism, this study revealed that adaptations have been made in some pamphlets to acknowledge African American cultural values related to community, self-reliance, spirituality, and distrust of the Western medical establishment, but many messages could be revised to achieve a more comprehensive, balanced, accurate, and audience-specific discussion of the breast cancer issue. Achieving cultural sensitivity in health promotion materials necessitates attention to nuanced meanings in messages, revision of questionable arguments and evidence, and avoidance of ambiguity.
Statistically Qualified Neuro-Analytic system and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
Zhou, Ronggang; Chan, Alan H S
2017-01-01
In order to compare existing usability data against ideal goals or against data for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, and often fail to take into account the inherent uncertainties in these judgments during the evaluation process. This paper presents a universal method of usability evaluation that combines the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here derives an index structured hierarchically in terms of the three usability components of a product: effectiveness, efficiency, and user satisfaction. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed using the fuzzy comprehensive evaluation model to characterize fuzzy human judgments. Then, with the use of AHP, the weights of the usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process.
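The AHP step above turns pairwise expert comparisons into criterion weights. A minimal sketch using the row geometric-mean approximation of the principal eigenvector (the judgment values below are invented for illustration):

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights from a reciprocal pairwise
    comparison matrix via row geometric means, normalized to sum to 1.
    Exact when the matrix is perfectly consistent."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical judgments: effectiveness is 2x as important as
# efficiency and 4x as important as satisfaction (a consistent matrix).
pairwise = [
    [1.0, 2.0, 4.0],
    [0.5, 1.0, 2.0],
    [0.25, 0.5, 1.0],
]
weights = ahp_weights(pairwise)  # ~[0.571, 0.286, 0.143]
```

These weights would then scale the fuzzy evaluation scores of each usability component before aggregation into the overall index.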
NASA Technical Reports Server (NTRS)
Nieves-Chinchilla, T.; Colaninno, R.; Vourlidas, A.; Szabo, A.; Lepping, R. P.; Boardsen, S. A.; Anderson, B. J.; Korth, H.
2012-01-01
During June 16-21, 2010, an Earth-directed Coronal Mass Ejection (CME) event was observed by instruments onboard STEREO, SOHO, MESSENGER and Wind. This event was the first direct detection of a rotating CME in the middle and outer corona. Here, we carry out a comprehensive analysis of the evolution of the CME in the interplanetary medium, comparing in-situ and remote observations with analytical models and three-dimensional reconstructions. In particular, we investigate the parallel and perpendicular cross-section expansion of the CME from the corona through the heliosphere up to 1 AU. We use height-time measurements and the Gradual Cylindrical Shell (GCS) technique to model the imaging observations, remove the projection effects, and derive the 3-dimensional extent of the event. Then, we compare the results with in-situ analytical Magnetic Cloud (MC) models and with geometrical predictions from past works. We find that the parallel (along the propagation plane) cross-section expansion agrees well with the in-situ model and with the Bothmer & Schwenn [1998] empirical relationship based on in-situ observations between 0.3 and 1 AU. Our results effectively extend this empirical relationship to about 5 solar radii. The expansion of the perpendicular diameter agrees very well with the in-situ results at MESSENGER (~0.5 AU) but not at 1 AU. We also find an empirical relationship for the perpendicular expansion that differs slightly from Bothmer & Schwenn [1998]. More importantly, we find no evidence that the CME undergoes a significant latitudinal over-expansion, as is commonly assumed.
A Comparative Analysis of Prenatal Care and Fetal Growth in Eight South American Countries
Woodhouse, Cristina; Lopez Camelo, Jorge; Wehby, George L.
2014-01-01
There has been little work that comprehensively compared the relationship between prenatal care and infant health across multiple countries using similar data sources and analytical models. Such comparative analyses are useful for understanding the background of differences in infant health between populations. We evaluated the association between prenatal care visits and fetal growth measured by birth weight (BW) in grams or low birth weight (<2500 grams; LBW) adjusted for gestational age in eight South American countries using similarly collected data across countries and the same analytical models. OLS and logistic regressions were estimated adjusting for a large set of relevant infant, maternal, and household characteristics and birth year and hospital fixed effects. Birth data were acquired from 140 hospitals that are part of the Latin American Collaborative Study of Congenital Malformations (ECLAMC) network. The analytical sample included 56,014 live-born infants (∼69% of total sample) with complete data born without congenital anomalies in the years 1996–2011 in Brazil, Argentina, Chile, Venezuela, Ecuador, Colombia, Bolivia, and Uruguay. Prenatal care visits were significantly (at p<.05) and positively associated with BW and negatively associated with LBW for all countries. The OLS coefficients ranged from 9 grams per visit in Bolivia to 36 grams in Uruguay. The association with LBW was strongest for Chile (OR = 0.87 per visit) and lowest for Argentina and Venezuela (OR = 0.95). The association decreased in the recent decade compared to earlier years. Our findings suggest that estimates of association between prenatal care and fetal growth are population-specific and may not be generalizable to other populations. Furthermore, as one of the indicators for a country’s healthcare system for maternal and child health, prenatal care is a highly variable indicator between countries in South America. PMID:24625630
DOT National Transportation Integrated Search
1997-10-01
The overall goals in this project were to perform literature reviews and syntheses, using meta-analytic techniques, where appropriate, for a broad and comprehensive body of research findings on older driver needs and (diminished) capabilities, and a ...
DOT National Transportation Integrated Search
1999-11-23
The overall goals in this project were to perform literature reviews and syntheses, using meta-analytic techniques, where appropriate, for a broad and comprehensive body of research findings on older driver needs and (diminished) capabilities, and a ...
Deriving Appropriate Educational Program Costs in Illinois.
ERIC Educational Resources Information Center
Parrish, Thomas B.; Chambers, Jay G.
This document describes the comprehensive analytical framework for school finance used by the Illinois State Board of Education to assist policymakers in their decisions about equitable distribution of state aid and appropriate levels of resources to meet the varying educational requirements of differing student populations. This framework, the…
The report gives results of pilot-scale incineration testing to develop a comprehensive list of products of incomplete combustion (PICs) from hazardous waste combustion (HWC) systems. Project goals were to: (1) identify the total mass of organic compounds sufficiently to estimate...
Using Analytical Techniques to Interpret Financial Statements.
ERIC Educational Resources Information Center
Walters, Donald L.
1986-01-01
Summarizes techniques for interpreting the balance sheet and the statement of revenues, expenditures, and changes-in-fund-balance sections of the comprehensive annual financial report required of all school districts. Uses three tables to show intricacies involved and focuses on analyzing favorable and unfavorable budget variances. (MLH)
ERIC Educational Resources Information Center
Ackoff, Russell L.
1974-01-01
The major organizational and social problems of our time do not lend themselves to the reductionism of traditional analytical and disciplinary approaches. They must be attacked holistically, with a comprehensive systems approach. The effective study of large-scale social systems requires the synthesis of science with the professions that use it.…
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the adapted deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
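The deterministic-adaptation stage, an analytic model augmented by a trained correction for unmodeled behavior, can be illustrated with a toy residual fit. Here a one-parameter least-squares fit stands in for the patent's neural network and scaled-equation-error training; all names and numbers are illustrative:

```python
def fit_residual_gain(xs, ys, analytic):
    """Fit w in: y ~ analytic(x) + w * x, by least squares on the
    residuals of the analytic model. A one-parameter stand-in for
    the neural augmentation in the SQNA scheme."""
    resid = [y - analytic(x) for x, y in zip(xs, ys)]
    return sum(x * r for x, r in zip(xs, resid)) / sum(x * x for x in xs)

# Known physics captured by the analytic model: y = 2x.
# The measured process has an extra, unmodeled effect: +0.3x.
analytic = lambda x: 2.0 * x
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.3 * x for x in xs]
w = fit_residual_gain(xs, ys, analytic)  # recovers the missing 0.3 gain
```

The hybrid predictor `analytic(x) + w * x` then matches the measured outputs; the stochastic stage would go on to characterize the uncertainty remaining in that fit.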
Dubský, Pavel; Müllerová, Ludmila; Dvořák, Martin; Gaš, Bohuslav
2015-03-06
The model of electromigration of a multivalent weak acidic/basic/amphoteric analyte that undergoes complexation with a mixture of selectors is introduced. The model provides an extension of the series of models starting with the single-selector model without dissociation by Wren and Rowe in 1992, continuing with the monovalent weak analyte/single-selector model by Rawjee, Williams and Vigh in 1993 and that by Lelièvre in 1994, and ending with the multi-selector overall model without dissociation developed by our group in 2008. The new multivalent analyte multi-selector model shows that the effective mobility of the analyte obeys the original Wren and Rowe formula. The overall complexation constant, mobility of the free analyte and mobility of the complex can be measured and used in the standard way. The mathematical expressions for the overall parameters are provided. We further demonstrate mathematically that the pH-dependent parameters for weak analytes can be used directly as an input into the multi-selector overall model and, in reverse, the multi-selector overall parameters can serve as an input into the pH-dependent models for the weak analytes. These findings can greatly simplify rational method development in analytical electrophoresis, specifically enantioseparations.
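The "original Wren and Rowe formula" referenced above has, for 1:1 complexation with a single selector, a standard closed form; a minimal sketch follows. The symbol names are ours, and the function is an illustration of the formula's shape, not of the multi-selector extension described in the abstract.

```python
def effective_mobility(mu_free, mu_complex, K, c_selector):
    """Wren-Rowe effective mobility for an analyte A binding a selector C
    with 1:1 stoichiometry and complexation constant K:

        mu_eff = (mu_free + mu_complex * K * c) / (1 + K * c)

    As c -> 0 the analyte migrates with its free mobility; as c -> infinity
    it migrates with the mobility of the complex."""
    Kc = K * c_selector
    return (mu_free + mu_complex * Kc) / (1.0 + Kc)
```

The multi-selector result cited in the abstract states that the same functional form holds with overall (mixture-averaged) values of K and the complex mobility substituted in.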
NASA Astrophysics Data System (ADS)
Trombetti, Tomaso
This thesis presents an Experimental/Analytical approach to modeling and calibrating shaking tables for structural dynamic applications. This approach was successfully applied to the shaking table recently built in the structural laboratory of the Civil Engineering Department at Rice University. This shaking table is capable of reproducing model earthquake ground motions with a peak acceleration of 6 g's, a peak velocity of 40 inches per second, and a peak displacement of 3 inches, for a maximum payload of 1500 pounds. It has a frequency bandwidth of approximately 70 Hz and is designed to test structural specimens up to 1/5 scale. The rail/table system is mounted on a reaction mass of about 70,000 pounds consisting of three 12 ft x 12 ft x 1 ft reinforced concrete slabs, post-tensioned together and connected to the strong laboratory floor. The slip table is driven by a hydraulic actuator governed by an MTS 407 controller, which employs a proportional-integral-derivative-feedforward-differential pressure algorithm to control the actuator displacement. Feedback signals are provided by two LVDTs (monitoring the slip table relative displacement and the servovalve main stage spool position) and by one differential pressure transducer (monitoring the actuator force). The dynamic actuator-foundation-specimen system is modeled and analyzed by combining linear control theory and linear structural dynamics. The analytical model developed accounts for the effects of actuator oil compressibility, oil leakage in the actuator, time delay in the response of the servovalve spool to a given electrical signal, foundation flexibility, and dynamic characteristics of multi-degree-of-freedom specimens. In order to study the actual dynamic behavior of the shaking table, the transfer function between target and actual table accelerations was identified using experimental results and spectral estimation techniques.
The power spectral density of the system input and the cross power spectral density of the table input and output were estimated using Bartlett's spectral estimation method. The experimentally estimated table acceleration transfer functions obtained for different working conditions are correlated with their analytical counterparts. As a result of this comprehensive correlation study, a thorough understanding of the shaking table dynamics and its sensitivities to control and payload parameters is obtained. Moreover, the correlation study leads to a calibrated analytical model of the shaking table with high predictive ability. It is concluded that, in its present condition, the Rice shaking table is able to reproduce, with a high degree of accuracy, model earthquake acceleration time histories in the frequency bandwidth from 0 to 75 Hz. Furthermore, the exhaustive analysis performed indicates that the table transfer function is not significantly affected by the presence of a large (in terms of weight) payload with a fundamental frequency up to 20 Hz. Payloads having a higher fundamental frequency do significantly affect the shaking table performance and require a modification of the table control gain setting, which can be easily obtained using the predictive analytical model of the shaking table. The complete description of a structural dynamic experiment performed using the Rice shaking table facility is also reported herein. The object of this experimentation was twofold: (1) to verify the testing capability of the shaking table and, (2) to experimentally validate a simplified theory developed by the author, which predicts the maximum rotational response developed by seismically isolated building structures characterized by non-coincident centers of mass and rigidity, when subjected to strong earthquake ground motions.
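The transfer-function identification step described above (cross spectrum of input and output divided by the input auto-spectrum, with Bartlett-style averaging over segments) can be sketched as follows. Function names and the direct DFT are illustrative; a real implementation would use an FFT library and windowing, and the thesis's exact estimator settings are not stated in the abstract.

```python
import cmath
import random

def dft(x):
    """Direct discrete Fourier transform (O(n^2); fine for a sketch)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n)
                for t in range(n))
            for f in range(n)]

def h1_transfer(x, y, seg_len):
    """H1 transfer-function estimate between input x and output y:
    split the records into non-overlapping segments (Bartlett averaging),
    accumulate the input auto-spectrum Sxx and cross-spectrum Sxy,
    then return H(f) = Sxy(f) / Sxx(f)."""
    n_seg = len(x) // seg_len
    sxx = [0.0] * seg_len
    sxy = [0j] * seg_len
    for s in range(n_seg):
        xs = dft(x[s * seg_len:(s + 1) * seg_len])
        ys = dft(y[s * seg_len:(s + 1) * seg_len])
        for f in range(seg_len):
            sxx[f] += abs(xs[f]) ** 2
            sxy[f] += xs[f].conjugate() * ys[f]
    return [sxy[f] / sxx[f] for f in range(seg_len)]
```

For a noiseless, purely proportional system (output = gain x input) the estimate recovers the gain exactly at every frequency, which makes a convenient sanity check.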
Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration
NASA Technical Reports Server (NTRS)
Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.
1993-01-01
Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions is examined.
Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.
Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen
2015-10-01
Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
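The coupled non-linear differential equations of the bivalent analyte model are not written out in the abstract; a common textbook form (A + L &#8652; AL with rates ka1/kd1, then AL + L &#8652; AL2 with ka2/kd2, pseudo-first-order in the constant analyte concentration C) can be integrated with forward Euler as a sketch. The site-counting convention L = Rmax - AL - 2*AL2, the response convention R = AL + AL2, and all parameter values below are assumptions for illustration.

```python
def bivalent_sensorgram(C, Rmax, ka1, kd1, ka2, kd2, t_end, dt=0.01):
    """Forward-Euler integration of an assumed bivalent analyte scheme:
        A + L  <->  AL   (ka1, kd1)
        AL + L <->  AL2  (ka2, kd2)
    C is the (constant) analyte concentration, Rmax the total ligand
    capacity. Free sites: L = Rmax - AL - 2*AL2. Returns the response
    R = AL + AL2 at time t_end (association phase only)."""
    AL, AL2 = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        L = Rmax - AL - 2.0 * AL2
        dAL = ka1 * C * L - kd1 * AL - ka2 * AL * L + kd2 * AL2
        dAL2 = ka2 * AL * L - kd2 * AL2
        AL += dAL * dt
        AL2 += dAL2 * dt
    return AL + AL2
```

With ka2 = kd2 = 0 the scheme collapses to the 1:1 Langmuir model, whose analytic solution provides a check on the integrator; the full bivalent response stays bounded by Rmax.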
Application of Out-of-Plane Warping to Control Rotor Blade Twist
NASA Technical Reports Server (NTRS)
VanWeddingen, Yannick; Bauchau, Olivier; Kottapalli, Sesi; Ozbay, Serkan; Mehrotra, Yogesh
2012-01-01
The goal of this ongoing study is to develop and demonstrate the feasibility of a blade actuation system to dynamically change the twist, and/or the camber, of an airfoil section and, consequently, alter the in-flight aerodynamic loading on the blade for efficient flight control. The required analytical and finite element tools are under development to enable an accurate and comprehensive aeroelastic assessment of the current Full-Blade Warping and 3D Warping Actuated Trailing Edge Flap concepts. The feasibility of the current concepts for swashplateless rotors and higher harmonic blade control is also being investigated. In particular, the aim is to complete the following objectives, some of which have been completed (as noted below) and others that are currently ongoing: i) Develop a Vlasov finite element model and validate against the ABAQUS shell models (completed). ii) Implement the 3D warping actuation concept within the comprehensive analysis code DYMORE. iii) Perform preliminary aeroelastic simulations of blades using DYMORE with 3D warping actuation: a) Investigate the blade behavior under 1 per/rev actuation. Determine whether sufficient twist can be generated and sustained to achieve primary blade control. b) Investigate the behavior of a trailing edge flap configuration under higher harmonic excitations. Determine how much twist can be obtained at the harmonics 2-5 per/rev. iv) Determine actuator specifications such as the power required, load and displacements, and identify the stress and strain distributions in the actuated blades. In general, the completion of Item ii) above will give an additional research capability in rotorcraft dynamics analyses, i.e., the capability to calculate the rotor blade twist due to warping, something that is not currently available in any of the existing comprehensive rotorcraft analyses.
NASA Astrophysics Data System (ADS)
Zhou, Shiqi; Zhou, Run
2017-08-01
Using the TL (Tang and Lu, 1993) method, the Ornstein-Zernike integral equation is solved perturbatively under the mean spherical approximation (MSA) for a fluid with a potential consisting of a hard sphere plus square-well plus square-shoulder (HS + SW + SS), to obtain first-order analytic expressions for the radial distribution function (RDF) and second-order direct correlation function, and semi-analytic expressions for common thermodynamic properties. A comprehensive comparison between the first-order MSA and the high temperature series expansion (HTSE) to third, fifth and seventh order is performed over a wide parameter range for both a HS + SW and the HS + SW + SS model fluid, using corresponding "exact" Monte Carlo results as a reference; although the HTSE is carried out to seventh order while the MSA is only first order, the comparison is considered fair from a calculation-complexity perspective. It is found that the performance of the first-order MSA is dramatically model-dependent: as the target potential goes from HS + SW to HS + SW + SS, (i) there is a dramatic drop in the performance of the first-order MSA expressions in calculating the thermodynamic properties; in particular, neither the excess internal energy nor the constant-volume excess heat capacity of the HS + SW + SS model can be predicted even qualitatively correctly. (ii) The first-order MSA tends to become more reliable with increasing temperature in dealing with the pressure, excess Helmholtz free energy, excess enthalpy and excess chemical potential. (iii) Concerning the RDF, the first-order MSA is not as disappointing as it is for the thermodynamics. (iv) In the case of the HS + SW model, the first-order MSA solution is shown to be quantitatively correct in calculating the pressure and excess chemical potential even at reduced temperatures as low as 0.8.
On the other hand, the seventh-order HTSE is less model-dependent; in most cases of the HS + SW and the HS + SW + SS models, the seventh-order HTSE improves the fifth- and third-order HTSE in both thermodynamic properties and RDF, and the improvements are very demonstrable in both the excess internal energy and constant volume excess heat capacity; for very limited cases, the seventh-order HTSE improves the fifth-order HTSE only within lower density domain and even shows a bit of inadaptation over higher density domain.
Structural model for fluctuations in financial markets
NASA Astrophysics Data System (ADS)
Anand, Kartik; Khedair, Jonathan; Kühn, Reimer
2018-05-01
In this paper we provide a comprehensive analysis of a structural model for the dynamics of prices of assets traded in a market, which takes the form of an interacting generalization of the geometric Brownian motion model. It is formally equivalent to a model describing the stochastic dynamics of a system of analog neurons, which is expected to exhibit glassy properties and thus many metastable states in a large portion of its parameter space. We perform a generating functional analysis, introducing a slow driving of the dynamics to mimic the effect of slowly varying macroeconomic conditions. Distributions of asset returns over various time separations are evaluated analytically and are found to be fat-tailed in a manner broadly in line with empirical observations. Our model also allows us to identify collective, interaction-mediated properties of pricing distributions: it predicts pricing distributions which are significantly broader than their noninteracting counterparts if interactions between prices in the model contain a ferromagnetic bias. Using simulations, we are able to substantiate one of the main hypotheses underlying the original modeling, viz., that the phenomenon of volatility clustering can be rationalized in terms of an interplay between the dynamics within metastable states and the dynamics of occasional transitions between them.
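The noninteracting baseline of this structural model is plain geometric Brownian motion, which can be simulated exactly in log space; a minimal sketch follows. The interacting generalization analyzed in the paper (couplings between prices, slow driving) is not reproduced here, and all parameter values are illustrative.

```python
import math
import random

def gbm_path(s0, mu, sigma, dt, n_steps, rng):
    """Simulate one geometric Brownian motion path using the exact
    log-space update:
        S_{t+dt} = S_t * exp((mu - sigma^2 / 2) * dt + sigma * sqrt(dt) * Z),
    with Z ~ N(0, 1). Returns the terminal price S_T."""
    s = s0
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        s *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z)
    return s
```

Log returns of this baseline are Gaussian at every time separation; the fat tails and volatility clustering reported in the abstract are precisely the features the interacting generalization adds on top of it.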
MO-AB-BRA-02: A Novel Scatter Imaging Modality for Real-Time Image Guidance During Lung SBRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redler, G; Bernard, D; Templeton, A
2015-06-15
Purpose: A novel scatter imaging modality is developed and its feasibility for image-guided radiation therapy (IGRT) during stereotactic body radiation therapy (SBRT) for lung cancer patients is assessed using analytic and Monte Carlo models as well as experimental testing. Methods: During treatment, incident radiation interacts and scatters from within the patient. The presented methodology forms an image of patient anatomy from the scattered radiation for real-time localization of the treatment target. A radiographic flat panel-based pinhole camera provides spatial information regarding the origin of detected scattered radiation. An analytical model is developed, which provides a mathematical formalism for describing the scatter imaging system. Experimental scatter images are acquired by irradiating an object using a Varian TrueBeam accelerator. The differentiation between tissue types is investigated by imaging simple objects of known compositions (water, lung, and cortical bone equivalent). A lung tumor phantom, simulating materials and geometry encountered during lung SBRT treatments, is fabricated and imaged to investigate image quality for various quantities of delivered radiation. Monte Carlo N-Particle (MCNP) code is used for validation and testing by simulating scatter image formation using the experimental pinhole camera setup. Results: Analytical calculations, MCNP simulations, and experimental results when imaging the water, lung, and cortical bone equivalent objects show close agreement, thus validating the proposed models and demonstrating that scatter imaging differentiates these materials well. Lung tumor phantom images have sufficient contrast-to-noise ratio (CNR) to clearly distinguish tumor from surrounding lung tissue. CNR=4.1 and CNR=29.1 for 10MU and 5000MU images (equivalent to 0.5 and 250 second images), respectively. Conclusion: Lung SBRT provides favorable treatment outcomes, but depends on accurate target localization.
A comprehensive approach, employing multiple simulation techniques and experiments, is taken to demonstrate the feasibility of a novel scatter imaging modality for the necessary real-time image guidance.
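The CNR figures quoted in the abstract follow the usual contrast-to-noise convention; a sketch under the common definition CNR = (mean signal - mean background) / standard deviation of the background is shown below. The abstract does not state its exact convention (e.g. which ROI supplies the noise estimate), so this is an assumption.

```python
def contrast_to_noise(signal_roi, background_roi):
    """CNR = (mean(signal ROI) - mean(background ROI)) / std(background ROI),
    using the population standard deviation. The ROIs are flat lists of
    pixel values; in the lung-phantom case the signal ROI would cover the
    tumor insert and the background ROI the surrounding lung-equivalent
    material."""
    ms = sum(signal_roi) / len(signal_roi)
    mb = sum(background_roi) / len(background_roi)
    var_b = sum((v - mb) ** 2 for v in background_roi) / len(background_roi)
    return (ms - mb) / var_b ** 0.5
```

Under this definition, the reported jump from CNR = 4.1 at 10 MU to 29.1 at 5000 MU reflects the shrinking background noise as more scattered photons are collected.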
Predictive Biomarkers for Linking Disease Pathology and Drug Effect.
Mayer, Bernd; Heinzel, Andreas; Lukas, Arno; Perco, Paul
2017-01-01
Productivity in drug R&D continues to see significant attrition in clinical stage testing. Approval of new molecular entities proceeds at a slow pace, specifically when it comes to chronic, age-related diseases, calling for new conceptual approaches, methodological implementation and organizational adoption in drug development. Detailed phenotyping of disease presentation together with comprehensive representation of drug mechanism of action is considered a path forward, and a big data spectrum has become available covering behavioral, clinical and molecular characteristics, the latter combining reductionist and explorative strategies. On this basis integrative analytics in the realm of Systems Biology has emerged, essentially aiming at traversing associations into causal relationships for bridging molecular disease specifics and clinical phenotype surrogates and finally explaining drug response and outcome. From a conceptual perspective, bottom-up modeling approaches are available, with dynamical hierarchies as a formalism capable of describing clinical findings as emergent properties of an underlying molecular process network comprehensively resembling disease pathology. In such a representation biomarker candidates serve as proxies of a molecular process set, at the interface of a corresponding representation of drug mechanism of action, allowing patient stratification and prediction of drug response. In practical implementation, network analytics on the protein-coding gene level has provided a number of example cases for matching disease presentation and drug molecular effect, and workflows combining computational hypothesis generation and experimental evaluation have become available for systematically optimizing biomarker candidate selection. With biomarker-based enrichment strategies in adaptive clinical trials, implementation routes for tackling development attrition are provided.
Predictive biomarkers add precision in drug development and as companion diagnostics in clinical practice.
NASA Astrophysics Data System (ADS)
Evans, J. D.; Hao, W.; Chettri, S. R.
2014-12-01
Disaster risk management has grown to rely on earth observations, multi-source data analysis, numerical modeling, and interagency information sharing. The practice and outcomes of disaster risk management will likely undergo further change as several emerging earth science technologies come of age: mobile devices; location-based services; ubiquitous sensors; drones; small satellites; satellite direct readout; Big Data analytics; cloud computing; Web services for predictive modeling, semantic reconciliation, and collaboration; and many others. Integrating these new technologies well requires developing and adapting them to meet current needs; but also rethinking current practice to draw on new capabilities to reach additional objectives. This requires a holistic view of the disaster risk management enterprise and of the analytical or operational capabilities afforded by these technologies. One helpful tool for this assessment, the GEOSS Architecture for the Use of Remote Sensing Products in Disaster Management and Risk Assessment (Evans & Moe, 2013), considers all phases of the disaster risk management lifecycle for a comprehensive set of natural hazard types, and outlines common clusters of activities and their use of information and computation resources. We are using these architectural views, together with insights from current practice, to highlight effective, interrelated roles for emerging earth science technologies in disaster risk management. These roles may be helpful in creating roadmaps for research and development investment at national and international levels.
Kennedy, Bernice Roberts; Jenkins, Chalice C
2011-01-01
African American women, including adolescents and adults, are disproportionately affected by the transmission of Human Immunodeficiency Virus (HIV) and Acquired Immunodeficiency Syndrome (AIDS). HIV/AIDS is a health disparity issue for African American females in comparison to other ethnic groups. According to data acquired from 33 states in 2005, 64% of women who have HIV/AIDS are African American women. It is estimated that during 2001-2004, 61% of African Americans under the age of 25 had been living with HIV/AIDS. This article is an analytical review of the literature emphasizing sexual assertiveness of African American women and the gap that exists in the research literature on this population. The multifaceted model of HIV risk posits that an interpersonal predictor of risky sexual behavior is sexual assertiveness. The critical themes extracted from a review of the literature reveal the following: (a) sexual assertiveness is related to HIV risk in women, (b) sexual assertiveness and sexual communication are related, and (c) women with low sexual assertiveness are at increased risk of HIV. As a result of this comprehensive literature review, future research studies need to use models in validating sexual assertiveness interventions to reduce the risk of HIV/AIDS in African American women. HIV/AIDS prevention interventions and future studies need to target reducing the risk factors of HIV/AIDS among African Americans, focusing on gender- and culture-specific strategies.
A comprehensive analysis of the evaporation of a liquid spherical drop.
Sobac, B; Talbot, P; Haut, B; Rednikov, A; Colinet, P
2015-01-15
In this paper, a new comprehensive analysis of a suspended drop of a pure liquid evaporating into air is presented. Based on mass and energy conservation equations, a quasi-steady model is developed including diffusive and convective transports, and considering the non-isothermia of the gas phase. The main original feature of this simple analytical model lies in the consideration of the local dependence of the physico-chemical properties of the gas on the gas temperature, which has a significant influence on the evaporation process at high temperatures. The influence of the atmospheric conditions on the interfacial evaporation flux, molar fraction and temperature is investigated. Simplified versions of the model are developed to highlight the key mechanisms governing the evaporation process. For the conditions considered in this work, the convective transport appears to be opposed to the evaporation process, leading to a decrease of the evaporation flux. However, this effect is relatively limited, as the Péclet numbers happen to be small. In addition, the gas isothermia assumption never appears to be valid here, even at room temperature, due to the large temperature gradient that develops in the gas phase. These two conclusions are explained by the fact that heat transfer from the gas to the liquid appears to be the step limiting the evaporation process. Despite the complexity of the developed model, and excluding extremely small droplets, the square of the drop radius decreases linearly over time (the R² law). The assumptions of the model are rigorously discussed and general criteria are established, independently of the liquid-gas couple considered.
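The R² law mentioned above states that R²(t) = R0² - k t for an evaporation constant k, so the drop lifetime is R0²/k. A minimal sketch, with k as an assumed effective constant lumping the vapor diffusivity and saturation conditions that the paper's model derives in detail:

```python
import math

def drop_radius(r0, k, t):
    """Radius of an evaporating drop under the R^2 law,
    R^2(t) = R0^2 - k*t, clamped to zero once fully evaporated.
    k is an effective evaporation constant (units m^2/s)."""
    r_sq = r0 * r0 - k * t
    return math.sqrt(r_sq) if r_sq > 0.0 else 0.0

def lifetime(r0, k):
    """Time for the radius to reach zero: t_life = R0^2 / k."""
    return r0 * r0 / k
```

The linearity of R² in time is what makes the law convenient experimentally: a plot of measured R² against t should be a straight line whose slope gives k directly.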
Multidisciplinary Optimization of Tilt Rotor Blades Using Comprehensive Composite Modeling Technique
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; McCarthy, Thomas R.; Rajadas, John N.
1997-01-01
An optimization procedure is developed for addressing the design of composite tilt rotor blades. A comprehensive technique, based on a higher-order laminate theory, is developed for the analysis of the thick composite load-carrying sections, modeled as box beams, in the blade. The theory, which is based on a refined displacement field, is a three-dimensional model which approximates the elasticity solution so that the beam cross-sectional properties are not reduced to one-dimensional beam parameters. Both inplane and out-of-plane warping are included automatically in the formulation. The model can accurately capture the transverse shear stresses through the thickness of each wall while satisfying stress free boundary conditions on the inner and outer surfaces of the beam. The aerodynamic loads on the blade are calculated using the classical blade element momentum theory. Analytical expressions for the lift and drag are obtained based on the blade planform with corrections for the high lift capability of rotor blades. The aerodynamic analysis is coupled with the structural model to formulate the complete coupled equations of motion for aeroelastic analyses. Finally, a multidisciplinary optimization procedure is developed to improve the aerodynamic, structural and aeroelastic performance of the tilt rotor aircraft. The objective functions include the figure of merit in hover and the high speed cruise propulsive efficiency. Structural, aerodynamic and aeroelastic stability criteria are imposed as constraints on the problem. The Kreisselmeier-Steinhauser function is used to formulate the multiobjective function problem. The search direction is determined by the Broyden-Fletcher-Goldfarb-Shanno algorithm. The optimum results are compared with the baseline values and show significant improvements in the overall performance of the tilt rotor blade.
Du, Lihong; White, Robert L
2009-02-01
A previously proposed partition equilibrium model for quantitative prediction of analyte response in electrospray ionization mass spectrometry is modified to yield an improved linear relationship. Analyte mass spectrometer response is modeled by a competition mechanism between analyte and background electrolytes that is based on partition equilibrium considerations. The correlation between analyte response and solution composition is described by the linear model over a wide concentration range and the improved model is shown to be valid for a wide range of experimental conditions. The behavior of an analyte in a salt solution, which could not be explained by the original model, is correctly predicted. The ion suppression effects of 16:0 lysophosphatidylcholine (LPC) on analyte signals are attributed to a combination of competition for excess charge and reduction of total charge due to surface tension effects. In contrast to the complicated mathematical forms that comprise the original model, the simplified model described here can more easily be employed to predict analyte mass spectrometer responses for solutions containing multiple components.
Chapter 16 - Predictive Analytics for Comprehensive Energy Systems State Estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yingchen; Yang, Rui; Hodge, Brian S
Energy sustainability is a subject of concern to many nations in the modern world. It is critical for electric power systems to diversify energy supply to include systems with different physical characteristics, such as wind energy, solar energy, electrochemical energy storage, thermal storage, bio-energy systems, geothermal, and ocean energy. Each system has its own range of control variables and targets. To be able to operate such a complex energy system, big-data analytics become critical to achieve the goal of predicting energy supplies and consumption patterns, assessing system operation conditions, and estimating system states - all providing situational awareness to power system operators. This chapter presents data analytics and machine learning-based approaches to enable predictive situational awareness of the power systems.
Analytical functions to predict cosmic-ray neutron spectra in the atmosphere.
Sato, Tatsuhiko; Niita, Koji
2006-09-01
Estimation of cosmic-ray neutron spectra in the atmosphere has been an essential issue in the evaluation of the aircrew doses and the soft-error rates of semiconductor devices. We therefore performed Monte Carlo simulations for estimating neutron spectra using the PHITS code in adopting the nuclear data library JENDL-High-Energy file. Excellent agreements were observed between the calculated and measured spectra for a wide altitude range even at the ground level. Based on a comprehensive analysis of the simulation results, we propose analytical functions that can predict the cosmic-ray neutron spectra for any location in the atmosphere at altitudes below 20 km, considering the influences of local geometries such as ground and aircraft on the spectra. The accuracy of the analytical functions was well verified by various experimental data.
Visual Analytics for MOOC Data.
Qu, Huamin; Chen, Qing
2015-01-01
With the rise of massive open online courses (MOOCs), tens of millions of learners can now enroll in more than 1,000 courses via MOOC platforms such as Coursera and edX. As a result, a huge amount of data has been collected. Compared with traditional education records, the data from MOOCs has much finer granularity and also contains new pieces of information. It is the first time in history that such comprehensive data related to learning behavior has become available for analysis. What roles can visual analytics play in this MOOC movement? The authors survey the current practice and argue that MOOCs provide an opportunity for visualization researchers and that visual analytics systems for MOOCs can benefit a range of end users such as course instructors, education researchers, students, university administrators, and MOOC providers.
Whiting, S D; Guinea, M L; Fomiatti, K; Flint, M; Limpus, C J
2014-06-14
In recent years, the use of blood chemistry as a diagnostic tool for sea turtles has been demonstrated, but much of its effectiveness relies on reference intervals. The first comprehensive blood chemistry values for healthy wild hawksbill (Eretmochelys imbricata) sea turtles are presented. Nineteen blood chemistry analytes and packed cell volume were analysed for 40 clinically healthy juvenile hawksbill sea turtles captured from a rocky reef habitat in northern Australia. We used four statistical approaches to calculate reference intervals and to investigate their use with non-normal distributions and small sample sizes, and to compare upper and lower limits between methods. Eleven analytes were correlated with curved carapace length, indicating that body size should be considered when designing future studies and interpreting analyte values.
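One of the standard approaches to reference intervals such a study would compare is the nonparametric one: take the central 95% of the sample via the 2.5th and 97.5th percentiles. A minimal sketch follows; the interpolation scheme is one common convention (linear interpolation between order statistics), and the study's four methods are not specified in the abstract.

```python
def reference_interval(values, lower_pct=2.5, upper_pct=97.5):
    """Nonparametric reference interval: the central 95% of the sample,
    estimated by linearly interpolating between order statistics.
    With small samples (n well below ~120) these limits are imprecise,
    which is why studies compare several estimation methods."""
    xs = sorted(values)
    n = len(xs)

    def percentile(p):
        # fractional rank r in [0, n-1]; interpolate between neighbors
        r = p / 100.0 * (n - 1)
        lo = int(r)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (r - lo) * (xs[hi] - xs[lo])

    return percentile(lower_pct), percentile(upper_pct)
```

A parametric alternative (mean ± 1.96 standard deviations) gives similar limits only when the analyte's distribution is close to normal, which is exactly the assumption the nonparametric method avoids.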
Development and Validation of the Career Competencies Indicator (CCI)
ERIC Educational Resources Information Center
Francis-Smythe, Jan; Haase, Sandra; Thomas, Erica; Steele, Catherine
2013-01-01
This article describes the development and validation of the Career Competencies Indicator (CCI); a 43-item measure to assess career competencies (CCs). Following an extensive literature review, a comprehensive item generation process involving consultation with subject matter experts, a pilot study and a factor analytic study on a large sample…
The project addresses an applied ecological question through interdisciplinary and innovative analytical methods. By combining stable isotope and contaminant analyses, this project will provide a comprehensive perspective of how contaminants and diet vary within an organism...
The 209 polychlorinated biphenyl (PCB) congeners and associated nine isomeric groups (nine groups of PCBs with the same degree of chlorination) have been long recorded as high endocrine disrupting chemicals in the environment. Difficult analytical problems exist, in those frequen...
Accountability in Higher Education: A Comprehensive Analytical Framework
ERIC Educational Resources Information Center
Metz, Thaddeus
2011-01-01
Concomitant with the rise of rationalizing accountability in higher education has been an increase in theoretical reflection about the forms accountability has taken and the ones it should take. The literature is now peppered by a wide array of distinctions (e.g. internal/external, inward/outward, vertical/horizontal, upward/downward,…
Teacher Attrition and Retention: A Meta-Analytic and Narrative Review of the Research
ERIC Educational Resources Information Center
Borman, Geoffrey D.; Dowling, N. Maritza
2008-01-01
This comprehensive meta-analysis on teacher career trajectories, consisting of 34 studies of 63 attrition moderators, seeks to understand why teaching attrition occurs, or what factors moderate attrition outcomes. Personal characteristics of teachers are important predictors of turnover. Attributes of teachers' schools, including organizational…
Criminal Justice in America. Second Edition.
ERIC Educational Resources Information Center
Croddy, Marshall; And Others
This comprehensive textbook on criminal justice is intended to serve as the foundation for a high school course on law-related education or as a supplement for civics, government or contemporary-issues courses. Designed to foster critical thinking and analytical skills, the book provides students with an understanding of the criminal justice…
ERIC Educational Resources Information Center
Willis, A. Sandra
Short analytical writing exercises were designed to develop critical thinking and writing skills, stimulate creative thinking and writing, promote learning of psychological concepts, and assess student knowledge. The design of these assignments was based on Bloom's taxonomy of multiple levels of critical thinking: recall, comprehension,…
DOT National Transportation Integrated Search
2005-05-01
This report synthesized the research findings of Phase I of the Statewide Traffic Safety Study of Louisiana, sponsored by the Louisiana Department of Transportation and Development. The objective of Phase I was to provide a comprehensive review of th...
Brewing as a Comprehensive Learning Platform in Chemical Engineering
ERIC Educational Resources Information Center
Nielsen, Rudi P.; Sørensen, Jens L.; Simonsen, Morten E.; Madsen, Henrik T.; Muff, Jens; Strandgaard, Morten; Søgaard, Erik G.
2016-01-01
Chemical engineering is mostly taught using traditional classroom teaching and laboratory experiments when possible. Being a wide discipline encompassing topics such as analytical chemistry, process design, and microbiology, it may be argued that brewing of beer has many relations to chemical engineering topic-wise. This work illustrates how…
A Scheme for Regrouping WISC-R Subtests.
ERIC Educational Resources Information Center
Groff, Martin G.; Hubble, Larry M.
1984-01-01
Reviews WISC-R factor analytic findings used to develop a scheme for regrouping WISC-R subtests into verbal comprehension and spatial groupings. The subtests comprising these groupings are shown to have more common variance than specific variance and to cluster together consistently across the samples of WISC-R scores. (Author/JAC)
Perceived Discrimination and Health: A Meta-Analytic Review
ERIC Educational Resources Information Center
Pascoe, Elizabeth A.; Richman, Laura Smart
2009-01-01
Perceived discrimination has been studied with regard to its impact on several types of health effects. This meta-analysis provides a comprehensive account of the relationships between multiple forms of perceived discrimination and both mental and physical health outcomes. In addition, this meta-analysis examines potential mechanisms by which…
Change in University Governance Structures in Continental Europe
ERIC Educational Resources Information Center
Gornitzka, Åse; Maassen, Peter; de Boer, Harry
2017-01-01
This article discusses changes with respect to university governance structures in six comprehensive universities in Europe. We present an analytical framework on the basis of which we conduct a comparative analysis of the university governance structures along four different dimensions: (a) the internal democratic nature of the governance…
Ban the Book Report: Promoting Frequent and Enthusiastic Reading
ERIC Educational Resources Information Center
Foster, Graham
2012-01-01
Teachers recognize that frequent independent reading increases student knowledge on a wide range of topics, enhances vocabulary, and improves comprehension. "Ban the Book Report" inspires teachers to go beyond narrow and analytical book reports by exploring the potential of book talks, alternate book covers, identifying features of informational…
Education with ICT in South Korea and Chile
ERIC Educational Resources Information Center
Sanchez, Jaime; Salinas, Alvaro; Harris, Jordan
2011-01-01
This article presents a linear-analytical case study on the development of ICT within the educational systems of Chile and South Korea. Through a comprehensive meta-data analysis and bibliographic review, we collected information on both educational systems and their ICT adoption policies. Key differences necessary to understand how both countries…
2012-01-01
Background: The global initiative ‘Treatment 2.0’ calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However, limited systematic assessment has been conducted in countries with concentrated HIV epidemics. We aimed to assess HIV service availability and service connectedness in Vietnam. Methods: We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability, including geographical distribution and decentralization, and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents, including existing service delivery data. Results: Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level, particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects; and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services.
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects, particularly in middle- and low-burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Conclusions: Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and to accelerate the transition from a project-based approach to integrated service delivery in line with the ‘Treatment 2.0’ initiative. PMID:23272730
Fujita, Masami; Poudel, Krishna C; Do, Thi Nhan; Bui, Duc Duong; Nguyen, Van Kinh; Green, Kimberly; Nguyen, Thi Minh Thu; Kato, Masaya; Jacka, David; Cao, Thi Thanh Thuy; Nguyen, Thanh Long; Jimba, Masamine
2012-12-29
Hunt, R.J.; Anderson, M.P.; Kelson, V.A.
1998-01-01
This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
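The far-field screening idea above rests on superposition: because the governing flow equation is linear (for constant transmissivity), heads from individual analytic elements can simply be added, and the resulting far-field heads imposed as boundary conditions on a grid model. The following is a minimal illustrative sketch only; the element types, function names, and all numerical values are invented for illustration and are far simpler than those in WhAEM-class codes.

```python
import math

def head_uniform_flow(x, y, q0, T, h0):
    # Uniform regional flow in the -x direction: head falls linearly with x.
    # q0 is discharge per unit width [m^2/d], T transmissivity [m^2/d].
    return h0 - q0 * x / T

def head_well(x, y, xw, yw, Qw, T, R):
    # Thiem solution: steady drawdown of one pumping well with an
    # assumed radius of influence R (illustrative closure assumption).
    r = max(math.hypot(x - xw, y - yw), 1e-6)  # avoid the singularity at the well
    return -Qw / (2.0 * math.pi * T) * math.log(R / r)

def head(x, y):
    # Superpose the analytic elements; the result can be sampled along the
    # perimeter of a finite-difference grid to supply boundary heads.
    T = 100.0
    h = head_uniform_flow(x, y, q0=0.5, T=T, h0=50.0)
    h += head_well(x, y, xw=200.0, yw=0.0, Qw=500.0, T=T, R=1000.0)
    return h

# Sample far-field heads at three boundary points of a hypothetical grid.
boundary = [head(x, 0.0) for x in (0.0, 100.0, 300.0)]
```

Heads decline toward the pumping well, as expected from the superposed drawdown; real analytic element models add line sinks, inhomogeneities, and many more elements in the same additive fashion.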
Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech
2012-12-01
To predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry, we conducted a comprehensive survey of purity methods and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable for dynamically assessing method performance characteristics based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results, utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitates the introduction of more advanced analytical technologies during the method lifecycle.
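The core UBCI idea, expressing uncertainty as a function of the signal and noise levels in an individual chromatogram, can be sketched as below. The functional form here is an invented first-order error propagation, not the published UBCI model; the function name, noise model, and all numbers are illustrative assumptions.

```python
import math

def purity_and_uncertainty(main_area, impurity_areas, noise_sd, n_points=25):
    """Illustrative sketch: purity (% area) and its uncertainty, driven
    only by signal levels and the baseline noise of one chromatogram."""
    total = main_area + sum(impurity_areas)
    purity = 100.0 * main_area / total
    # Assumed noise model: each integrated area accumulates baseline noise
    # (standard deviation noise_sd) over n_points sampled points.
    sd_area = noise_sd * math.sqrt(n_points)
    # First-order propagation for the ratio main/total (correlations ignored).
    rel = math.sqrt((sd_area / main_area) ** 2 + (sd_area / total) ** 2)
    return purity, purity * rel

# A chromatogram with one main peak and two small impurity peaks.
p, u = purity_and_uncertainty(main_area=9800.0,
                              impurity_areas=[150.0, 50.0],
                              noise_sd=2.0)
```

The point of the sketch is structural: uncertainty is computed per chromatogram from current signal and noise, rather than fixed once during a qualification study.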
A Comprehensive X-Ray Absorption Model for Atomic Oxygen
NASA Technical Reports Server (NTRS)
Gorczyca, T. W.; Bautista, M. A.; Hasoglu, M. F.; Garcia, J.; Gatuzz, E.; Kaastra, J. S.; Kallman, T. R.; Manson, S. T.; Mendoza, C.; Raassen, A. J. J.;
2013-01-01
An analytical formula is developed to accurately represent the photoabsorption cross section of atomic Oxygen for all energies of interest in X-ray spectral modeling. In the vicinity of the K edge, a Rydberg series expression is used to fit R-matrix results, including important orbital relaxation effects, that accurately predict the absorption oscillator strengths below threshold and merge consistently and continuously to the above-threshold cross section. Further, minor adjustments are made to the threshold energies in order to reliably align the atomic Rydberg resonances after consideration of both experimental and observed line positions. At energies far below or above the K-edge region, the formulation is based on both outer- and inner-shell direct photoionization, including significant shake-up and shake-off processes that result in photoionization-excitation and double-photoionization contributions to the total cross section. The ultimate purpose for developing a definitive model for oxygen absorption is to resolve standing discrepancies between the astronomically observed and laboratory-measured line positions, and between the inferred atomic and molecular oxygen abundances in the interstellar medium from XSTAR and SPEX spectral models.
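The below-threshold Rydberg-series fit described above is conventionally parameterized with a quantum defect; the schematic form below is a standard textbook parameterization stated for orientation, and is an assumption rather than the paper's exact formula:

```latex
E_n = E_K - \frac{\mathrm{Ry}}{(n-\mu_n)^2},
\qquad
f_n \propto \frac{1}{(n-\mu_n)^3},
\qquad n \to \infty,
```

so that the differential oscillator strength $df/dE$ remains finite as the series accumulates at the K-edge energy $E_K$, letting the discrete resonances merge continuously into the above-threshold photoionization cross section.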
Thermo-optical Modelling of Laser Matter Interactions in Selective Laser Melting Processes.
NASA Astrophysics Data System (ADS)
Vinnakota, Raj; Genov, Dentcho
Selective laser melting (SLM) is a promising advanced manufacturing technique that provides an ideal platform for manufacturing components with zero geometric constraints. Coupling the electromagnetic and thermodynamic processes involved in SLM into a comprehensive theoretical model is of great importance, since it can provide significant improvements in printing processes by revealing the optimal parametric space of applied laser power, scan velocity, powder material, layer thickness, and porosity. Here, we present a self-consistent thermo-optical model that simultaneously solves Maxwell's equations and the heat transfer equation, providing insight into the electromagnetic energy released in the powder beds and the concurrent thermodynamics of particle temperature rise and the onset of melting. The numerical calculations are compared with a developed analytical model of the SLM process, providing insight into the dynamics between laser-facilitated Joule heating and radiation-mitigated temperature rise. These results provide guidelines toward improved energy efficiency and optimization of SLM process scan rates. The current work is funded by the NSF EPSCoR CIMM project under award #OIA-1541079.
Historical review of missile aerodynamic developments
NASA Technical Reports Server (NTRS)
Spearman, M. Leroy
1989-01-01
A comprehensive development history to about 1970 is presented for missile technologies and their associated capabilities and difficulties. Attention is given to the growth of an experimental data base for missile design, as well as to the critical early efforts to develop analytical methods applicable to missiles. Most of the important missile development efforts made during the period from the end of the Second World War to the early 1960s were based primarily on experiences gained through wind tunnel and flight testing; analytical techniques began to demonstrate their usefulness in the design process only in the late 1960s.
Analytical challenges in sports drug testing.
Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas
2018-03-01
Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been of most diverse nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. In order to outline the variety of challenges that doping control laboratories are facing besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are thematized.
ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...
Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs) and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the benchmark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli
Comprehensive Analysis of Two Downburst-Related Aircraft Accidents
NASA Technical Reports Server (NTRS)
Shen, J.; Parks, E. K.; Bach, R. E.
1996-01-01
Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components F(sub 1) and F(sub 2), representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F(sub 1) was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy from both an analytical and pilot's standpoint was to hold constant nose-up pitch attitude while operating at maximum engine thrust.
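The F-factor decomposition used in the analysis above is commonly written as follows; this standard form from the wind-shear hazard literature is stated for orientation and is an assumption rather than a quotation from the paper:

```latex
F = F_1 + F_2,
\qquad
F_1 = \frac{\dot{W}_x}{g},
\qquad
F_2 = -\frac{w_h}{V},
```

where $\dot{W}_x$ is the rate of change of the horizontal wind along the flight path, $w_h$ the vertical wind velocity (negative in a downdraft), $g$ the gravitational acceleration, and $V$ the airspeed; a positive $F$ indicates degraded climb performance, consistent with the finding that $F_1$ dominated in both accidents.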
WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL
The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...
The use of analytical models in human-computer interface design
NASA Technical Reports Server (NTRS)
Gugerty, Leo
1993-01-01
Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
Cassidy, Liam; Prasse, Daniela; Linke, Dennis; Schmitz, Ruth A; Tholey, Andreas
2016-10-07
The recent discovery of an increasing number of small open reading frames (sORFs) creates the need for suitable analytical technologies for comprehensive identification of the corresponding gene products. For biological and functional studies, knowledge of the entire set of proteins and sORF gene products is essential. Consequently, in the present study we evaluated analytical approaches that allow simultaneous analysis of the widest possible part of the proteome together with the predicted sORFs. We performed a full proteome analysis of the cytosolic proteome of the methane-producing archaeon Methanosarcina mazei strain Gö1 using a high/low pH reversed phase LC-MS bottom-up approach. The second analytical approach was based on a semi-top-down strategy, encompassing separation at the intact protein level using a GelFree system, followed by digestion and LC-MS analysis. A high overlap in identified proteins was found for both approaches, yielding the most comprehensive coverage of the cytosolic proteome of this organism achieved so far. The application of the second approach, in combination with an adjustment of the search criteria for database searches, further led to a significant increase in sORF peptide identifications, finally allowing the detection and identification of 28 sORF gene products.
Validation of the replica trick for simple models
NASA Astrophysics Data System (ADS)
Shinzato, Takashi
2018-04-01
We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.
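The replica analytic continuation examined above rests on a standard identity, stated here for orientation:

```latex
\langle \ln Z \rangle
= \lim_{n \to 0} \frac{\langle Z^n \rangle - 1}{n}
= \lim_{n \to 0} \frac{1}{n} \ln \langle Z^n \rangle ,
```

where the quenched average $\langle Z^n \rangle$ is computed for integer $n$ (replicated copies of the system) and then analytically continued to real $n \to 0$; the validity of that continuation is precisely what the paper probes on solvable models.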
NASA Astrophysics Data System (ADS)
Mathieson, Haley Aaron
This thesis investigates, experimentally and analytically, the structural performance of sandwich panels composed of glass fibre reinforced polymer (GFRP) skins and a soft polyurethane foam core, with or without thin GFRP ribs connecting the skins. The study includes three main components: (a) out-of-plane bending fatigue, (b) axial compression loading, and (c) in-plane bending of sandwich beams. The fatigue studies included 28 specimens and looked into establishing service life (S-N) curves of sandwich panels without ribs, governed by soft-core shear failure, and of ribbed panels governed by failure at the rib-skin junction. Additionally, the study compared fatigue life curves of sandwich panels loaded under fully reversed bending conditions (R=-1) with panels cyclically loaded in one direction only (R=0), and established the stiffness degradation characteristics throughout their fatigue life. Mathematical models expressing fatigue life and stiffness degradation curves were calibrated, and expanded forms for various loading ratios were developed. Approximate fatigue thresholds of 37% and 23% were determined for non-ribbed panels loaded at R=0 and -1, respectively. Digital imaging techniques showed that shear contributed significantly (90%) to deflections when no ribs were used. The axial loading work included 51 specimens and examined the behavior of panels of various lengths (slenderness ratios) and skin thicknesses, as well as panels of similar length with various rib configurations. The governing failure modes observed were global buckling, skin wrinkling, and skin crushing. The in-plane bending work involved testing 18 sandwich beams of various shear span-to-depth ratios and skin thicknesses, which failed by skin wrinkling on the compression side.
The analytical modeling components for axially loaded panels include: a simple design-oriented analytical failure model, and a robust non-linear model capable of predicting the full load-displacement response of axially loaded slender sandwich panels, accounting for P-Delta effects, an inherent out-of-straightness profile of any shape at initial conditions, and the excessive shear deformation of the soft core and its effect on buckling capacity. Another model was developed to predict the load-deflection response and failure modes of in-plane loaded sandwich beams. After successful verification of the models against experimental results, comprehensive parametric studies were carried out using these models to cover parameters beyond the limitations of the experimental program.
Zheng, Xiasheng; Zhang, Peng; Liao, Baosheng; Li, Jing; Liu, Xingyun; Shi, Yuhua; Cheng, Jinle; Lai, Zhitian; Xu, Jiang; Chen, Shilin
2017-01-01
Herbal medicine is a major component of complementary and alternative medicine, contributing significantly to the health of many people and communities. Quality control of herbal medicine is crucial to ensure that it is safe and sound for use. Here, we investigated a comprehensive quality evaluation system for a classic herbal medicine, Danggui Buxue Formula, by applying genetic-based and analytical chemistry approaches to authenticate and evaluate the quality of its samples. For authenticity, we successfully applied two novel technologies, third-generation sequencing and PCR-DGGE (denaturing gradient gel electrophoresis), to analyze the ingredient composition of the tested samples. For quality evaluation, we used high performance liquid chromatography assays to determine the content of chemical markers to help estimate the dosage relationship between its two raw materials, plant roots of Huangqi and Danggui. A series of surveys were then conducted against several exogenous contaminations, aiming to further assess the efficacy and safety of the samples. In conclusion, the quality evaluation system demonstrated here can potentially address the authenticity, quality, and safety of herbal medicines, thus providing novel insight for enhancing their overall quality control. Highlight: We established a comprehensive quality evaluation system for herbal medicine, combining two genetic-based approaches, third-generation sequencing and DGGE (denaturing gradient gel electrophoresis), with analytical chemistry approaches to achieve the authentication and quality connotation of the samples. PMID:28955365
Liu, Yan-Chun; Xiao, Sa; Yang, Kun; Ling, Li; Sun, Zhi-Liang; Liu, Zhao-Ying
2017-06-01
This study reports an analytical strategy for the comprehensive identification and structure characterization of target components from Gelsemium elegans by using high-performance liquid chromatography quadrupole time-of-flight mass spectrometry (LC-QqTOF MS), based on accurate mass databases combined with MS/MS spectra. The databases created included accurate masses and elemental compositions of 204 components from Gelsemium and their structural data. The accurate MS and MS/MS spectra were acquired through a data-dependent auto MS/MS mode, followed by extraction of the potential compounds from the LC-QqTOF MS raw data of the sample; these were matched against the databases to search for targeted components. The structures of detected components were tentatively characterized by manually interpreting the accurate MS/MS spectra for the first time. A total of 57 components were successfully detected and structurally characterized from the crude extracts of G. elegans, although some isomers could not be differentiated. This analytical strategy is generic and efficient, avoids isolation and purification procedures, enables comprehensive structure characterization of target components of Gelsemium, and should be widely applicable to complicated mixtures derived from Gelsemium preparations. Copyright © 2017 John Wiley & Sons, Ltd.
MetaMetaDB: A Database and Analytic System for Investigating Microbial Habitability
Yang, Ching-chia; Iwasaki, Wataru
2014-01-01
MetaMetaDB (http://mmdb.aori.u-tokyo.ac.jp/) is a database and analytic system for investigating microbial habitability, i.e., how a prokaryotic group can inhabit different environments. The interaction between prokaryotes and the environment is a key issue in microbiology because distinct prokaryotic communities maintain distinct ecosystems. Because 16S ribosomal RNA (rRNA) sequences play pivotal roles in identifying prokaryotic species, a system that comprehensively links diverse environments to 16S rRNA sequences of the inhabitant prokaryotes is necessary for a systematic understanding of microbial habitability. However, existing databases are biased toward culturable prokaryotes and are limited in comprehensiveness because most prokaryotes are unculturable. Recently, metagenomic and 16S rRNA amplicon sequencing approaches have generated abundant 16S rRNA sequence data that encompass unculturable prokaryotes across diverse environments; however, these data are usually buried in large databases and are difficult to access. In this study, we developed MetaMetaDB (Meta-Metagenomic DataBase), which comprehensively and compactly covers 16S rRNA sequences retrieved from public datasets. Using MetaMetaDB, users can quickly generate hypotheses regarding the types of environments a prokaryotic group may be adapted to. We anticipate that MetaMetaDB will improve our understanding of the diversity and evolution of prokaryotes. PMID:24475242
Detailed study of oxidation/wear mechanism in lox turbopump bearings
NASA Technical Reports Server (NTRS)
Chase, T. J.; Mccarty, J. P.
1993-01-01
Wear of 440C angular contact ball bearings of the phase 2 high pressure oxygen turbopump (HPOTP) of the space shuttle main engine (SSME) has been studied by means of various advanced nondestructive techniques (NDT) and modeled with reference to all known material, design, and operation variables. Three modes were found to dominate the wear scenario: adhesive/shear peeling (ASP), oxidation, and abrasion. Bearing wear was modeled in terms of these three modes. Since a comprehensive theory of rolling contact wear is not yet available, each mode is modeled after well-established theories of sliding wear, while sliding velocity and distance are related to microsliding in ball-to-ring contacts. Microsliding, stress, temperature, and other contact variables are evaluated with the analytical software packages SHABERTH(TM)/SINDA(TM) and ADORE(TM). Empirical constants for the models are derived by applying the models to NIST wear data. The bearing wear model so established predicts the average ball wear rate for the HPOTP bearings quite well. The wear rate has been statistically determined for the entire population of flight and development bearings based on Rocketdyne records to date. Numerous illustrations are given.
Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.
2016-01-01
Background: A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings: Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data.
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson’s disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Conclusions Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson’s disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer’s, Huntington’s, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications. PMID:27494614
Little, Callie W; Haughbrook, Rasheda; Hart, Sara A
2016-01-01
Numerous twin studies have been published examining the genetic and environmental etiology of reading comprehension, though the etiological estimates may be influenced by currently unidentified sample conditions (e.g., Tucker-Drob & Bates, 2015). The purpose of the current meta-analysis was to average the etiological influences on reading comprehension and to explore potential moderators that may be influencing these estimates. Results revealed an average heritability estimate of h2 = .59, with significant variation in estimates across studies, suggesting potential moderation. Heritability was moderated by publication year, grade level, project, zygosity determination method, and response type. The average shared environmental estimate was c2 = .16, with publication year, grade level and zygosity determination method acting as significant moderators. These findings support the large role of genetic influences on reading comprehension, and a small but significant role of shared environmental influences. The significant moderators of etiological influences within the current synthesis suggest that our interpretation of how genes and environment influence reading comprehension should reflect aspects of study and sample. PMID:27630039
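The averaging of heritability estimates across studies, as described above, is typically done with a random-effects meta-analysis. As an illustration only (the h2 values and sampling variances below are invented, not the studies' data), a DerSimonian-Laird pooled average can be sketched as:

```python
# Hypothetical sketch of a DerSimonian-Laird random-effects average,
# a standard way to pool per-study estimates such as heritabilities.
# Inputs are illustrative, not taken from the meta-analysis above.
def dl_pool(estimates, variances):
    """Pool study estimates with DerSimonian-Laird random effects."""
    k = len(estimates)
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                   # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    return sum(wi * e for wi, e in zip(w_re, estimates)) / sum(w_re)

# Three made-up heritability estimates with their sampling variances:
pooled = dl_pool([0.55, 0.62, 0.60], [0.01, 0.02, 0.015])
```

Moderator effects such as publication year or grade level would then be tested by meta-regression on top of this pooled model.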
An analytic performance model of disk arrays and its application
NASA Technical Reports Server (NTRS)
Lee, Edward K.; Katz, Randy H.
1991-01-01
As disk arrays become widely used, tools for understanding and analyzing their performance become increasingly important. In particular, performance models can be invaluable in both configuring and designing disk arrays. Accurate analytic performance models are desirable over other types of models because they can be quickly evaluated, are applicable under a wide range of system and workload parameters, and can be manipulated by a range of mathematical techniques. Unfortunately, analytical performance models of disk arrays are difficult to formulate due to the presence of queuing and fork-join synchronization; a disk array request is broken up into independent disk requests which must all complete to satisfy the original request. We develop, validate, and apply an analytic performance model for disk arrays. We derive simple equations for approximating their utilization, response time, and throughput. We then validate the analytic model via simulation and investigate the accuracy of each approximation used in deriving the analytical model. Finally, we apply the analytical model to derive an equation for the optimal unit of data striping in disk arrays.
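The fork-join difficulty noted above can be made concrete with a toy calculation. This is not the paper's model; it is a generic sketch assuming each disk behaves as an independent M/M/1 queue, in which case a single queue's response time is exponential with rate mu - lambda, and the expected maximum over k sibling requests scales with the harmonic number H_k:

```python
# Illustrative sketch (not the paper's actual disk-array model):
# a striped request forks into k disk requests; the original request
# completes only when the slowest sibling finishes (join).
def mm1_response(lam, mu):
    """Mean response time of one M/M/1 disk queue (requires lam < mu)."""
    assert lam < mu, "queue must be stable"
    return 1.0 / (mu - lam)

def fork_join_response(lam, mu, k):
    """Approximate fork-join time as the expected max of k independent
    exponential response times: H_k / (mu - lam)."""
    h_k = sum(1.0 / i for i in range(1, k + 1))   # harmonic number H_k
    return h_k * mm1_response(lam, mu)
```

The independence assumption ignores the queueing correlation between sibling requests, which is exactly what makes exact analysis of disk arrays hard.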
Direct Laser Writing of Single-Material Sheets with Programmable Self-Rolling Capability
NASA Astrophysics Data System (ADS)
Bauhofer, Anton; Krödel, Sebastian; Bilal, Osama; Daraio, Chiara; Constantinescu, Andrei
Direct laser writing, a sub-class of two-photon polymerization, facilitates 3D printing of single-material microstructures with inherent residual stresses. Here we show that controlled distribution of these stresses allows fast and cost-effective fabrication of structures with programmable self-rolling capability. We investigate 2D sheets that evolve into versatile 3D structures. Precise control over the shape-morphing potential is acquired through variations in geometry and writing parameters. Effects of capillary action and gravity were shown to be relevant for very thin sheets (thickness < 1.5 μm) and have been analytically and experimentally quantified. In contrast, the deformations of thicker sheets (> 1.5 μm) are dominated by residual stresses and adhesion forces. The presented structures create local tensions up to 180 MPa, causing rolling curvatures of up to 2.5 × 10^4 m^-1. A comprehensive analytical model that captures the relevant influencing factors was developed based on laminate plate theory. The predicted curvature and directionality correspond well with the experimentally obtained data. Potential applications are found in drug encapsulation and particle traps for emulsions with differing surface energies. This work was supported by the Swiss National Science Foundation.
A comprehensive test of clinical reasoning for medical students: An olympiad experience in Iran.
Monajemi, Alireza; Arabshahi, Kamran Soltani; Soltani, Akbar; Arbabi, Farshid; Akbari, Roghieh; Custers, Eugene; Hadadgar, Arash; Hadizadeh, Fatemeh; Changiz, Tahereh; Adibi, Peyman
2012-01-01
Although some tests for clinical reasoning assessment are now available, theories of medical expertise have not played a major role in this field. In this paper, illness script theory was chosen as a theoretical framework, and contemporary clinical reasoning tests were assembled based on this theoretical model. This paper is a qualitative study performed with an action research approach. This style of research is performed in a context where authorities focus on promoting their organizations' performance and is carried out as teamwork, called participatory research. Results are presented in four parts: basic concepts, clinical reasoning assessment, test framework, and scoring. We concluded that no single test could thoroughly assess clinical reasoning competency, and therefore a battery of clinical reasoning tests is needed. This battery should cover all three parts of the clinical reasoning process: script activation, selection and verification. In addition, both analytical and non-analytical reasoning, and both diagnostic and management reasoning, should be given even consideration in this battery. This paper explains the process of designing and implementing the battery of clinical reasoning tests in the Olympiad for medical sciences students through action research.
Capillary Flow in an Interior Corner
NASA Technical Reports Server (NTRS)
Weislogel, Mark Milton
1996-01-01
The design of fluids management processes in the low-gravity environment of space requires an accurate model and description of capillarity-controlled flow in containers of irregular geometry. Here we consider the capillary rise of a fluid along an interior corner of a container following a rapid reduction in gravity. The analytical portion of the work presents an asymptotic formulation in the limit of a slender fluid column, slight surface curvature along the corner, small inertia, and low gravity. New similarity solutions are found and a list of closed form expressions is provided for flow rate and column length. In particular, it is found that the flow is proportional to t^(1/2) for a constant height boundary condition, t^(2/5) for a spreading drop, and t^(3/5) for constant flow. In the experimental portion of the work, measurements from a 2.2 s drop tower are reported. An extensive data set, collected over a previously unexplored range of flow parameters, includes estimates of repeatability and accuracy, the role of inertia and column slenderness, and the effects of corner angle, container geometry, and fluid properties. Comprehensive comparisons are made which illustrate the applicability of the analytic results to low-g fluid systems design.
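The similarity-solution power laws quoted above can be encoded directly. The prefactors below are placeholders (set to 1) and the function is purely illustrative; only the exponents come from the abstract:

```python
# Illustrative encoding of the scaling exponents quoted in the abstract:
# the capillary-rise quantity grows as t^(1/2), t^(2/5), or t^(3/5)
# depending on the boundary condition. Prefactors are set to 1 here.
def column_length(t, condition):
    """Power-law growth with a placeholder unit prefactor."""
    exponents = {
        "constant_height": 0.5,   # ~ t^(1/2)
        "spreading_drop": 0.4,    # ~ t^(2/5)
        "constant_flow": 0.6,     # ~ t^(3/5)
    }
    return t ** exponents[condition]

# Doubling the elapsed time multiplies the result by 2^exponent,
# e.g. a factor of about 1.41 for the constant-height case.
growth_factor = column_length(2.0, "constant_height") / column_length(1.0, "constant_height")
```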
The neurocognitive consequences of sleep restriction: A meta-analytic review.
Lowe, Cassandra J; Safati, Adrian; Hall, Peter A
2017-09-01
The current meta-analytic review evaluated the effects of experimentally manipulated sleep restriction on neurocognitive functioning. Random-effects models were employed to estimate the overall effect size and the differential effect size across cognitive domains. Age, time of day, age-adjusted sleep deficit, cumulative days of restricted sleep, sleep latency, subjective sleepiness, and biological sex were examined as potential moderators of the effect. Based on a sample of 61 studies, from 71 different populations, findings revealed a significant negative effect of sleep restriction on cognitive processing across cognitive domains (g=-0.383, p<0.001). This effect held for executive functioning (g=-0.324, p<0.001), sustained attention (g=-0.409, p<0.001), and long-term memory (g=-0.192, p=0.002). There was insufficient evidence to detect an effect within the domains of attention, multitask, impulsive decision-making or intelligence. Age group, time of day, cumulative days of restricted sleep, sleep latency, subjective sleepiness, and biological sex were all significant moderators of the overall effect. In conclusion, the current meta-analysis is the first comprehensive review to provide evidence that short-term sleep restriction significantly impairs waking neurocognitive functioning. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Yuhang; Zhang, Jianpeng; Xing, Yufeng; Song, Jizhou
2018-05-01
Epidermal electronic devices (EEDs) have similar mechanical properties to those of human skin, such that they can be integrated with human skin for potential applications in monitoring of human vital signs for diagnostic, therapeutic or surgical functions. Thermal management is critical for EEDs in these applications since excessive heating may cause discomfort. In this paper, comprehensive analytical studies, finite element analysis and experiments are carried out to study the effects of interfacial thermal resistance between EEDs and human skin on the thermal properties of the EED/skin system. The coupling between the Fourier heat transfer in EEDs and the bio-heat transfer in human skin is accounted for in the analytical model, based on the transfer matrix method, to give accurate predictions of temperatures, which agree well with finite element analysis and experimental measurements. It is shown that the maximum temperature increase of the EED for the case of imperfect bonding between EED and skin is much higher than that of perfect bonding. These results may help the design of EEDs in bio-integrated applications and suggest a valuable route to evaluate the bonding condition between EEDs and biological tissues.
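The transfer matrix method mentioned above can be illustrated in its simplest steady-state 1D form, where each layer contributes a 2x2 matrix relating temperature and heat flux, and layers in series combine like thermal resistors. The layer thicknesses and conductivities below are invented for illustration and are not the paper's values:

```python
# Minimal steady-state 1D sketch of the transfer-matrix idea
# (the paper's model is transient and includes bio-heat transfer,
# which is not reproduced here).
def layer_matrix(thickness, conductivity):
    """2x2 matrix relating (T, q) across one layer at steady state:
    T_out = T_in - q * thickness / conductivity, q_out = q_in."""
    return [[1.0, -thickness / conductivity],
            [0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def stack_matrix(layers):
    """Compose layer matrices; layers listed from input side outward."""
    m = [[1.0, 0.0], [0.0, 1.0]]
    for th, k in layers:
        m = matmul(layer_matrix(th, k), m)
    return m

# Two invented layers: the off-diagonal term of the composed matrix
# equals minus the total series thermal resistance per unit area.
m = stack_matrix([(1e-3, 0.2), (2e-3, 0.5)])
total_resistance = -m[0][1]   # 1e-3/0.2 + 2e-3/0.5 = 0.009 K*m^2/W
```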
Boeing Smart Rotor Full-scale Wind Tunnel Test Data Report
NASA Technical Reports Server (NTRS)
Kottapalli, Sesi; Hagerty, Brandon; Salazar, Denise
2016-01-01
A full-scale helicopter smart material actuated rotor technology (SMART) rotor test was conducted in the USAF National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel at NASA Ames. The SMART rotor system is a five-bladed MD 902 bearingless rotor with active trailing-edge flaps. The flaps are actuated using piezoelectric actuators. Rotor performance, structural loads, and acoustic data were obtained over a wide range of rotor shaft angles of attack, thrust, and airspeeds. The primary test objective was to acquire unique validation data for the high-performance computing analyses developed under the Defense Advanced Research Projects Agency (DARPA) Helicopter Quieting Program (HQP). Other research objectives included quantifying the ability of the on-blade flaps to achieve vibration reduction, rotor smoothing, and performance improvements. This data set of rotor performance and structural loads can be used for analytical and experimental comparison studies with other full-scale rotor systems and for analytical validation of computer simulation models. The purpose of this final data report is to document a comprehensive, high-quality data set that includes only data points where the flap was actively controlled and each of the five flaps behaved in a similar manner.
Electromembrane extraction--three-phase electrophoresis for future preparative applications.
Gjelstad, Astrid; Pedersen-Bjergaard, Stig
2014-09-01
The purpose of this article is to discuss the principle and the future potential of electromembrane extraction (EME). EME was presented in 2006 as a totally new sample preparation technique for ionized target analytes, based on electrokinetic migration across a supported liquid membrane under the influence of an external electrical field. The principle of EME is presented, and typical performance data for EME are discussed. Most work with EME to date has been performed with low-molecular-weight pharmaceutical substances as model analytes, but the principles of EME should be developed in other directions in the future to fully explore its potential. Recent research in new directions is critically reviewed, with focus on extraction of different types of chemical and biochemical substances, new separation possibilities, new approaches, and challenges related to mass transfer and background current. The intention of this critical review is to give a flavor of EME and to stimulate more research in the area. Unlike other review articles, the current one is less comprehensive but puts more emphasis on new directions for EME. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Multi-scaling allometric analysis for urban and regional development
NASA Astrophysics Data System (ADS)
Chen, Yanguang
2017-01-01
The concept of allometric growth is based on scaling relations, and it has long been applied to urban and regional analysis. However, most allometric analyses have been devoted to the single proportional relation between two elements of a geographical system; few studies have focused on the allometric scaling of multiple elements. In this paper, a process of multiscaling allometric analysis is developed for studying the spatio-temporal evolution of complex systems. By means of linear algebra and general system theory, and by analogy with the analytic hierarchy process, the concepts of allometric growth can be integrated with ideas from fractal dimension, yielding a new methodology of geo-spatial analysis and the related theoretical models. Based on least squares regression and matrix operations, a simple algorithm is proposed to solve the multiscaling allometric equation. Applying the analytical method of multielement allometry to Chinese cities and regions yields satisfactory results. A conclusion is reached that multiscaling allometric analysis can be employed to make a comprehensive evaluation of the relative levels of urban and regional development and to explain spatial heterogeneity. The notion of multiscaling allometry may enrich the current theory and methodology of spatial analysis of urban and regional evolution.
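The least-squares step underlying allometric analysis can be sketched for the simplest two-element case (not the paper's full multiscaling matrix method): the exponent b in y = a*x^b is the slope of a log-log regression. The data below are synthetic:

```python
# Minimal sketch of recovering an allometric scaling exponent by
# ordinary least squares on log-transformed data. The x/y values are
# synthetic, generated to follow y = 2 * x^0.85 exactly.
import math

def allometric_exponent(x, y):
    """Least-squares slope of log(y) regressed on log(x)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

xs = [1.0, 2.0, 4.0, 8.0, 16.0]
ys = [2.0 * v ** 0.85 for v in xs]
b = allometric_exponent(xs, ys)   # recovers the exponent 0.85
```

The multiscaling case generalizes this pairwise regression to a matrix of mutual scaling exponents among many elements.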
NASA Astrophysics Data System (ADS)
Chen, J. S.; Chiang, S. Y.; Liang, C. P.
2017-12-01
It is essential to develop multispecies transport analytical models, based on a set of advection-dispersion equations (ADEs) coupled with sequential first-order decay reactions, for the synchronous prediction of plume migration of both parent and daughter species of decaying contaminants such as radionuclides, dissolved chlorinated organic compounds, pesticides and nitrogen. Although several analytical models for multispecies transport have already been reported, those currently available in the literature have primarily been derived based on ADEs with constant dispersion coefficients. However, a number of studies have demonstrated that dispersion coefficients increase with solute travel distance as a consequence of variation in the hydraulic properties of the porous media. This study presents novel analytical models for multispecies transport with distance-dependent dispersion coefficients. The correctness of the derived analytical models is confirmed by comparing them against numerical models; results show perfect agreement. A comparison of our new analytical model for multispecies transport with scale-dependent dispersion to an analytical model with constant dispersion illustrates the effects of the dispersion coefficients on the multispecies transport of decaying contaminants.
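A reduced sketch of the sequential first-order decay chain: in the pure-advection limit (no dispersion), the coupled ADEs along a streamline reduce to the Bateman equations. This two-species version is illustrative only; the models described above additionally include (distance-dependent) dispersion, which is not reproduced here:

```python
# Bateman solution for a two-member decay chain, parent C1 -> daughter C2,
# evaluated at travel time t along a streamline (zero-dispersion limit).
# Rate constants and initial concentration below are invented.
import math

def bateman_two_species(c1_0, k1, k2, t):
    """Concentrations (C1, C2) after time t; requires k1 != k2."""
    assert k1 != k2, "degenerate case k1 == k2 needs a separate formula"
    c1 = c1_0 * math.exp(-k1 * t)
    c2 = c1_0 * k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))
    return c1, c2
```

The daughter concentration starts at zero, grows as the parent decays, and eventually decays itself, which is the characteristic coupled behavior the analytical transport models must capture.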
Tsao, C C; Liou, J U; Wen, P H; Peng, C C; Liu, T S
2013-01-01
Aim: To develop analytical models and analyse the stress distribution and flexibility of nickel-titanium (NiTi) instruments subject to bending forces. Methodology: An analytical method was used to analyse the behaviour of NiTi instruments under bending forces. Two NiTi instruments (RaCe and Mani NRT) with different cross-sections and geometries were considered. Analytical results were derived from Euler-Bernoulli nonlinear differential equations that took into account the screw pitch variation of these instruments. In addition, nonlinear deformation analyses based on both the analytical model and the finite element method were carried out. Results: According to the analytical results, the maximum curvature of the instrument occurs near the instrument tip. The finite element analysis likewise placed the maximum von Mises stress near the instrument tip. Therefore, the proposed analytical model can be used to predict the position of maximum curvature, where fracture may occur. The analytical and numerical results were compatible. Conclusion: The proposed analytical model, validated by the numerical results in analysing the bending deformation of NiTi instruments, is useful in the design and analysis of instruments and effective in studying their flexibility. Compared with the finite element method, the analytical model deals conveniently and effectively with the bending behaviour of rotary NiTi endodontic instruments. PMID:23173762
New analysis of ηπ tensor resonances measured at the COMPASS experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackura, A.; Fernández-Ramírez, C.; Mikhasenko, M.
We present a new amplitude analysis of the ηπ D-wave in the reaction π⁻p → ηπ⁻p measured by COMPASS. Employing an analytical model based on the principles of the relativistic S-matrix, we find two resonances that can be identified with the a₂(1320) and the excited a₂′(1700), and perform a comprehensive analysis of their pole positions. For the mass and width of the a₂ we find M = (1307 ± 1 ± 6) MeV and Γ = (112 ± 1 ± 8) MeV, and for the excited state a₂′ we obtain M = (1720 ± 10 ± 60) MeV and Γ = (280 ± 10 ± 70) MeV, respectively.
Correlation between tunability and anisotropy in magnetoelectric voltage tunable inductor (VTI).
Yan, Yongke; Geng, Liwei D; Zhang, Lujie; Gao, Xiangyu; Gollapudi, Sreenivasulu; Song, Hyun-Cheol; Dong, Shuxiang; Sanghadasa, Mohan; Ngo, Khai; Wang, Yu U; Priya, Shashank
2017-11-22
Electric field modulation of magnetic properties via magnetoelectric coupling in composite materials is of fundamental and technological importance for realizing tunable, energy-efficient electronics. Here we provide a foundational analysis of a magnetoelectric voltage tunable inductor (VTI) that exhibits extremely large inductance tunability of up to 1150% under moderate electric fields. This field dependence of inductance arises from the change of permeability, which correlates with the stress dependence of magnetic anisotropy. Through a combination of analytical models validated by experimental results, a comprehensive understanding of the effect of various anisotropies on the tunability of VTI is provided. Results indicate that inclusion of magnetic materials with low magnetocrystalline anisotropy is one of the most effective ways to achieve high VTI tunability. This study opens a pathway toward the design of tunable circuit components that exhibit field-dependent electronic behavior.
New analysis of ηπ tensor resonances measured at the COMPASS experiment
Jackura, A.; Fernández-Ramírez, C.; Mikhasenko, M.; ...
2018-02-27
We present a new amplitude analysis of the ηπ D-wave in the reaction π⁻p → ηπ⁻p measured by COMPASS. Employing an analytical model based on the principles of the relativistic S-matrix, we find two resonances that can be identified with the a₂(1320) and the excited a₂′(1700), and perform a comprehensive analysis of their pole positions. For the mass and width of the a₂ we find M = (1307 ± 1 ± 6) MeV and Γ = (112 ± 1 ± 8) MeV, and for the excited state a₂′ we obtain M = (1720 ± 10 ± 60) MeV and Γ = (280 ± 10 ± 70) MeV, respectively.
NASA Technical Reports Server (NTRS)
Chen, C. P.
1990-01-01
An existing Computational Fluid Dynamics code for simulating complex turbulent flows inside a liquid rocket combustion chamber was validated and further developed. The Advanced Rocket Injector/Combustor Code (ARICC) is simplified and validated against benchmark flow situations for laminar and turbulent flows. The numerical method used in the ARICC code is re-examined for incompressible flow calculations. For turbulent flows, both the subgrid and the two-equation k-epsilon turbulence models are studied. Cases tested include the idealized Burgers equation in complex geometries and boundaries, a laminar pipe flow, a high Reynolds number turbulent flow, and a confined coaxial jet with recirculation. The accuracy of the algorithm is examined by comparing the numerical results with analytical solutions as well as experimental data at different grid sizes.
An ultrasensitive universal detector based on neutralizer displacement
NASA Astrophysics Data System (ADS)
Das, Jagotamoy; Cederquist, Kristin B.; Zaragoza, Alexandre A.; Lee, Paul E.; Sargent, Edward H.; Kelley, Shana O.
2012-08-01
Diagnostic technologies that can provide the simultaneous detection of nucleic acids for gene expression, proteins for host response and small molecules for profiling the human metabolome will have a significant advantage in providing comprehensive patient monitoring. Molecular sensors that report changes in the electrostatics of a sensor's surface on analyte binding have shown unprecedented sensitivity in the detection of charged biomolecules, but do not lend themselves to the detection of small molecules, which do not carry significant charge. Here, we introduce the neutralizer displacement assay that allows charge-based sensing to be applied to any class of molecule irrespective of the analyte charge. The neutralizer displacement assay starts with an aptamer probe bound to a neutralizer. When analyte binding occurs the neutralizer is displaced, which results in a dramatic change in the surface charge for all types of analytes. We have tested the sensitivity, speed and specificity of this system in the detection of a panel of molecules: (deoxy)ribonucleic acid, ribonucleic acid, cocaine, adenosine triphosphate and thrombin.
Roca, M; Leon, N; Pastor, A; Yusà, V
2014-12-29
In this study we propose an analytical strategy that combines a target approach for the quantitative analysis of contemporary pesticide metabolites with a comprehensive post-target screening for the identification of biomarkers of exposure to environmental contaminants in urine, using liquid chromatography coupled to high-resolution mass spectrometry (LC-HRMS). The quantitative method for the target analysis of 29 urinary metabolites of organophosphate (OP) insecticides, synthetic pyrethroids, herbicides and fungicides was validated after statistical optimization of the main factors governing ion source ionization and a fragmentation study using the higher-energy collisional dissociation (HCD) cell. The full-scan accurate mass data were acquired at a resolving power of 50,000 FWHM (scan speed 2 Hz), in both ESI+ and ESI- modes, with and without HCD fragmentation. The method LOQ was lower than 3.2 μg L-1 for the majority of the analytes. For post-target screening, a customized theoretical database was built for the identification of 60 metabolites, including pesticides, PAHs, phenols, and other metabolites of environmental pollutants. For identification purposes, accurate mass with less than 5 ppm error, together with diagnostic ions including isotopes and/or fragments, was used. The analytical strategy was applied to 20 urine samples collected from children living in the Valencia Region. Eleven target metabolites were detected, with concentrations ranging from 1.18 to 131 μg L-1. Likewise, several compounds belonging to the families of phthalates, phenols and parabens were tentatively identified in the post-target analysis. The proposed strategy is suitable for the determination of target pesticide biomarkers in urine in the framework of biomonitoring studies, and appropriate for the identification of other non-target metabolites.
Zill, Oliver A.; Sebisanovic, Dragan; Lopez, Rene; Blau, Sibel; Collisson, Eric A.; Divers, Stephen G.; Hoon, Dave S. B.; Kopetz, E. Scott; Lee, Jeeyun; Nikolinakos, Petros G.; Baca, Arthur M.; Kermani, Bahram G.; Eltoukhy, Helmy; Talasaz, AmirAli
2015-01-01
Next-generation sequencing of cell-free circulating solid tumor DNA addresses two challenges in contemporary cancer care. First, this method of massively parallel and deep sequencing enables assessment of a comprehensive panel of genomic targets from a single sample; second, it obviates the need for repeat invasive tissue biopsies. Digital Sequencing™ is a novel method for high-quality sequencing of circulating tumor DNA simultaneously across a comprehensive panel of over 50 cancer-related genes with a simple blood test. Here we report the analytic and clinical validation of the gene panel. Analytic sensitivity down to 0.1% mutant allele fraction is demonstrated via serial dilution studies of known samples. Near-perfect analytic specificity (>99.9999%) enables complete coverage of many genes without the false positives typically seen with traditional sequencing assays at mutant allele fractions below 5%. We compared digital sequencing of plasma-derived cell-free DNA to tissue-based sequencing on 165 consecutive matched samples from five outside centers in patients with stage III-IV solid tumor cancers. Clinical sensitivity of plasma-derived NGS was 85.0%, comparable to 80.7% sensitivity for tissue. The assay success rate on 1,000 consecutive samples in clinical practice was 99.8%. Digital sequencing of plasma-derived DNA is indicated in advanced cancer patients to prevent repeated invasive biopsies when the initial biopsy is inadequate, unobtainable for genomic testing, or uninformative, or when the patient's cancer has progressed despite treatment. Its clinical utility is derived from reduction in the costs, complications and delays associated with invasive tissue biopsies for genomic testing. PMID:26474073
Simulating Activities: Relating Motives, Deliberation and Attentive Coordination
NASA Technical Reports Server (NTRS)
Clancey, William J.; Clancy, Daniel (Technical Monitor)
2002-01-01
Activities are located behaviors, taking time, conceived as socially meaningful, and usually involving interaction with tools and the environment. In modeling human cognition as a form of problem solving (goal-directed search and operator sequencing), cognitive science researchers have not adequately studied "off-task" activities (e.g., waiting), non-intellectual motives (e.g., hunger), sustaining a goal state (e.g., playful interaction), and coupled perceptual-motor dynamics (e.g., following someone). These aspects of human behavior have been considered in bits and pieces in past research, identified as scripts, human factors, behavior settings, ensemble, flow experience, and situated action. More broadly, activity theory provides a comprehensive framework relating motives, goals, and operations. This paper ties these ideas together, using examples from work life in a Canadian High Arctic research station. The emphasis is on simulating human behavior as it naturally occurs, such that "working" is understood as an aspect of living. The result is a synthesis of previously unrelated analytic perspectives and a broader appreciation of the nature of human cognition. Simulating activities in this comprehensive way is useful for understanding work practice, promoting learning, and designing better tools, including human-robot systems.
NASA Astrophysics Data System (ADS)
Ivanova, Bojidarka B.; Spiteller, Michael
2012-09-01
A comprehensive screening of fifteen functionalized ergot alkaloids bearing bulky aliphatic cyclic substituents at the D-ring of the ergoline skeleton was performed, studying their structure-activity relationships and model interactions with α2A-adrenergic, serotonin (5HT2A), and dopamine D3 (D3A) receptors. The observed high affinity for the receptor binding loops and the unusual bonding situations, together with the molecular flexibility of the substituents and the presence of proton-accepting/donating functional groups in the studied alkaloids, may contribute to a further understanding of the mechanisms of biological activity in vivo and to predicting their therapeutic potential in the central nervous system (CNS), including applications related to schizophrenia. The presented correlation between molecular structure and properties is based on a comprehensive theoretical, computational, and experimental physical study of the successfully isolated derivatives, obtained through routine synthetic pathways in relatively high yields. These derivatives are therefore promising candidates for further experimental and theoretical study in areas such as: (a) pharmacological and clinical testing; (b) molecular drug design of novel psychoactive substances; and (c) development of analytical protocols for the determination of ergot alkaloids through functionalization of the ergoline skeleton.
An analytical method for free vibration analysis of functionally graded beams with edge cracks
NASA Astrophysics Data System (ADS)
Wei, Dong; Liu, Yinghua; Xiang, Zhihai
2012-03-01
In this paper, an analytical method is proposed for solving the free vibration of cracked functionally graded material (FGM) beams with axial loading, rotary inertia, and shear deformation. The governing differential equations of motion for an FGM beam are established and the corresponding solutions are found first. The discontinuity of rotation caused by the cracks is simulated by means of the rotational spring model. Then, based on the transfer matrix method, a recurrence formula is developed to obtain the eigenvalue equations for the free vibration of FGM beams. The main advantage of the proposed method is that the eigenvalue equation for vibrating beams with an arbitrary number of cracks can be conveniently determined from a third-order determinant. Owing to the decrease in determinant order compared with previous methods, the developed method is simpler and more convenient for analytically solving the free vibration problem of cracked FGM beams. Moreover, free vibration analyses of Euler-Bernoulli and Timoshenko beams with any number of cracks can be conducted using a unified procedure based on the developed method. These advantages become more pronounced as the number of cracks increases. A comprehensive analysis is conducted to investigate the influences of the location and total number of cracks, material properties, axial load, inertia, and end supports on the natural frequencies and vibration mode shapes of FGM beams. The present work may be useful for the design and control of damaged structures.
Experimental identification and analytical modelling of human walking forces: Literature review
NASA Astrophysics Data System (ADS)
Racic, V.; Pavic, A.; Brownjohn, J. M. W.
2009-09-01
Dynamic forces induced by humans walking change simultaneously in time and space, being random in nature and varying considerably not only between different people but also for a single individual, who cannot repeat two identical steps. Since these important aspects of walking forces have not been adequately researched in the past, the corresponding lack of knowledge has reflected badly on the quality of the mathematical models used in vibration assessments of pedestrian structures such as footbridges, staircases, and floors. To develop better force models that can be used with more confidence in structural design, an adequate experimental and analytical approach must be taken to account for their complexity. This paper is the most comprehensive review published to date, covering 270 references dealing with different experimental and analytical characterizations of human walking loading. The source of dynamic human-induced forces is, in fact, the body motion. To date, human motion has attracted considerable interest in many scientific branches, particularly medical and sports science, bioengineering, robotics, and space flight programs; other fields include biology, physiology, anthropology, computer science (graphics and animation), human factors, and ergonomics. This interest has resulted in technologically advanced tools that help in understanding human movement in greater detail. Therefore, in addition to traditional direct force measurements utilizing a force plate and an instrumented treadmill, this review also introduces methods for indirect measurement of time-varying records of walking forces via a combination of visual motion tracking (imaging) data and known body mass distribution. The review is therefore an interdisciplinary article that bridges the gap between the biomechanics of human gait and civil engineering dynamics.
Finally, the key reason for undertaking this review is that human-structure dynamic interaction and pedestrian synchronization when walking on more or less perceptibly moving structures are increasingly giving serious cause for concern in vibration serviceability design. There is considerable uncertainty about how excessive structural vibrations modify walking and hence affect pedestrian-induced forces, significantly so in many cases. Modelling this delicate mechanism is one of the challenges the international civil structural engineering community faces today, and this review thus provides a step toward a better understanding of the problem.
A License to Lead? A New Leadership Agenda for America's Schools.
ERIC Educational Resources Information Center
Hess, Frederick M.
While considerable attention is paid to training and development for teachers, not enough is paid to the training of educational leaders, this paper contends. The aim of this paper is to provide an analytical groundwork and comprehensive direction for reform of educational leadership training to help policymakers design specific solutions to…
Research in Special Education: Designs, Methods, and Applications. Second Edition
ERIC Educational Resources Information Center
Rumrill, Phillip D., Jr.; Cook, Bryan G.; Wiley, Andrew L.
2011-01-01
The goal of this second edition is to provide a comprehensive overview of the philosophical, ethical, methodological, and analytical fundamentals of social science and educational research, as well as specify aspects of special education research that distinguish it from scientific inquiry in other fields of education and human services. Foremost…
ERIC Educational Resources Information Center
Hadjioannou, Xenia; Hutchinson, Mary
2014-01-01
Research has extolled the potential of transmediation in expanding learners' analytical and critical insight. However, this approach requires teachers prepared to employ this multimodal way of knowing. This study examines the impact of transmediation course experiences on pre-service teachers' comprehension of and critical engagement with…
What's Going on in This Picture? Visual Thinking Strategies and Adult Learning
ERIC Educational Resources Information Center
Landorf, Hilary
2006-01-01
The Visual Thinking Strategies (VTS) curriculum and teaching method uses art to help students think critically, listen attentively, communicate, and collaborate. VTS has been proven to enhance reading, writing, comprehension, and creative and analytical skills among students of all ages. The origins and procedures of the VTS curriculum are…
ERIC Educational Resources Information Center
Camara, Boubacar
This publication complements the "Education for All" program and is intended to provide a comprehensive and operational indicator for monitoring education. As a synthetic tool, the Educational Progress Indicator (EPI) facilitates the analytical assessment and projection work of educational planners, managers, actors, and policymakers. The EPI…
Addressing Misconceptions in Geometry through Written Error Analyses
ERIC Educational Resources Information Center
Kembitzky, Kimberle A.
2009-01-01
This study examined the improvement of students' comprehension of geometric concepts through analytical writing about their own misconceptions using a reflective tool called an ERNIe (acronym for ERror aNalysIs). The purpose of this study was to determine whether the ERNIe process could be used to correct geometric misconceptions, as well as how…
Many EPA programs, including those under the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), require subsurface characterization and monitoring to detect ground-water contamination and provide data to deve...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cutler, Dylan; Frank, Stephen; Slovensky, Michelle
Rich, well-organized building performance and energy consumption data enable a host of analytic capabilities for building owners and operators, from basic energy benchmarking to detailed fault detection and system optimization. Unfortunately, data integration for building control systems is challenging and costly in any setting. Large portfolios of buildings--campuses, cities, and corporate portfolios--experience these integration challenges most acutely. These large portfolios often have a wide array of control systems, including multiple vendors and nonstandard communication protocols. They typically have complex information technology (IT) networks and cybersecurity requirements and may integrate distributed energy resources into their infrastructure. Although the challenges are significant, the integration of control system data has the potential to provide proportionally greater value for these organizations through portfolio-scale analytics, comprehensive demand management, and asset performance visibility. As a large research campus, the National Renewable Energy Laboratory (NREL) experiences significant data integration challenges. To meet them, NREL has developed an architecture for effective data collection, integration, and analysis, providing a comprehensive view of data integration based on functional layers. The architecture is being evaluated on the NREL campus through deployment of three pilot implementations.
Li, Dong-tao; Ling, Chang-quan; Zhu, De-zeng
2007-07-01
To establish a quantitative model for evaluating the degree of the TCM basic syndromes often encountered in patients with primary liver cancer (PLC). Medical literature concerning the clinical investigation and TCM syndromes of PLC was collected and analyzed using the expert-composed symposium method, and 100-millimeter scaling was applied in combination with symptom-degree scoring to establish a quantitative criterion for classifying the degree of symptoms and signs in patients with PLC. Two models, i.e. the additive model and the additive-multiplicative model, were established by using the analytic hierarchy process (AHP) as the mathematical tool to estimate the weights of the criteria for evaluating basic syndromes in the various layers by specialists. The two models were then verified in clinical practice and the outcomes compared with fuzzy evaluations by specialists. Verification on 459 times/case of PLC showed that the coincidence rate between the outcomes derived from specialists and those from the additive model was 84.53%, and with those from the additive-multiplicative model 62.75%; the difference between the two was statistically significant (P<0.01). It is concluded that the additive model is the principal model suitable for quantitative evaluation of the degree of TCM basic syndromes in patients with PLC.
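The additive model described in this abstract reduces, in essence, to a weighted sum of symptom degree scores under AHP-derived weights. A minimal sketch, with hypothetical symptom names, scores, and weights (none taken from the study):

```python
# Hedged sketch of an additive syndrome-scoring model: symptom degrees
# (0-100 visual-analogue scale) are combined by a weighted sum using
# AHP-derived weights. Symptoms and weights below are illustrative.

def additive_score(scores, weights):
    """Weighted-sum degree for one syndrome; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[s] * scores[s] for s in weights)

# Hypothetical symptoms for a hypothetical basic syndrome.
weights = {"fatigue": 0.5, "poor_appetite": 0.3, "pale_tongue": 0.2}
scores  = {"fatigue": 80, "poor_appetite": 50, "pale_tongue": 30}

degree = additive_score(scores, weights)
print(degree)  # 61.0
```

The additive-multiplicative variant studied in the paper would combine some criterion layers by products rather than sums; the paper found the purely additive form agreed better with specialist judgment.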
Airloads and Wake Geometry Calculations for an Isolated Tiltrotor Model in a Wind Tunnel
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2003-01-01
The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. For many years such correlation has been performed for helicopter rotors (rotors designed for edgewise flight), but correlation activities for tiltrotors have been limited, in part by the absence of appropriate measured data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) now provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will present calculations of airloads, wake geometry, and performance, including correlation with TRAM DNW measurements. The calculations were obtained using CAMRAD II, which is a modern rotorcraft comprehensive analysis, with advanced models intended for application to tiltrotor aircraft as well as helicopters. Comprehensive analyses have received extensive correlation with performance and loads measurements on helicopter rotors. The proposed paper is part of an initial effort to perform an equally extensive correlation with tiltrotor data. The correlation will establish the level of predictive capability achievable with current technology; identify the limitations of the current aerodynamic, wake, and structural models of tiltrotors; and lead to recommendations for research to extend tiltrotor aeromechanics analysis capability.
The purpose of the Tilt Rotor Aeroacoustic Model (TRAM) experimental project is to provide data necessary to validate tiltrotor performance and aeroacoustic prediction methodologies and to investigate and demonstrate advanced civil tiltrotor technologies. The TRAM project is a key part of the NASA Short Haul Civil Tiltrotor (SHCT) project. The SHCT project is an element of the Aviation Systems Capacity Initiative within NASA. In April-May 1998 the TRAM was tested in the isolated rotor configuration at the Large Low-speed Facility of the German-Dutch Wind Tunnels (DNW). A preparatory test was conducted in December 1997. These tests were the first comprehensive aeroacoustic test for a tiltrotor, including not only noise and performance data, but airload and wake measurements as well. The TRAM can also be tested in a full-span configuration, incorporating both rotors and a fuselage model. The wind tunnel installation of the TRAM isolated rotor is shown. The rotor tested in the DNW was a 1/4-scale (9.5 ft diameter) model of the right-hand V-22 proprotor. The rotor and nacelle assembly was attached to an acoustically-treated, isolated rotor test stand through a mechanical pivot (the nacelle conversion axis). The TRAM was analyzed using the rotorcraft comprehensive analysis CAMRAD II. CAMRAD II is an aeromechanical analysis of helicopters and rotorcraft that incorporates a combination of advanced technologies, including multibody dynamics, nonlinear finite elements, and rotorcraft aerodynamics. The trim task finds the equilibrium solution (constant or periodic) for a steady state operating condition, in this case a rotor operating in a wind tunnel. For wind tunnel operation, the thrust and flapping are trimmed to target values. The aerodynamic model includes a wake analysis to calculate the rotor nonuniform induced velocities, using a free wake geometry.
The paper will present the results of CAMRAD II calculations compared to the TRAM DNW measurements for hover performance, helicopter mode performance, and helicopter mode airloads. An example of the hover performance results, comparing both measurements and calculations for the JVX (large scale) and TRAM (small scale) rotors, is shown. An example of the helicopter mode performance, showing the influence of the aerodynamic model (particularly the stall delay model) on the calculated power, induced power, and profile power, is also shown. An example of the helicopter mode airloads, showing the influence of various wake and aerodynamic models on the calculations, is shown. Good correlation with measured airloads is obtained using the multiple-trailer wake model. The paper will present additional results, and describe and discuss the aerodynamic behavior in detail.
Collaborative en-route and slot allocation algorithm based on fuzzy comprehensive evaluation
NASA Astrophysics Data System (ADS)
Yang, Shangwen; Guo, Baohua; Xiao, Xuefei; Gao, Haichao
2018-01-01
To allocate en-routes and slots to flights with collaborative decision making, a collaborative en-route and slot allocation algorithm based on fuzzy comprehensive evaluation was proposed. Evaluation indexes include flight delay costs, delay time, and the number of turning points. The analytic hierarchy process is applied to determine the index weights. A remark set is established for the current two flights in the flight schedule that have not yet obtained an en-route and slot. Fuzzy comprehensive evaluation is then performed, and the en-route and slot for the current two flights are determined. This selection and evaluation process continues until all flights have obtained en-routes and slots. MATLAB R2007b was applied to a numerical test based on simulated data for a civil en-route. Test results show that, compared with the traditional first-come-first-served strategy, the algorithm achieves better results, verifying its effectiveness.
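The weighted-average form of fuzzy comprehensive evaluation described in this abstract can be illustrated as follows. The index weights (which the paper derives via AHP) and the membership values are invented for illustration:

```python
# Hedged sketch of fuzzy comprehensive evaluation for choosing between
# two candidate flights. Indexes: delay cost, delay time, turning
# points. All numbers below are illustrative, not from the paper.

def fuzzy_evaluate(weights, membership):
    """Weighted-average composition B = W . R.

    weights: one weight per index; membership: one row per index,
    one column per remark grade (e.g. good/fair/poor)."""
    n_grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(n_grades)]

weights = [0.5, 0.3, 0.2]                 # delay cost, delay time, turns
flight_a = [[0.6, 0.3, 0.1],              # membership in good/fair/poor
            [0.4, 0.4, 0.2],
            [0.7, 0.2, 0.1]]
flight_b = [[0.2, 0.5, 0.3],
            [0.5, 0.3, 0.2],
            [0.3, 0.4, 0.3]]

b_a = fuzzy_evaluate(weights, flight_a)
b_b = fuzzy_evaluate(weights, flight_b)
# Allocate the next en-route/slot to the flight with the larger
# membership in the best grade.
better = "A" if b_a[0] > b_b[0] else "B"
print(better)
```

Repeating this pairwise evaluation over the remaining unallocated flights reproduces the loop structure the abstract describes.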
A genetic algorithm-based job scheduling model for big data analytics.
Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei
Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework; it implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. Existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes excessive energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
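A genetic algorithm for job scheduling of the general kind described above can be sketched as follows. This is not the paper's model: the fixed runtimes stand in for the output of its performance-estimation module, and the encoding and operators are generic textbook choices:

```python
# Hedged sketch of a GA that assigns analytics jobs to machines to
# minimize makespan. Runtimes and GA parameters are illustrative.
import random

random.seed(42)
RUNTIMES = [5, 9, 3, 7, 4, 8, 2, 6]   # assumed per-job runtimes
MACHINES = 3

def makespan(assign):
    """Finish time of the busiest machine under this assignment."""
    loads = [0] * MACHINES
    for job, m in enumerate(assign):
        loads[m] += RUNTIMES[job]
    return max(loads)

def evolve(pop_size=30, generations=60, mut=0.2):
    # Chromosome: machine index per job.
    pop = [[random.randrange(MACHINES) for _ in RUNTIMES]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        survivors = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(len(RUNTIMES))
            child = a[:cut] + b[cut:]            # one-point crossover
            if random.random() < mut:            # point mutation
                child[random.randrange(len(child))] = random.randrange(MACHINES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = evolve()
print(makespan(best))   # lower bound here is 15 (total 44 over 3 machines)
```

A real scheduler would replace `RUNTIMES` with predictions from the estimation module and add constraints such as job precedence and data locality.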
The use of analytical models in human-computer interface design
NASA Technical Reports Server (NTRS)
Gugerty, Leo
1991-01-01
Some of the many analytical models in human-computer interface design that are currently being developed are described. The usefulness of analytical models for human-computer interface design is evaluated. Can the use of analytical models be recommended to interface designers? The answer, based on the empirical research summarized here, is: not at this time. There are too many unanswered questions concerning the validity of models and their ability to meet the practical needs of design organizations.
Zhou, Ronggang; Chan, Alan H. S.
2016-01-01
BACKGROUND: In order to compare existing usability data to ideal goals or to that for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. OBJECTIVE: This paper presents a universal method of usability evaluation by combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. METHODS: With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation technique to characterize fuzzy human judgments. Then, with the use of AHP, the weights of the usability components were elicited from these experts. RESULTS AND CONCLUSIONS: Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process. PMID:28035943
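The AHP weight-elicitation step mentioned above can be sketched with the standard geometric-mean approximation of the principal eigenvector. The pairwise judgments below are illustrative, not the expert panel's:

```python
# Hedged sketch of AHP weight elicitation: a pairwise-comparison matrix
# over the three usability components is reduced to a weight vector by
# the geometric-mean (approximate eigenvector) method.

def ahp_weights(matrix):
    """Normalized geometric means of the rows of a pairwise matrix."""
    n = len(matrix)
    gms = []
    for row in matrix:
        p = 1.0
        for x in row:
            p *= x
        gms.append(p ** (1.0 / n))
    total = sum(gms)
    return [g / total for g in gms]

# Illustrative Saaty-scale judgments: effectiveness vs efficiency vs
# satisfaction (A[i][j] = how much more important i is than j).
A = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(A)
print([round(x, 3) for x in w])
```

In the paper's framework these weights would then be composed with the fuzzy membership appraisals to yield the integrated usability index.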
Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.
2017-01-01
Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972
NASA Astrophysics Data System (ADS)
Tamilarasan, Ilavarasan; Saminathan, Brindha; Murugappan, Meenakshi
2016-04-01
The past decade has seen phenomenal usage of orthogonal frequency division multiplexing (OFDM) in both the wired and wireless communication domains, and it has also been proposed in the literature as a future-proof technique for implementing flexible resource allocation in cognitive optical networks. Fiber impairment assessment and adaptive compensation become critical in such implementations. A comprehensive analytical model for impairments in OFDM-based fiber links is developed. The proposed model includes the combined impact of laser phase fluctuations, fiber dispersion, self-phase modulation, cross-phase modulation, four-wave mixing (FWM), the nonlinear phase noise due to the interaction of amplified spontaneous emission with fiber nonlinearities, and the photodetector noises. The bit error rate expression for the proposed model is derived based on error vector magnitude estimation. The performance analysis of the proposed model is presented and compared for dispersion-compensated and uncompensated backbone/backhaul links. The results suggest that OFDM would perform better for uncompensated links than for compensated links, due to the negligible FWM effects, and that there is a need for flexible compensation. The proposed model can be employed in cognitive optical networks for accurate assessment of fiber-related impairments.
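As a rough illustration of relating error vector magnitude to bit error rate (the paper derives its own, more complete expression, which this sketch does not reproduce): for QPSK under Gaussian-like distortion, SNR is approximately 1/EVM_rms², giving BER ≈ Q(1/EVM_rms):

```python
# Hedged illustration of an EVM-based BER estimate for QPSK. Assumes
# the residual distortion is Gaussian, so SNR ~ 1/EVM_rms**2; this is
# a textbook approximation, not the paper's derived expression.
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def qpsk_ber_from_evm(evm_rms):
    return q_func(1.0 / evm_rms)

for evm in (0.1, 0.2, 0.3):
    print(f"EVM {evm:.0%}: BER ~ {qpsk_ber_from_evm(evm):.2e}")
```

The usefulness of such closed forms is exactly what motivates EVM-based performance monitoring in links where direct BER counting is too slow.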
Numerical model for the thermal behavior of thermocline storage tanks
NASA Astrophysics Data System (ADS)
Ehtiwesh, Ismael A. S.; Sousa, Antonio C. M.
2018-03-01
Energy storage is a critical factor in the advancement of solar thermal power systems for the sustained delivery of electricity. In addition, the incorporation of thermal energy storage into the operation of concentrated solar power systems (CSPs) offers the potential of delivering electricity without fossil-fuel backup even during peak demand, independent of weather conditions and daylight. Despite this potential, some areas of the design and performance of thermocline systems still require further attention for future incorporation in commercial CSPs, particularly their operation and control. Therefore, the present study aims to develop a simple but efficient numerical model allowing comprehensive analysis of thermocline storage systems, with the goal of better understanding their dynamic temperature response. The validation results, despite the simplifying assumptions of the numerical model, agree well with the experiments for the time evolution of the thermocline region. Three different cases are considered to test the versatility of the numerical model; for the particular case of a storage tank with a top round impingement inlet, a simple analytical model was developed to account for the increased turbulence level in the mixing region. The numerical predictions for the three cases are in generally good agreement with the experimental results.
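A minimal one-dimensional sketch of the kind of thermocline model described above: the fluid temperature advects downward from a hot inlet while diffusing, solved with upwind advection and explicit time stepping. All parameters are illustrative, not from the paper:

```python
# Hedged 1D sketch of charging a thermocline tank: hot fluid enters at
# the top, and the thermal front advects downward while diffusing.
# Upwind advection, explicit Euler time stepping; illustrative values.

N, L = 50, 1.0           # grid cells, tank height (m)
dx = L / N
u = 1e-3                 # interstitial velocity (m/s)
alpha = 1e-6             # effective thermal diffusivity (m^2/s)
dt = 0.2                 # time step (s); CFL number u*dt/dx = 0.01
T_hot, T_cold = 80.0, 20.0

T = [T_cold] * N         # tank starts uniformly cold
for _ in range(2000):    # simulate 400 s of charging
    Tn = T[:]
    for i in range(1, N - 1):
        adv = -u * (T[i] - T[i - 1]) / dx
        dif = alpha * (T[i + 1] - 2 * T[i] + T[i - 1]) / dx ** 2
        Tn[i] = T[i] + dt * (adv + dif)
    Tn[0] = T_hot        # hot inlet at the top
    Tn[-1] = Tn[-2]      # zero-gradient outlet at the bottom
    T = Tn

# A smooth thermocline now separates the hot top from the cold bottom.
print(round(T[5], 1), round(T[-5], 1))
```

A model of this sort captures the time evolution of the thermocline region; resolving inlet impingement and mixing, as in the paper's round-inlet case, requires the additional analytical treatment the abstract mentions.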
Developing a comprehensive time series of GDP per capita for 210 countries from 1950 to 2015
2012-01-01
Background Income has been extensively studied and utilized as a determinant of health. There are several sources of income expressed as gross domestic product (GDP) per capita, but there are no time series that are complete for the years between 1950 and 2015 for the 210 countries for which data exist. It is in the interest of population health research to establish a global time series that is complete from 1950 to 2015. Methods We collected GDP per capita estimates expressed in either constant US dollar terms or international dollar terms (corrected for purchasing power parity) from seven sources. We applied several stages of models, including ordinary least-squares regressions and mixed effects models, to complete each of the seven source series from 1950 to 2015. The three US dollar and four international dollar series were each averaged to produce two new GDP per capita series. Results and discussion Nine complete series from 1950 to 2015 for 210 countries are available for use. These series can serve various analytical purposes and can illustrate myriad economic trends and features. The derivation of the two new series allows for researchers to avoid any series-specific biases that may exist. The modeling approach used is flexible and will allow for yearly updating as new estimates are produced by the source series. Conclusion GDP per capita is a necessary tool in population health research, and our development and implementation of a new method has allowed for the most comprehensive known time series to date. PMID:22846561
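The first modeling stage mentioned in this abstract (ordinary least-squares completion of an incomplete series) can be sketched as follows. The figures are invented, and the study additionally applies mixed-effects models across the seven source series:

```python
# Hedged sketch of OLS completion of a GDP per capita series: fit
# log(GDP) on year, then extrapolate to a missing year. Values are
# illustrative, not from the source data.
import math

years = [1990, 1995, 2000, 2005, 2010]
gdp   = [1200, 1500, 1950, 2500, 3200]   # observed GDP per capita
y = [math.log(g) for g in gdp]           # growth is ~linear in logs

n = len(years)
mx, my = sum(years) / n, sum(y) / n
slope = (sum((a - mx) * (b - my) for a, b in zip(years, y))
         / sum((a - mx) ** 2 for a in years))
intercept = my - slope * mx

def predict(year):
    """Extrapolated GDP per capita on the original (dollar) scale."""
    return math.exp(intercept + slope * year)

# Fill the missing 2015 value by extrapolation on the log scale.
print(round(predict(2015)))
```

Working on the log scale keeps predictions positive and makes constant proportional growth appear as a straight line, which is why income series are typically modeled this way.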
NASA Technical Reports Server (NTRS)
Oglebay, J. C.
1977-01-01
A thermal analytic model for a 30-cm engineering model mercury-ion thruster was developed and calibrated using the experimental results of tests of a pre-engineering model 30-cm thruster. A series of tests, performed later, simulated a wide range of thermal environments on an operating 30-cm engineering model thruster, which was instrumented to measure the temperature distribution within it. The modified analytic model is described, and analytic and experimental results are compared for various operating conditions. Based on the comparisons, it is concluded that the analytic model can be used as a preliminary design tool to predict thruster steady-state temperature distributions for stage and mission studies and to define the thermal interface between the thruster and other elements of a spacecraft.
Comprehensive stroke units: a review of comparative evidence and experience.
Chan, Daniel K Y; Cordato, Dennis; O'Rourke, Fintan; Chan, Daniel L; Pollack, Michael; Middleton, Sandy; Levi, Chris
2013-06-01
Stroke unit care offers significant benefits in survival and dependency when compared to care in a general medical ward. Most stroke units are either acute or rehabilitation units, but the comprehensive (combined acute and rehabilitation) model (comprehensive stroke unit) is less common. The aim is to examine different levels of evidence for comprehensive stroke units compared with other organized inpatient stroke care and to share local experience of comprehensive stroke units. A Cochrane Library and Medline (1980 to December 2010) review was conducted of English-language articles comparing stroke units to alternative forms of stroke care delivery, different types of stroke unit models, and differences in processes of care within different stroke unit models. Different levels of comparative evidence for comprehensive stroke units versus other stroke unit models were collected. There are no randomized controlled trials directly comparing comprehensive stroke units to other stroke unit models (either acute or rehabilitation). Comprehensive stroke units are associated with reduced length of stay and the greatest reduction in combined death and dependency in a meta-analysis when compared to other stroke unit models. Comprehensive stroke units also show better length of stay and functional outcomes when compared to acute or rehabilitation stroke unit models in a cross-sectional study, and better length of stay in a before-and-after comparative study. The components of stroke unit care that improve outcome are multifactorial and most probably include early mobilization. A comprehensive stroke unit model has been successfully implemented in metropolitan and rural hospital settings. Comprehensive stroke units are associated with reductions in length of stay and in combined death and dependency, and with improved functional outcomes, compared to other stroke unit models. The comprehensive stroke unit model is worth considering as the preferred model of stroke unit care in the planning and delivery of metropolitan and rural stroke services.
© 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
Kim, Unyong; Oh, Myung Jin; Seo, Youngsuk; Jeon, Yinae; Eom, Joon-Ho; An, Hyun Joo
2017-09-01
Glycosylation of recombinant human erythropoietins (rhEPOs) is significantly associated with the drug's quality and potency. Thus, comprehensive characterization of glycosylation is vital to assess biotherapeutic quality and establish the equivalency of biosimilar rhEPOs. However, current glycan analysis mainly focuses on N-glycans, due to the absence of analytical tools that can liberate O-glycans with high sensitivity. We developed a selective and sensitive method to profile native O-glycans on rhEPOs. O-glycosylation on rhEPO, including O-acetylation on a sialic acid, was comprehensively characterized. Details such as O-glycan structure and O-acetyl modification site were obtained from tandem MS. This method may be applied to QC and batch analysis not only of rhEPOs but also of other biotherapeutics bearing multiple O-glycosylations.