Sample records for model construction time

  1. An innovative time-cost-quality tradeoff modeling of building construction project based on resource allocation.

    PubMed

    Hu, Wenfa; He, Xinhua

    2014-01-01

Time, quality, and cost are three important but conflicting objectives in a building construction project. Optimizing them is a tough challenge for project managers since they are measured in different units. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives simultaneously. The model is derived from the project breakdown structure method, in which task resources in a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is then generated based on correlations between construction activities. A genetic algorithm is applied within the model to solve the resulting nonlinear time-cost-quality problems. The construction of a three-storey house serves as an example to illustrate the implementation of the model, demonstrate its advantages in trading off construction time, cost, and quality, and help make winning decisions in construction practice. The computational time-cost-quality curves from the case study support traditional cost-time assumptions and demonstrate the sophistication of the trade-off model.
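The abstract does not give the genetic algorithm's details; a minimal sketch of how a GA can search discrete resource options per activity against a scalarized time-cost-quality objective (the activity data, weights, and genetic operators below are all illustrative assumptions, not the paper's formulation):

```python
import random

# Hypothetical options per activity: (time_days, cost, quality 0..1).
# Faster or cheaper options tend to lower quality; values are illustrative.
ACTIVITIES = [
    [(10, 5000, 0.90), (7, 6500, 0.80), (5, 9000, 0.60)],
    [(8, 4000, 0.95), (6, 5000, 0.85), (4, 7000, 0.70)],
    [(12, 8000, 0.90), (9, 9500, 0.75), (6, 13000, 0.65)],
]

def fitness(chromosome, w_time=1.0, w_cost=0.001, w_quality=50.0):
    """Weighted scalarization of the three objectives (lower is better)."""
    t = sum(ACTIVITIES[i][g][0] for i, g in enumerate(chromosome))
    c = sum(ACTIVITIES[i][g][1] for i, g in enumerate(chromosome))
    q = min(ACTIVITIES[i][g][2] for i, g in enumerate(chromosome))
    return w_time * t + w_cost * c + w_quality * (1.0 - q)

def evolve(pop_size=30, generations=100, seed=0):
    rng = random.Random(seed)
    n = len(ACTIVITIES)
    pop = [[rng.randrange(len(ACTIVITIES[i])) for i in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)              # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                 # point mutation
                i = rng.randrange(n)
                child[i] = rng.randrange(len(ACTIVITIES[i]))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

Each chromosome is one resource option per activity; sweeping the weights would trace out the kind of time-cost-quality curves the case study reports.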

  2. An Innovative Time-Cost-Quality Tradeoff Modeling of Building Construction Project Based on Resource Allocation

    PubMed Central

    2014-01-01

Time, quality, and cost are three important but conflicting objectives in a building construction project. Optimizing them is a tough challenge for project managers since they are measured in different units. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives simultaneously. The model is derived from the project breakdown structure method, in which task resources in a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is then generated based on correlations between construction activities. A genetic algorithm is applied within the model to solve the resulting nonlinear time-cost-quality problems. The construction of a three-storey house serves as an example to illustrate the implementation of the model, demonstrate its advantages in trading off construction time, cost, and quality, and help make winning decisions in construction practice. The computational time-cost-quality curves from the case study support traditional cost-time assumptions and demonstrate the sophistication of the trade-off model. PMID:24672351

  3. Learning Science by Constructing Models: Can Dragoon Increase Learning without Increasing the Time Required?

    ERIC Educational Resources Information Center

    VanLehn, Kurt; Chung, Greg; Grover, Sachin; Madni, Ayesha; Wetzel, Jon

    2016-01-01

    A common hypothesis is that students will more deeply understand dynamic systems and other complex phenomena if they construct computational models of them. Attempts to demonstrate the advantages of model construction have been stymied by the long time required for students to acquire skill in model construction. In order to make model…

  4. Transtheoretical Model Constructs for Physical Activity Behavior are Invariant across Time among Ethnically Diverse Adults in Hawaii

    PubMed Central

    Nigg, Claudio R; Motl, Robert W; Horwath, Caroline; Dishman, Rod K

    2012-01-01

Objectives: Physical activity (PA) research applying the Transtheoretical Model (TTM) to examine group differences and/or change over time requires preliminary evidence of factorial validity and invariance. The current study examined the factorial validity and longitudinal invariance of TTM constructs recently revised for PA. Method: Participants from an ethnically diverse sample in Hawaii (N=700) completed questionnaires capturing each TTM construct. Results: Factorial validity was confirmed for each construct using confirmatory factor analysis with full-information maximum likelihood. Longitudinal invariance was evidenced across a shorter (3-month) and longer (6-month) time period via nested model comparisons. Conclusions: The questionnaires for each validated TTM construct are provided, and can now be generalized across similar subgroups and time points. Further validation of the provided measures is suggested in additional populations and across extended time points. PMID:22778669

  5. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    USGS Publications Warehouse

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
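The abstract reports prediction skill of 0.58 and 0.67 but does not define the metric; one common choice for wave-model assessment, a Nash-Sutcliffe-style efficiency, can be sketched as follows (this specific definition and the sample values are assumptions, not taken from the paper):

```python
def skill(predicted, observed):
    """Nash-Sutcliffe-style skill: 1 - MSE(pred, obs) / variance(obs).
    1.0 is a perfect prediction; 0.0 is no better than the observed mean."""
    n = len(observed)
    mean_obs = sum(observed) / n
    mse = sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n
    var = sum((o - mean_obs) ** 2 for o in observed) / n
    return 1.0 - mse / var

# Illustrative significant wave heights (m) at a buoy, not real data.
obs = [1.2, 1.5, 0.9, 2.0, 1.7]
pred = [1.1, 1.6, 1.0, 1.8, 1.6]
```

A constructed time-series would be scored this way against the 12 m and 30 m observations.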

  6. Construction schedule simulation of a diversion tunnel based on the optimized ventilation time.

    PubMed

    Wang, Xiaoling; Liu, Xuepeng; Sun, Yuefeng; An, Juan; Zhang, Jing; Chen, Hongchao

    2009-06-15

In former studies, the methods used to estimate ventilation time in construction schedule simulation were all empirical. In real construction schedules, however, many factors affect the ventilation time. Therefore, this paper proposes 3D unsteady quasi-single-phase models to optimize the ventilation time for different tunneling lengths. The effect of buoyancy is considered in the momentum equation of the CO transport model, while the effects of inter-phase drag, lift force, and virtual mass force are taken into account in the momentum source of the dust transport model. The predictions of the present model for airflow in a diversion tunnel are confirmed by the experimental values reported by Nakayama [Nakayama, In-situ measurement and simulation by CFD of methane gas distribution at heading faces, Shigen-to-Sozai 114 (11) (1998) 769-775]. The construction ventilation of the diversion tunnel of the XinTangfang power station in China is used as a case study. The distributions of airflow, CO, and dust in the diversion tunnel are analyzed. A theoretical method for GIS-based dynamic visual simulation of the construction processes of underground structure groups is presented that combines cyclic operation network simulation, system simulation, network plan optimization, and GIS-based 3D visualization of construction processes. Based on the optimized ventilation time, the construction schedule of the diversion tunnel is simulated using this method.

  7. Escaping the snare of chronological growth and launching a free curve alternative: general deviance as latent growth model.

    PubMed

    Wood, Phillip Karl; Jackson, Kristina M

    2013-08-01

    Researchers studying longitudinal relationships among multiple problem behaviors sometimes characterize autoregressive relationships across constructs as indicating "protective" or "launch" factors or as "developmental snares." These terms are used to indicate that initial or intermediary states of one problem behavior subsequently inhibit or promote some other problem behavior. Such models are contrasted with models of "general deviance" over time in which all problem behaviors are viewed as indicators of a common linear trajectory. When fit of the "general deviance" model is poor and fit of one or more autoregressive models is good, this is taken as support for the inhibitory or enhancing effect of one construct on another. In this paper, we argue that researchers consider competing models of growth before comparing deviance and time-bound models. Specifically, we propose use of the free curve slope intercept (FCSI) growth model (Meredith & Tisak, 1990) as a general model to typify change in a construct over time. The FCSI model includes, as nested special cases, several statistical models often used for prospective data, such as linear slope intercept models, repeated measures multivariate analysis of variance, various one-factor models, and hierarchical linear models. When considering models involving multiple constructs, we argue the construct of "general deviance" can be expressed as a single-trait multimethod model, permitting a characterization of the deviance construct over time without requiring restrictive assumptions about the form of growth over time. As an example, prospective assessments of problem behaviors from the Dunedin Multidisciplinary Health and Development Study (Silva & Stanton, 1996) are considered and contrasted with earlier analyses of Hussong, Curran, Moffitt, and Caspi (2008), which supported launch and snare hypotheses. 
For antisocial behavior, the FCSI model fit better than other models, including the linear chronometric growth curve model used by Hussong et al. For models including multiple constructs, a general deviance model involving a single trait and multimethod factors (or a corresponding hierarchical factor model) fit the data better than either the "snares" alternatives or the general deviance model previously considered by Hussong et al. Taken together, the analyses support the view that linkages and turning points cannot be contrasted with general deviance models absent additional experimental intervention or control.

  8. Escaping the snare of chronological growth and launching a free curve alternative: General deviance as latent growth model

    PubMed Central

    WOOD, PHILLIP KARL; JACKSON, KRISTINA M.

    2014-01-01

    Researchers studying longitudinal relationships among multiple problem behaviors sometimes characterize autoregressive relationships across constructs as indicating “protective” or “launch” factors or as “developmental snares.” These terms are used to indicate that initial or intermediary states of one problem behavior subsequently inhibit or promote some other problem behavior. Such models are contrasted with models of “general deviance” over time in which all problem behaviors are viewed as indicators of a common linear trajectory. When fit of the “general deviance” model is poor and fit of one or more autoregressive models is good, this is taken as support for the inhibitory or enhancing effect of one construct on another. In this paper, we argue that researchers consider competing models of growth before comparing deviance and time-bound models. Specifically, we propose use of the free curve slope intercept (FCSI) growth model (Meredith & Tisak, 1990) as a general model to typify change in a construct over time. The FCSI model includes, as nested special cases, several statistical models often used for prospective data, such as linear slope intercept models, repeated measures multivariate analysis of variance, various one-factor models, and hierarchical linear models. When considering models involving multiple constructs, we argue the construct of “general deviance” can be expressed as a single-trait multimethod model, permitting a characterization of the deviance construct over time without requiring restrictive assumptions about the form of growth over time. As an example, prospective assessments of problem behaviors from the Dunedin Multidisciplinary Health and Development Study (Silva & Stanton, 1996) are considered and contrasted with earlier analyses of Hussong, Curran, Moffitt, and Caspi (2008), which supported launch and snare hypotheses. 
For antisocial behavior, the FCSI model fit better than other models, including the linear chronometric growth curve model used by Hussong et al. For models including multiple constructs, a general deviance model involving a single trait and multimethod factors (or a corresponding hierarchical factor model) fit the data better than either the “snares” alternatives or the general deviance model previously considered by Hussong et al. Taken together, the analyses support the view that linkages and turning points cannot be contrasted with general deviance models absent additional experimental intervention or control. PMID:23880389

  9. Design and fabrication of the NASA HL-20 full scale research model

    NASA Technical Reports Server (NTRS)

    Driver, K. Dean; Vess, Robert J.

    1991-01-01

    A full-scale engineering model of the HL-20 Personnel Launch System (PLS) was constructed for systems and human factors evaluation. Construction techniques were developed to enable the vehicle to be constructed with a minimum of time and cost. The design and construction of the vehicle are described.

  10. Infrasound Predictions Using the Weather Research and Forecasting Model: Atmospheric Green's Functions for the Source Physics Experiments 1-6.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poppeliers, Christian; Aur, Katherine Anderson; Preston, Leiph

This report shows the results of constructing predictive atmospheric models for the Source Physics Experiments 1-6. Historic atmospheric data are combined with topography to construct an atmospheric model that corresponds to the predicted (or actual) time of a given SPE event. The models are ultimately used to construct atmospheric Green's functions for subsequent analysis. We present three atmospheric models for each SPE event: an average model based on ten one-hour snapshots of the atmosphere and two extrema models corresponding to the warmest, coolest, windiest, etc., atmospheric snapshots. The atmospheric snapshots consist of wind, temperature, and pressure profiles of the atmosphere for a one-hour time window centered at the time of the predicted SPE event, as well as nine additional snapshots for each of the nine preceding years, centered at the time and day of the SPE event.

  11. Exactly soluble local bosonic cocycle models, statistical transmutation, and simplest time-reversal symmetric topological orders in 3+1 dimensions

    NASA Astrophysics Data System (ADS)

    Wen, Xiao-Gang

    2017-05-01

We propose a generic construction of exactly soluble local bosonic models that realize various topological orders with gappable boundaries. In particular, we construct an exactly soluble bosonic model that realizes a (3+1)-dimensional [(3+1)D] Z2-gauge theory with an emergent fermionic Kramers doublet. We show that the emergence of such a fermion will cause the nucleation of certain topological excitations in space-time without pin+ structure. The exactly soluble model also leads to a statistical transmutation in (3+1)D. In addition, we construct exactly soluble bosonic models that realize 2 types of time-reversal symmetry-enriched Z2 topological orders in 2+1 dimensions, and 20 types of the simplest time-reversal symmetry-enriched topological (SET) orders, which have only one nontrivial pointlike and stringlike topological excitation. Many physical properties of those topological states are calculated using the exactly soluble models. We find that some time-reversal SET orders have pointlike excitations that carry a Kramers doublet, a fractionalized time-reversal symmetry. We also find that some Z2 SET orders have stringlike excitations that carry anomalous (non-on-site) Z2 symmetry, which can be viewed as a fractionalization of Z2 symmetry on strings. Our construction is based on cochains and cocycles in algebraic topology, which is very versatile. In principle, it can also realize emergent topological field theory beyond the twisted gauge theory.

  12. Inflow forecasting model construction with stochastic time series for coordinated dam operation

    NASA Astrophysics Data System (ADS)

    Kim, T.; Jung, Y.; Kim, H.; Heo, J. H.

    2014-12-01

Dam inflow forecasting is one of the most important tasks in dam operation for effective water resources management and control. In general, stochastic time series models for dam inflow forecasting are applicable only when the data are stationary, because most stochastic processes are based on stationarity. Recent hydrological data, however, no longer satisfy stationarity because of climate change. Therefore, a stochastic time series model that can account for seasonality and trend in the data series, the SARIMAX (Seasonal AutoRegressive Integrated Moving Average with eXogenous variables) model, was constructed in this study. The SARIMAX model can improve on standard stochastic time series models by considering nonstationary components and external variables such as precipitation. For application, models were constructed for four coordinated dams on the Han River in South Korea using monthly time series data. The resulting models for each dam show similar performance, and it would be possible to use them for coordinated dam operation. Acknowledgement: This research was supported by a grant 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-NH-12-57] from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
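Fitting a full SARIMAX model requires a statistics library, but its core idea, removing seasonality and trend by seasonal differencing and then fitting an autoregressive term, can be sketched in plain Python (the synthetic inflow series below is illustrative, not the study's data):

```python
import math

def seasonal_difference(series, period=12):
    """Remove the seasonal cycle (and reduce trend) by differencing
    each value against the value one seasonal period earlier."""
    return [series[t] - series[t - period] for t in range(period, len(series))]

def fit_ar1(series):
    """Least-squares AR(1) coefficient on the mean-centered series:
    x[t] ~ phi * x[t-1]."""
    m = sum(series) / len(series)
    x = [v - m for v in series]
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(v * v for v in x[:-1])
    return num / den

# Synthetic monthly inflow: seasonal cycle + slow trend + fluctuation.
inflow = [100 + 30 * math.sin(2 * math.pi * t / 12) + 0.5 * t + 5 * math.cos(t)
          for t in range(120)]
diffed = seasonal_difference(inflow)
phi = fit_ar1(diffed)
```

The seasonal difference at lag 12 cancels the annual cycle exactly and turns the linear trend into a constant, leaving a nearly stationary residual for the AR fit; exogenous regressors such as precipitation would enter as additional terms in the regression.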

  13. Real-time emissions from construction equipment compared with model predictions.

    PubMed

    Heidari, Bardia; Marr, Linsey C

    2015-02-01

The construction industry is a large source of greenhouse gases and other air pollutants. Measuring and monitoring real-time emissions will provide practitioners with information to assess environmental impacts and improve the sustainability of construction. We employed a portable emission measurement system (PEMS) for real-time measurement of carbon dioxide (CO2), nitrogen oxides (NOx), hydrocarbon, and carbon monoxide (CO) emissions from construction equipment to derive emission rates (mass of pollutant emitted per unit time) and emission factors (mass of pollutant emitted per unit volume of fuel consumed) under real-world operating conditions. Measurements were compared with emissions predicted by methodologies used in three models: NONROAD2008, OFFROAD2011, and a modal statistical model. Measured emission rates agreed with model predictions for some pieces of equipment but were up to 100 times lower for others. Much of the difference was driven by lower fuel consumption rates than predicted. Emission factors during idling and hauling were significantly different from each other and from those of other moving activities, such as digging and dumping. It appears that operating conditions introduce considerable variability in emission factors. Results of this research will aid researchers and practitioners in improving current emission estimation techniques, frameworks, and databases.
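The two quantities defined above are linked by fuel consumption, which is why over-predicted fuel use inflates predicted emission rates; a minimal sketch with illustrative numbers (not measurements from the paper):

```python
def emission_rate(fuel_rate_l_per_hr, emission_factor_g_per_l):
    """Emission rate (g/h) = fuel consumption (L/h) x emission factor (g/L)."""
    return fuel_rate_l_per_hr * emission_factor_g_per_l

def emission_factor(pollutant_g_per_hr, fuel_rate_l_per_hr):
    """Emission factor (g/L) from a measured mass rate and fuel use."""
    return pollutant_g_per_hr / fuel_rate_l_per_hr

# Illustrative: an excavator burning 20 L/h with a measured NOx
# output of 300 g/h (hypothetical values).
nox_factor = emission_factor(300.0, 20.0)
```

If a model assumes 40 L/h for this machine but applies the same factor, its predicted rate doubles, which mirrors the fuel-driven discrepancies the study reports.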

  14. Constructing service-oriented architecture adoption maturity matrix using Kano model

    NASA Astrophysics Data System (ADS)

    Hamzah, Mohd Hamdi Irwan; Baharom, Fauziah; Mohd, Haslina

    2017-10-01

Organizations commonly adopt Service-Oriented Architecture (SOA) because it provides flexible reconfiguration and can reduce development time and cost. To guide SOA adoption, industry and academia have previously constructed SOA maturity models. However, there is limited work on how to construct the matrix in these SOA maturity models. This study therefore provides a method for constructing the matrix in an SOA maturity model. It adapts the Kano Model to construct a cross-evaluation matrix focused on the IT and business benefits of SOA adoption. The study found that the Kano Model provides a suitable and appropriate method for constructing the cross-evaluation matrix in an SOA maturity model. The Kano Model can also be used to plot, organize, and better represent the evaluation dimensions for assessing SOA adoption.

  15. Design/build vs traditional construction user delay modeling : an evaluation of the cost effectiveness of innovative construction methods for new construction. Part 2 : VISUM Online for Salt Lake, Davis, and Utah Counties

    DOT National Transportation Integrated Search

    2007-05-01

    VISUM Online is a traffic management system for processing online traffic data. The system implements both a road network model and a traffic demand model. VISUM Online uses all available real-time and historic data to calculate current and forecaste...

  16. A univariate model of river water nitrate time series

    NASA Astrophysics Data System (ADS)

    Worrall, F.; Burt, T. P.

    1999-01-01

    Four time series were taken from three catchments in the North and South of England. The sites chosen included two in predominantly agricultural catchments, one at the tidal limit and one downstream of a sewage treatment works. A time series model was constructed for each of these series as a means of decomposing the elements controlling river water nitrate concentrations and to assess whether this approach could provide a simple management tool for protecting water abstractions. Autoregressive (AR) modelling of the detrended and deseasoned time series showed a "memory effect". This memory effect expressed itself as an increase in the winter-summer difference in nitrate levels that was dependent upon the nitrate concentration 12 or 6 months previously. Autoregressive moving average (ARMA) modelling showed that one of the series contained seasonal, non-stationary elements that appeared as an increasing trend in the winter-summer difference. The ARMA model was used to predict nitrate levels and predictions were tested against data held back from the model construction process - predictions gave average percentage errors of less than 10%. Empirical modelling can therefore provide a simple, efficient method for constructing management models for downstream water abstraction.
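The predictions above were scored by average percentage error against data held back from model construction; that validation metric can be sketched as follows (the nitrate values are illustrative, not the catchment data):

```python
def average_percent_error(predicted, observed):
    """Mean absolute percentage error of predictions against
    observations held back from model construction."""
    errors = [abs(p - o) / abs(o) for p, o in zip(predicted, observed)]
    return 100.0 * sum(errors) / len(errors)

# Illustrative nitrate concentrations (mg/l), not from the paper.
held_back = [5.2, 6.1, 4.8, 5.5]
predicted = [5.0, 6.4, 5.1, 5.3]
ape = average_percent_error(predicted, held_back)
```

A result under 10%, as reported, would indicate the ARMA model is accurate enough for abstraction-management use.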

  17. Constructing the reduced dynamical models of interannual climate variability from spatial-distributed time series

    NASA Astrophysics Data System (ADS)

    Mukhin, Dmitry; Gavrilov, Andrey; Loskutov, Evgeny; Feigin, Alexander

    2016-04-01

We suggest a method for empirical forecasting of climate dynamics based on the reconstruction of reduced dynamical models in the form of random dynamical systems [1,2] derived from observational time series. The construction of a proper embedding - the set of variables determining the phase space in which the model works - is without doubt the most important step in such modeling, but this task is non-trivial due to the huge dimension of time series of typical climatic fields. An appropriate expansion of the observational time series is needed, yielding the number of principal components considered as phase variables that are efficient for the construction of a low-dimensional evolution operator. We emphasize two main features the reduced models should have to capture the main dynamical properties of the system: (i) taking into account time-lagged teleconnections in the atmosphere-ocean system and (ii) reflecting the nonlinear nature of these teleconnections. In accordance with these principles, this report presents a methodology that combines a new way of constructing an embedding by spatio-temporal data expansion with nonlinear model construction based on artificial neural networks. The methodology is applied to NCEP/NCAR reanalysis data, including fields of sea level pressure, geopotential height, and wind speed covering the Northern Hemisphere. Its efficiency for the interannual forecast of various climate phenomena, including ENSO, PDO, NAO, and strong blocking conditions over the mid-latitudes, is demonstrated. We also investigate the ability of the models to reproduce and predict the evolution of qualitative features of the dynamics, such as spectral peaks, critical transitions, and statistics of extremes. This research was supported by the Government of the Russian Federation (Agreement No. 14.Z50.31.0033 with the Institute of Applied Physics RAS). [1] Y. I. Molkov, E. M. Loskutov, D. N. Mukhin, and A. M. Feigin, "Random dynamical models from time series," Phys. Rev. E, vol. 85, no. 3, p. 036216, 2012. [2] D. Mukhin, D. Kondrashov, E. Loskutov, A. Gavrilov, A. Feigin, and M. Ghil, "Predicting Critical Transitions in ENSO models. Part II: Spatially Dependent Models," J. Clim., vol. 28, no. 5, pp. 1962-1976, 2015.
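The embedding step described in the abstract relies on expanding a high-dimensional field into a few principal components; a minimal pure-Python sketch of extracting the leading component by power iteration (the toy data and dimensions are illustrative, not the authors' spatio-temporal expansion):

```python
def first_principal_component(data, iters=200):
    """Power iteration for the leading principal component of a small
    data matrix (rows = time samples, columns = grid points)."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix (d x d)
    cov = [[sum(x[t][i] * x[t][j] for t in range(n)) / n for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]   # renormalize each iteration
    return v

# Toy "field": 4 time samples of a 2-point grid (illustrative only).
data = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0], [4.0, 8.0]]
pc1 = first_principal_component(data)
```

Projecting each time sample onto such components yields the low-dimensional phase variables on which a nonlinear (e.g., neural-network) evolution operator can then be trained.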

  18. The reservoir model: a differential equation model of psychological regulation.

    PubMed

    Deboeck, Pascal R; Bergeman, C S

    2013-06-01

    Differential equation models can be used to describe the relationships between the current state of a system of constructs (e.g., stress) and how those constructs are changing (e.g., based on variable-like experiences). The following article describes a differential equation model based on the concept of a reservoir. With a physical reservoir, such as one for water, the level of the liquid in the reservoir at any time depends on the contributions to the reservoir (inputs) and the amount of liquid removed from the reservoir (outputs). This reservoir model might be useful for constructs such as stress, where events might "add up" over time (e.g., life stressors, inputs), but individuals simultaneously take action to "blow off steam" (e.g., engage coping resources, outputs). The reservoir model can provide descriptive statistics of the inputs that contribute to the "height" (level) of a construct and a parameter that describes a person's ability to dissipate the construct. After discussing the model, we describe a method of fitting the model as a structural equation model using latent differential equation modeling and latent distribution modeling. A simulation study is presented to examine recovery of the input distribution and output parameter. The model is then applied to the daily self-reports of negative affect and stress from a sample of older adults from the Notre Dame Longitudinal Study on Aging. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
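On a plausible reading of the model described above, the reservoir level follows dx/dt = input(t) - k*x, with k the dissipation (output) parameter; a minimal Euler-integration sketch (the equation form, parameter values, and scenario are assumptions for illustration):

```python
def simulate_reservoir(inputs, k, x0=0.0, dt=1.0):
    """Euler integration of dx/dt = input(t) - k * x.
    k is the dissipation rate; inputs add to the reservoir level."""
    x = x0
    levels = []
    for u in inputs:
        x = x + dt * (u - k * x)
        levels.append(x)
    return levels

# Illustrative: a one-time stressor at t=0, then recovery at rate k=0.5.
levels = simulate_reservoir([10.0] + [0.0] * 9, k=0.5)
```

The trace shows the qualitative behavior the article describes: an input "adds up," and a larger k (better coping resources) drains the reservoir back down faster.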

  19. The Reservoir Model: A Differential Equation Model of Psychological Regulation

    PubMed Central

    Deboeck, Pascal R.; Bergeman, C. S.

    2017-01-01

    Differential equation models can be used to describe the relationships between the current state of a system of constructs (e.g., stress) and how those constructs are changing (e.g., based on variable-like experiences). The following article describes a differential equation model based on the concept of a reservoir. With a physical reservoir, such as one for water, the level of the liquid in the reservoir at any time depends on the contributions to the reservoir (inputs) and the amount of liquid removed from the reservoir (outputs). This reservoir model might be useful for constructs such as stress, where events might “add up” over time (e.g., life stressors, inputs), but individuals simultaneously take action to “blow off steam” (e.g., engage coping resources, outputs). The reservoir model can provide descriptive statistics of the inputs that contribute to the “height” (level) of a construct and a parameter that describes a person's ability to dissipate the construct. After discussing the model, we describe a method of fitting the model as a structural equation model using latent differential equation modeling and latent distribution modeling. A simulation study is presented to examine recovery of the input distribution and output parameter. The model is then applied to the daily self-reports of negative affect and stress from a sample of older adults from the Notre Dame Longitudinal Study on Aging. PMID:23527605

  20. Models of resource planning during formation of calendar construction plans for erection of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Pocebneva, Irina; Belousov, Vadim; Fateeva, Irina

    2018-03-01

This article provides a methodical description of resource-time analysis for a wide range of requirements imposed on resource consumption processes in scheduling tasks during the construction of high-rise buildings and facilities. The core of the proposed approach is the set of resource models being determined. Generalized network models are the elements of those models, and their number can be too large for every element to be analyzed individually. Therefore, the problem is to approximate the original resource model by simpler time models whose number is not too large.

  1. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  2. A Sketching Interface for Freeform 3D Modeling

    NASA Astrophysics Data System (ADS)

    Igarashi, Takeo

    This chapter introduces Teddy, a sketch-based modeling system to quickly and easily design freeform models such as stuffed animals and other rotund objects. The user draws several 2D freeform strokes interactively on the screen and the system automatically constructs plausible 3D polygonal surfaces. Our system supports several modeling operations, including the operation to construct a 3D polygonal surface from a 2D silhouette drawn by the user: it inflates the region surrounded by the silhouette making a wide area fat, and a narrow area thin. Teddy, our prototype system, is implemented as a Java program, and the mesh construction is done in real-time on a standard PC. Our informal user study showed that a first-time user masters the operations within 10 minutes, and can construct interesting 3D models within minutes. We also report the result of a case study where a high school teacher taught various 3D concepts in geography using the system.

  3. Optimizing and controlling earthmoving operations using spatial technologies

    NASA Astrophysics Data System (ADS)

    Alshibani, Adel

    This thesis presents a model designed for optimizing, tracking, and controlling earthmoving operations. The proposed model utilizes Genetic Algorithms (GA), Linear Programming (LP), and spatial technologies, including Global Positioning Systems (GPS) and Geographic Information Systems (GIS), to support its management functions. The model assists engineers and contractors in selecting near-optimum crew formations in the planning phase and during construction, using GA and LP supported by the Pathfinder Algorithm developed in a GIS environment. GA is used in conjunction with a set of rules developed to accelerate the optimization process and to avoid generating and evaluating hypothetical and unrealistic crew formations. LP is used to determine the quantities of earth to be moved from different borrow pits and placed at different landfill sites so as to meet project constraints and minimize the cost of the earthmoving operations. GPS is used for onsite data collection and for tracking construction equipment in near real-time, while GIS is employed to automate data acquisition and to analyze the collected spatial data. The model is also capable of reconfiguring crew formations dynamically during the construction phase while site operations are in progress. The optimization of the crew formation considers: (1) construction time, (2) construction direct cost, or (3) construction total cost. The model can also generate crew formations that meet, as closely as possible, specified time and/or cost constraints. In addition, the model supports tracking and reporting of project progress utilizing the earned-value concept and the project ratio method, with modifications that allow for more accurate forecasting of project time and cost at set future dates and at completion. The model is capable of generating graphical and tabular reports.
The developed model has been implemented in prototype software using object-oriented programming and the Microsoft Foundation Classes (MFC), coded in Visual C++ 6.0. Microsoft Access is employed as the database management system, and the software runs in the Microsoft Windows environment. Three example applications were analyzed to validate the developments made and to illustrate the essential features of the model.
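    The LP allocation described above is a transportation problem: ship earth from borrow pits to fill sites at minimum haul cost subject to supply and demand constraints. A minimal sketch with invented quantities and costs, using a greedy cheapest-route heuristic as a stand-in for the exact LP solve the thesis uses:

```python
def allocate(supply, demand, cost):
    """Greedy allocation for a balanced transportation problem: repeatedly
    ship as much as possible along the cheapest remaining pit-to-site
    route. Not guaranteed optimal in general (the thesis solves an LP),
    though it does reach the optimum on this small instance."""
    supply, demand = supply[:], demand[:]
    routes = sorted((cost[i][j], i, j)
                    for i in range(len(supply))
                    for j in range(len(demand)))
    plan, total = {}, 0.0
    for c, i, j in routes:
        qty = min(supply[i], demand[j])
        if qty > 0:
            plan[(i, j)] = qty
            supply[i] -= qty
            demand[j] -= qty
            total += c * qty
    return plan, total

# Two borrow pits (m3 available), two fill sites (m3 required),
# unit haul costs in $/m3 -- illustrative numbers only.
supply = [300, 200]
demand = [250, 250]
cost = [[4.0, 6.0],
        [5.0, 3.0]]
plan, total = allocate(supply, demand, cost)
```

    A real solver would also handle unbalanced totals and the project-level constraints mentioned in the abstract.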

  4. Systems engineering studies of on-orbit assembly operation

    NASA Technical Reports Server (NTRS)

    Morgenthaler, George W.

    1991-01-01

    While the practice of construction has a long history, the underlying theory of construction is relatively young. Very little has been documented as to techniques of logistic support, construction planning, construction scheduling, construction testing, and inspection. The lack of 'systems approaches' to construction processes is certainly one of the most serious roadblocks to the construction of space structures. System engineering research efforts at CSC are aimed at developing concepts and tools which contribute to a systems theory of space construction. The research is also aimed at providing means for trade-offs of design parameters for other research areas in CSC. Systems engineering activity at CSC has divided space construction into the areas of orbital assembly, lunar base construction, interplanetary transport vehicle construction, and Mars base construction. A brief summary of recent results is given. Several models for 'launch-on-time' were developed. Launch-on-time is a critical concept to the assembly of such Earth-orbiting structures as the Space Station Freedom, and to planetary orbiters such as the Mars transfer vehicle. CSC has developed a launch vehicle selection model which uses linear programming to find optimal combinations of launch vehicles of various sizes (Atlas, Titan, Shuttles, HLLVs) to support SEI missions. Recently, the Center developed a cost trade-off model for studying on-orbit assembly logistics. With this model it was determined that the most effective size of the HLLV would be in the range of 120 to 200 metric tons to LEO, which is consistent with the choices of General Stafford's Synthesis Group Report. A second-generation Dynamic Construction Activities Model ('DYCAM') process model has been under development, based on our past results in interruptibility and our initial DYCAM model. This second-generation model is built on the paradigm of knowledge-based expert systems. 
It is aimed at providing answers to two questions: (1) what are some necessary or sufficient conditions for judging conceptual designs of spacecraft?, and (2) can a methodology be formulated such that these conditions may be used to provide computer-aided tools for evaluating conceptual designs and planning for space assembly sequences? Early simulation results indicate that the DYCAM model has a clear ability to emulate and simulate human orbital construction processes.

  5. A Vernacular for Linear Latent Growth Models

    ERIC Educational Resources Information Center

    Hancock, Gregory R.; Choi, Jaehwa

    2006-01-01

    In its most basic form, latent growth modeling (latent curve analysis) allows an assessment of individuals' change in a measured variable X over time. For simple linear models, as with other growth models, parameter estimates associated with the a construct (amount of X at a chosen temporal reference point) and b construct (growth in X per unit…

  6. Cognitive Mediation of Rape's Mental Health Impact: Constructive Replication of a Cross-Sectional Model in Longitudinal Data

    ERIC Educational Resources Information Center

    Koss, Mary P.; Figueredo, Aurelio Jose

    2004-01-01

    The constructive replication of a prespecified, cognitively mediated model of rape's impact on psychosocial health is reported using longitudinal data (see Koss, Figueredo, & Prince, 2002, for a summary of model development). Rape survivors (n = 59) were assessed four times, 3 to 24 months postrape. Structural equations modeling of baseline data…

  7. Preliminary study on enhancing waste management best practice model in Malaysia construction industry

    NASA Astrophysics Data System (ADS)

    Jamaludin, Amril Hadri; Karim, Nurulzatushima Abdul; Noor, Raja Nor Husna Raja Mohd; Othman, Nurulhidayah; Malik, Sulaiman Abdul

    2017-08-01

    Construction waste management (CWM) is the practice of minimizing and diverting construction waste, demolition debris, and land-clearing debris from disposal and redirecting recyclable resources back into the construction process. A best practice model is the best choice, drawn from a collection of practices, built for the purpose of construction waste management. Such a model can help contractors minimize waste before construction activities start. Minimizing wastage has a direct impact on the time, cost, and quality of a construction project. This paper focuses on a preliminary study to determine the factors of waste generation on construction sites and to identify the effectiveness of existing construction waste management practice in Malaysia. The paper also covers the preliminary work on the planned research location, the data collection method, and the analysis to be done using the Analytical Hierarchy Process (AHP) to help develop a suitable waste management best practice model for the country.

  8. Work-family conflict, emotional exhaustion and performance-based self-esteem: reciprocal relationships.

    PubMed

    Richter, Anne; Schraml, Karin; Leineweber, Constanze

    2015-01-01

    The three constructs of work-family conflict, emotional exhaustion and performance-based self-esteem are all related to tremendous negative consequences for the individual, the organization as well as for society. Even though there are studies that connect two of those constructs, the prospective relations between all three of them have not yet been studied. We explored the prospective relations between the three constructs in a large Swedish data set representative of the Swedish workforce. Gender differences in the relations were investigated. Longitudinal data with a 2-year time lag were gathered from 3,387 working men and women who responded to the 2006 and 2008 waves of the Swedish Longitudinal Occupational Survey of Health. Four different cross-lagged models were analysed. In the best fitting model, higher levels of work-family conflict at time 1 were associated with an increased level of performance-based self-esteem at time 2, but not with emotional exhaustion, after controlling for having children, gender, education and age. Also, relationships between emotional exhaustion at time 1 and work-family conflict and performance-based self-esteem at time 2 could be established. Furthermore, relationships between performance-based self-esteem at time 1 and work-family conflict and emotional exhaustion at time 2 were found. Multiple-group analysis did not show any differences in the relations of the tested constructs over time for either men or women. We conclude that the three constructs are interrelated and best understood through a reciprocal model. No differences were found between men and women.

  9. Morphing the feature-based multi-blocks of normative/healthy vertebral geometries to scoliosis vertebral geometries: development of personalized finite element models.

    PubMed

    Hadagali, Prasannaah; Peters, James R; Balasubramanian, Sriram

    2018-03-01

    Personalized Finite Element (FE) models and hexahedral elements are preferred for biomechanical investigations. Feature-based multi-block methods are used to develop anatomically accurate personalized FE models with hexahedral mesh. It is tedious to manually construct multi-blocks for a large number of geometries on an individual basis to develop personalized FE models. Mesh-morphing mitigates the tedium of meshing each personalized geometry from scratch, but leads to element warping and loss of geometrical data. Such issues increase in magnitude when a normative spine FE model is morphed to a scoliosis-affected spinal geometry. The only way to bypass hex-mesh distortion or loss of geometry as a result of morphing is to rely on manually constructing the multi-blocks for the scoliosis-affected spine geometry of each individual, which is time-intensive. A method to semi-automate the construction of multi-blocks on the geometry of scoliosis vertebrae from the existing multi-blocks of normative vertebrae is demonstrated in this paper. High-quality hexahedral elements were generated on the scoliosis vertebrae from the morphed multi-blocks of normative vertebrae. Constructing the multi-blocks took 3 months for the normative spine and less than a day for scoliosis. The effort required to construct multi-blocks on personalized scoliosis spinal geometries is thus significantly reduced by morphing existing multi-blocks.

  10. PENDISC: a simple method for constructing a mathematical model from time-series data of metabolite concentrations.

    PubMed

    Sriyudthsak, Kansuporn; Iwata, Michio; Hirai, Masami Yokota; Shiraishi, Fumihide

    2014-06-01

    The availability of large-scale datasets has led to more effort being made to understand characteristics of metabolic reaction networks. However, because the large-scale data are semi-quantitative, and may contain biological variations and/or analytical errors, it remains a challenge to construct a mathematical model with precise parameters using only these data. The present work proposes a simple method, referred to as PENDISC (Parameter Estimation in a Non-DImensionalized S-system with Constraints), to assist the complex process of parameter estimation in the construction of a mathematical model for a given metabolic reaction system. The PENDISC method was evaluated using two simple mathematical models: a linear metabolic pathway model with inhibition and a branched metabolic pathway model with inhibition and activation. The results indicate that a smaller number of data points and rate constant parameters enhances the agreement between calculated values and time-series data of metabolite concentrations, and leads to faster convergence when the same initial estimates are used for the fitting. This method is also shown to be applicable to noisy time-series data and to unmeasurable metabolite concentrations in a network, and to have the potential to handle metabolome data of a relatively large-scale metabolic reaction system. Furthermore, it was applied to aspartate-derived amino acid biosynthesis in the plant Arabidopsis thaliana. The result confirms that the constructed mathematical model agrees satisfactorily with the time-series datasets of seven metabolite concentrations.
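    An S-system represents each metabolite's rate of change as a difference of two power-law terms. A minimal sketch of simulating such a model (a hypothetical two-metabolite chain with invented parameters and plain forward Euler, not the paper's estimation machinery):

```python
def s_system_step(x, alpha, g, beta, h, dt):
    """One forward-Euler step of an S-system: each rate is a difference of
    two power-law terms, dXi/dt = alpha_i*prod(Xj^g_ij) - beta_i*prod(Xj^h_ij)."""
    def power_law(rate, exps):
        v = rate
        for xj, e in zip(x, exps):
            v *= xj ** e
        return v
    return [xi + dt * (power_law(alpha[i], g[i]) - power_law(beta[i], h[i]))
            for i, xi in enumerate(x)]

# Hypothetical chain X0 -> X1 with constant influx:
# dX0/dt = 1.0 - 2.0*X0 ; dX1/dt = 2.0*X0 - 1.0*X1
alpha = [1.0, 2.0]
g = [[0.0, 0.0], [1.0, 0.0]]
beta = [2.0, 1.0]
h = [[1.0, 0.0], [0.0, 1.0]]

x = [0.1, 0.1]
for _ in range(2000):
    x = s_system_step(x, alpha, g, beta, h, dt=0.01)
# The trajectory settles at the steady state X0 = 0.5, X1 = 1.0.
```

    PENDISC's contribution is constraining and non-dimensionalizing the fit of the alpha/beta/g/h parameters against measured time series; the block above only shows the model class being fitted.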

  11. Improved treatment of optics in the Lindquist-Wheeler models

    NASA Astrophysics Data System (ADS)

    Clifton, Timothy; Ferreira, Pedro G.; O'Donnell, Kane

    2012-01-01

    We consider the optical properties of Lindquist-Wheeler (LW) models of the Universe. These models consist of lattices constructed from regularly arranged discrete masses. They are akin to the Wigner-Seitz construction of solid state physics, and result in a dynamical description of the large-scale Universe in which the global expansion is given by a Friedmann-like equation. We show that if these models are constructed in a particular way then the redshifts of distant objects, as well as the dynamics of the global space-time, can be made to be in good agreement with the homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker (FLRW) solutions of Einstein’s equations, at the level of ≲3% out to z≃2. Angular diameter and luminosity distances, on the other hand, differ from those found in the corresponding FLRW models, while being consistent with the “empty beam” approximation, together with the shearing effects due to the nearest masses. This can be compared with the large deviations found from the corresponding FLRW values obtained in a previous study that considered LW models constructed in a different way. We therefore advocate the improved LW models we consider here as useful constructions that appear to faithfully reproduce both the dynamical and observational properties of space-times containing discrete masses.

  12. 4D modeling in high-rise construction

    NASA Astrophysics Data System (ADS)

    Balakina, Anastasiya; Simankina, Tatyana; Lukinov, Vitaly

    2018-03-01

    High-rise construction is a complex construction process, requiring more refined and sophisticated tools for design, planning and construction management. The use of BIM technologies makes it possible to minimize the risks associated with design errors and errors that occur during construction. This article discusses a visual planning method using the 4D model, which allows the project team to create an accurate and complete construction plan, something much more difficult to achieve with traditional planning methods. The use of the 4D model in the construction of a 70-story building made it possible to detect spatial and temporal errors before construction work began. In addition to identifying design errors, 4D modeling made it possible to optimize the construction process: the operation of cranes, the placement of building structures and materials at various stages of construction, and the organization of work performance, as well as to monitor the activities related to preparing the construction site for compliance with labor protection and safety requirements, resulting in savings of money and time.
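    The core 4D check (do two scheduled activities occupy the same space at the same time?) reduces to an interval-overlap test in three spatial axes plus the time axis. A sketch with entirely hypothetical activities and coordinates:

```python
def overlaps_1d(a0, a1, b0, b1):
    """Open-interval overlap test on one axis."""
    return a0 < b1 and b0 < a1

def spatiotemporal_clash(a, b):
    """Two scheduled activities clash if their bounding boxes intersect on
    all three spatial axes AND their schedule windows overlap in time."""
    box_a, (a_start, a_end) = a
    box_b, (b_start, b_end) = b
    spatial = all(overlaps_1d(box_a[k][0], box_a[k][1],
                              box_b[k][0], box_b[k][1]) for k in range(3))
    temporal = overlaps_1d(a_start, a_end, b_start, b_end)
    return spatial and temporal

# Hypothetical activities: ((x-range, y-range, z-range), (start_day, end_day))
crane_zone = (((0, 10), (0, 10), (0, 50)), (1, 30))
slab_pour  = (((5, 15), (5, 15), (10, 12)), (20, 25))   # same space and time
facade     = (((5, 15), (5, 15), (10, 12)), (40, 60))   # same space, later
```

    Real 4D tools run this kind of test over every pair of model elements and schedule activities, with real geometry instead of boxes.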

  13. Construction of Discrete Time Shadow Price

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogala, Tomasz, E-mail: rogalatp@gmail.com; Stettner, Lukasz, E-mail: stettner@impan.pl

    2015-12-15

    The paper considers expected utility from consumption over a finite time horizon for discrete-time markets with bid and ask prices and a strictly concave utility function. The notion of a weak shadow price, i.e., an illiquid price depending on the portfolio, under which the model without bid and ask prices is equivalent to the model with them, is introduced. The existence and form of the weak shadow price are shown. Using the weak shadow price, the usual (called in the paper strong) shadow price is then constructed.

  14. Simultaneous construction of PCR-DGGE-based predictive models of Listeria monocytogenes and Vibrio parahaemolyticus on cooked shrimps.

    PubMed

    Liao, C; Peng, Z Y; Li, J B; Cui, X W; Zhang, Z H; Malakar, P K; Zhang, W J; Pan, Y J; Zhao, Y

    2015-03-01

    The aim of this study was to simultaneously construct PCR-DGGE-based predictive models of Listeria monocytogenes and Vibrio parahaemolyticus on cooked shrimps at 4 and 10°C. Calibration curves were established to correlate the peak density of DGGE bands with microbial counts. Microbial counts derived from PCR-DGGE and plate methods were fitted by the Baranyi model to obtain molecular and traditional predictive models. For L. monocytogenes, growing at 4 and 10°C, molecular predictive models were constructed, with good evaluations of correlation coefficients (R² > 0.92) and of bias factors (Bf) and accuracy factors (Af) (1.0 ≤ Bf ≤ Af ≤ 1.1). Moreover, no significant difference was found between molecular and traditional predictive models when analysed on lag phase (λ), maximum growth rate (μmax), and growth data (P > 0.05). But for V. parahaemolyticus, inactivated at 4 and 10°C, molecular models showed significant differences when compared with traditional models. Taken together, these results suggest that PCR-DGGE based on DNA can be used to construct growth models, but is not yet appropriate for inactivation models. This is the first report of developing PCR-DGGE to simultaneously construct multiple molecular models. It has long been known that microbial predictive models based on traditional plate methods are time-consuming and labour-intensive. Denaturing gradient gel electrophoresis (DGGE) has been widely used as a semiquantitative method to describe complex microbial communities. In our study, we developed DGGE to quantify bacterial counts and simultaneously established two molecular predictive models to describe the growth and survival of two bacteria (Listeria monocytogenes and Vibrio parahaemolyticus) at 4 and 10°C. We demonstrated that PCR-DGGE could be used to construct growth models. 
This work provides a new approach to construct molecular predictive models and thereby facilitates predictive microbiology and QMRA (Quantitative Microbial Risk Assessment). © 2014 The Society for Applied Microbiology.
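    The Baranyi growth curve the authors fit can be written down directly. A sketch with illustrative parameters (not fitted values from the paper), with counts in natural-log units:

```python
import math

def baranyi(t, y0, ymax, mu, lam):
    """Baranyi-Roberts growth curve (counts in ln CFU/g units).
    y0: initial level, ymax: stationary level, mu: maximum specific
    growth rate (1/h), lam: lag time (h); h0 = mu*lam encodes the
    physiological state of the cells at inoculation."""
    h0 = mu * lam
    # Gradual-adjustment function A(t): ~0 during lag, ~t - lam afterwards.
    a = t + (1.0 / mu) * math.log(
        math.exp(-mu * t) + math.exp(-h0) - math.exp(-mu * t - h0))
    # Logistic-style braking term caps the curve at ymax.
    return y0 + mu * a - math.log(
        1.0 + (math.exp(mu * a) - 1.0) / math.exp(ymax - y0))

# Illustrative parameters only: lag of 5 h, growth from ln 3 to ln 9 units.
y0, ymax, mu, lam = 3.0, 9.0, 0.4, 5.0
curve = [baranyi(t, y0, ymax, mu, lam) for t in range(0, 101, 10)]
```

    Fitting this function to either plate counts or DGGE-derived counts yields the λ and μmax values the paper compares between the molecular and traditional models.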

  15. Time on Your Hands: Modeling Time

    ERIC Educational Resources Information Center

    Finson, Kevin; Beaver, John

    2007-01-01

    Building physical models relative to a concept can be an important activity to help students develop and manipulate abstract ideas and mental models that often prove difficult to grasp. One such concept is "time". A method for helping students understand the cyclical nature of time involves the construction of a Time Zone Calculator through a…

  16. Dynamic Factor Analysis Models with Time-Varying Parameters

    ERIC Educational Resources Information Center

    Chow, Sy-Miin; Zu, Jiyun; Shifren, Kim; Zhang, Guangjian

    2011-01-01

    Dynamic factor analysis models with time-varying parameters offer a valuable tool for evaluating multivariate time series data with time-varying dynamics and/or measurement properties. We use the Dynamic Model of Activation proposed by Zautra and colleagues (Zautra, Potter, & Reich, 1997) as a motivating example to construct a dynamic factor…

  17. The effectiveness of element downsizing on a three-dimensional finite element model of bone trabeculae in implant biomechanics.

    PubMed

    Sato, Y; Wadamoto, M; Tsuga, K; Teixeira, E R

    1999-04-01

    Improving the validity of finite element analysis in implant biomechanics requires element downsizing; however, excessive downsizing demands computer memory and calculation time. To investigate the effectiveness of element downsizing on the construction of a three-dimensional finite element bone trabeculae model, models with different element sizes (600, 300, 150 and 75 μm) were constructed, and the stress induced by a vertical 10 N load was analysed. The difference in von Mises stress values between the models with 600 and 300 μm element sizes was larger than that between 300 and 150 μm. On the other hand, no clear difference in stress values was detected among the models with 300, 150 and 75 μm element sizes. Downsizing elements from 600 to 300 μm is therefore suggested to be effective in the construction of a three-dimensional finite element bone trabeculae model, with a possible saving of computer memory and calculation time in the laboratory.
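    The study amounts to a mesh-convergence check: refine until the stress stops changing, then stop, since further refinement only costs memory and time. A sketch where a synthetic stress estimator stands in for the FE solve (the error law is invented for illustration):

```python
def converged_size(stress_at, sizes, tol=0.05):
    """Walk through decreasing element sizes and return the first size at
    which the von Mises stress changes by less than `tol` (relative)
    versus the next refinement, i.e. the coarsest 'converged' mesh."""
    for coarse, fine in zip(sizes, sizes[1:]):
        rel_change = abs(stress_at(fine) - stress_at(coarse)) / abs(stress_at(fine))
        if rel_change < tol:
            return coarse
    return sizes[-1]

# Synthetic estimator standing in for an FE solve: stress approaches
# 12.0 MPa with a discretization error shrinking as the cube of element
# size. Purely illustrative, not the paper's data.
def fake_fe_stress(size_um):
    return 12.0 * (1.0 - 0.3 * (size_um / 600.0) ** 3)

sizes = [600, 300, 150, 75]  # micrometres, as in the study
best = converged_size(fake_fe_stress, sizes)
```

    With these invented numbers the check stops at 300 μm, mirroring the paper's conclusion that refining past 300 μm buys no clear change in stress.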

  18. Building health behavior models to guide the development of just-in-time adaptive interventions: A pragmatic framework

    PubMed Central

    Nahum-Shani, Inbal; Hekler, Eric B.; Spruijt-Metz, Donna

    2016-01-01

    Advances in wireless devices and mobile technology offer many opportunities for delivering just-in-time adaptive interventions (JITAIs)--suites of interventions that adapt over time to an individual’s changing status and circumstances with the goal to address the individual’s need for support, whenever this need arises. A major challenge confronting behavioral scientists aiming to develop a JITAI concerns the selection and integration of existing empirical, theoretical and practical evidence into a scientific model that can inform the construction of a JITAI and help identify scientific gaps. The purpose of this paper is to establish a pragmatic framework that can be used to organize existing evidence into a useful model for JITAI construction. This framework involves clarifying the conceptual purpose of a JITAI, namely the provision of just-in-time support via adaptation, as well as describing the components of a JITAI and articulating a list of concrete questions to guide the establishment of a useful model for JITAI construction. The proposed framework includes an organizing scheme for translating the relatively static scientific models underlying many health behavior interventions into a more dynamic model that better incorporates the element of time. This framework will help to guide the next generation of empirical work to support the creation of effective JITAIs. PMID:26651462
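    JITAIs are commonly operationalized as decision points, tailoring variables, decision rules, and intervention options. A minimal sketch of the decision-rule component (all variable names, thresholds, and options are hypothetical):

```python
def decide(state, rules):
    """Evaluate an ordered list of (condition, intervention) decision
    rules against the tailoring variables in `state`; the first matching
    rule fires, and None means 'provide nothing right now'."""
    for condition, intervention in rules:
        if condition(state):
            return intervention
    return None

# Hypothetical tailoring variables and options for a physical-activity JITAI.
rules = [
    (lambda s: s["stress"] > 7 and not s["driving"], "send_breathing_prompt"),
    (lambda s: s["steps_today"] < 2000 and s["hour"] >= 18, "send_walk_reminder"),
]

a = decide({"stress": 9, "driving": False, "steps_today": 5000, "hour": 12}, rules)
b = decide({"stress": 2, "driving": False, "steps_today": 500, "hour": 19}, rules)
c = decide({"stress": 9, "driving": True, "steps_today": 5000, "hour": 12}, rules)
```

    The framework's point is that the conditions and options in such rules should be derived from an explicit dynamic model of the behavior, not chosen ad hoc.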

  19. Integrability and chemical potential in the (3 + 1)-dimensional Skyrme model

    NASA Astrophysics Data System (ADS)

    Alvarez, P. D.; Canfora, F.; Dimakis, N.; Paliathanasis, A.

    2017-10-01

    Using a remarkable mapping from the original (3 + 1)-dimensional Skyrme model to the Sine-Gordon model, we construct the first analytic examples of Skyrmions as well as of Skyrmion-anti-Skyrmion bound states within a finite box in 3 + 1 dimensional flat space-time. An analytic upper bound on the number of these Skyrmion-anti-Skyrmion bound states is derived. We compute the critical isospin chemical potential beyond which these Skyrmions cease to exist. With these tools, we also construct topologically protected time-crystals: time-periodic configurations whose time-dependence is protected by their non-trivial winding number. These are striking realizations of the ideas of Shapere and Wilczek. The critical isospin chemical potential for these time-crystals is determined.
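    The Sine-Gordon side of the mapping has well-known analytic solutions. For illustration, the static kink can be checked numerically against the field equation; this is textbook Sine-Gordon, not the paper's Skyrmion construction itself:

```python
import math

def kink(x):
    """Static Sine-Gordon kink, phi(x) = 4*arctan(exp(x)): the field winds
    from 0 to 2*pi, carrying one unit of topological charge."""
    return 4.0 * math.atan(math.exp(x))

# Verify the static field equation phi'' = sin(phi) by central differences.
h = 1e-4
def residual(x):
    d2 = (kink(x + h) - 2.0 * kink(x) + kink(x - h)) / (h * h)
    return d2 - math.sin(kink(x))
```

    The topological protection the abstract invokes is visible in the boundary values: no continuous deformation can undo the 2π winding between x = -∞ and x = +∞.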

  20. Development and Validation of a 2 x 2 Model of Time-Related Academic Behavior: Procrastination and Timely Engagement

    ERIC Educational Resources Information Center

    Strunk, Kamden K.; Cho, YoonJung; Steele, Misty R.; Bridges, Stacey L.

    2013-01-01

    Procrastination is an educational concern for classroom instructors because of its negative psychological and academic impacts on students. However, the traditional view of procrastination as a unidimensional construct is insufficient in two regards. First, the construct needs to be viewed more broadly as time-related academic behavior,…

  1. Simplified hydraulic model of French vertical-flow constructed wetlands.

    PubMed

    Arias, Luis; Bertrand-Krajewski, Jean-Luc; Molle, Pascal

    2014-01-01

    Designing vertical-flow constructed wetlands (VFCWs) to treat both rain events and dry weather flow is a complex task due to the stochastic nature of rain events. Dynamic models can help to improve design, but they usually prove difficult to handle for designers. This study focuses on the development of a simplified hydraulic model of French VFCWs using an empirical infiltration coefficient--infiltration capacity parameter (ICP). The model was fitted using 60-second-step data collected on two experimental French VFCW systems and compared with Hydrus 1D software. The model revealed a season-by-season evolution of the ICP that could be explained by the mechanical role of reeds. This simplified model makes it possible to define time-course shifts in ponding time and outlet flows. As ponding time hinders oxygen renewal, thus impacting nitrification and organic matter degradation, ponding time limits can be used to fix a reliable design when treating both dry and rain events.
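    The ICP idea reduces to a bucket model: water infiltrates through the filter at a capped rate and any excess ponds on the surface. A sketch with invented numbers and units (one value per time step):

```python
def simulate(inflow_volumes, cap):
    """Bucket sketch of a simplified VFCW hydraulic model: each time step
    a volume arrives on the filter surface and at most `cap` (an
    infiltration-capacity parameter, volume per step) passes through;
    the excess ponds. Returns per-step outflow volumes and the number of
    steps with standing water (the ponding time)."""
    ponded, outflows, ponding_steps = 0.0, [], 0
    for v_in in inflow_volumes:
        ponded += v_in
        out = min(ponded, cap)
        ponded -= out
        outflows.append(out)
        if ponded > 0.0:
            ponding_steps += 1
    return outflows, ponding_steps

# Hypothetical 1-minute steps: a 10-min storm at 2 units/min onto a
# filter that can infiltrate 1 unit/min.
inflows = [2.0] * 10 + [0.0] * 20
out, t_pond = simulate(inflows, cap=1.0)
```

    Ponding time is the quantity the paper designs against, since standing water hinders oxygen renewal; in the real model the infiltration capacity also evolves season by season.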

  2. Time till death affects spider mobility and web-building behavior during web construction in an orb-web spider.

    PubMed

    Anotaux, Mylène; Toscani, Camille; Leborgne, Raymond; Chaline, Nicolas; Pasquet, Alain

    2016-04-01

    It is well known that age influences organism mobility. This was demonstrated in vertebrates (such as mammals and birds) but has been less studied in invertebrates with the exception of Drosophila and the nematode Caenorhabditis elegans. Here we studied the influence of age on the mobility of the orb-weaving spider Zygiella x-notata during web construction. The orb-web is a good model because it has a characteristic geometrical structure and video tracking can be used to easily follow the spider's movements during web building. We investigated the influence of age (specifically chronological age, life span, and time till death) on different parameters of spider mobility during the construction of the capture spiral (distance traveled, duration of construction, spider velocity, spider movement, and spider inactivity) with a generalized linear model (GLM) procedure adjusted for the spider mass. The results showed that neither chronological age, nor life span affected the mobility parameters. However, when the time till death decreased, there was a decrease in the distance traveled, the duration of the construction of the capture spiral, and the spider movement. The spider velocity and the time of inactivity were not affected. These results could be correlated with a decrease in the length of the silky thread deposited for the construction of the capture spiral. Spiders with a shorter time till death built smaller webs using less silk. Thus, our study strongly suggests that time till death, but not chronological age, affects spider mobility during web construction and may be a good indicator of senescence.


  4. Constructing and predicting solitary pattern solutions for nonlinear time-fractional dispersive partial differential equations

    NASA Astrophysics Data System (ADS)

    Arqub, Omar Abu; El-Ajou, Ahmad; Momani, Shaher

    2015-07-01

    Building fractional mathematical models for specific phenomena and developing numerical or analytical solutions for these fractional mathematical models are crucial issues in mathematics, physics, and engineering. In this work, a new analytical technique for constructing and predicting solitary pattern solutions of time-fractional dispersive partial differential equations is proposed based on the generalized Taylor series formula and residual error function. The new approach provides solutions in the form of a rapidly convergent series with easily computable components using symbolic computation software. For method evaluation and validation, the proposed technique was applied to three different models and compared with some of the well-known methods. The resultant simulations clearly demonstrate the superiority and potential of the proposed technique in terms of performance quality and the accuracy of substructure preservation in the construct, as well as the prediction of solitary pattern solutions for time-fractional dispersive partial differential equations.

  5. Implicit Three-Dimensional Geo-Modelling Based on HRBF Surface

    NASA Astrophysics Data System (ADS)

    Gou, J.; Zhou, W.; Wu, L.

    2016-10-01

    Three-dimensional (3D) geological models are important representations of the results of regional geological surveys. However, the process of constructing 3D geological models from two-dimensional (2D) geological elements remains difficult and time-consuming. This paper proposes a method of migrating from 2D elements to 3D models. First, the geological interfaces were constructed using the Hermite Radial Basis Function (HRBF) to interpolate the boundaries and attitude data. Then, the subsurface geological bodies were extracted from the spatial map area using the Boolean method between the HRBF surface and the fundamental body. Finally, the top surfaces of the geological bodies were constructed by coupling the geological boundaries to digital elevation models. Based on this workflow, a prototype system was developed, and typical geological structures (e.g., folds, faults, and strata) were simulated. Geological models were constructed through this workflow based on realistic regional geological survey data. For extended applications in 3D modelling of other kinds of geo-objects, mining ore body models and urban geotechnical engineering stratum models were constructed by this method from drill-hole data. The model construction process was rapid, and the resulting models accorded with the constraints of the original data.
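    HRBF interpolation constrains both surface positions and normals (attitude data) jointly. As a simplified stand-in, the classic RBF trick of adding off-surface points offset along the normals conveys the flavor. A sketch on a 2D circle standing in for a geological interface; the Gaussian kernel and all numbers are illustrative:

```python
import numpy as np

def fit_rbf(points, values, eps=10.0):
    """Fit an implicit field f(p) = sum_i w_i * exp(-eps*|p - c_i|^2)
    interpolating f(points[i]) = values[i]; the zero level set of f is
    the reconstructed interface."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    w = np.linalg.solve(np.exp(-eps * d2), values)
    return lambda p: float(np.exp(-eps * ((points - p) ** 2).sum(-1)) @ w)

# Sample the unit circle, then add offset points along the known normals:
# f = -0.5 just inside, +0.5 just outside. These offset points are the
# classic substitute for true Hermite (normal) constraints.
angles = np.linspace(0.0, 2 * np.pi, 12, endpoint=False)
ring = np.c_[np.cos(angles), np.sin(angles)]
points = np.vstack([ring, 0.7 * ring, 1.3 * ring])
values = np.r_[np.zeros(12), -0.5 * np.ones(12), 0.5 * np.ones(12)]
f = fit_rbf(points, values)
```

    In the paper's pipeline such interpolated interfaces are then combined by Boolean operations and capped with the terrain model; extracting the zero level set (e.g. marching cubes) yields the final surface.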

  6. Predicting adsorptive removal of chlorophenol from aqueous solution using artificial intelligence based modeling approaches.

    PubMed

    Singh, Kunwar P; Gupta, Shikha; Ojha, Priyanka; Rai, Premanjali

    2013-04-01

    The research aims to develop artificial intelligence (AI)-based models to predict the adsorptive removal of 2-chlorophenol (CP) in aqueous solution by coconut shell carbon (CSC) using four operational variables (pH of solution, adsorbate concentration, temperature, and contact time), and to investigate their effects on the adsorption process. Accordingly, based on a factorial design, 640 batch experiments were conducted. Nonlinearities in the experimental data were checked using Brock-Dechert-Scheinkman (BDS) statistics. Five nonlinear models were constructed to predict the adsorptive removal of CP in aqueous solution by CSC using the four variables as input. Performances of the constructed models were evaluated and compared using statistical criteria. BDS statistics revealed strong nonlinearity in the experimental data. The performance of all the models constructed here was satisfactory. Radial basis function network (RBFN) and multilayer perceptron network (MLPN) models performed better than the generalized regression neural network, support vector machine, and gene expression programming models. Sensitivity analysis revealed that the contact time had the highest effect on adsorption, followed by the solution pH, temperature, and CP concentration. The study concluded that all the models constructed here were capable of capturing the nonlinearity in the data. The better generalization and predictive performance of the RBFN and MLPN models suggested that these can be used to predict the adsorption of CP in aqueous solution using CSC.

  7. Modeling job sites in real time to improve safety during equipment operation

    NASA Astrophysics Data System (ADS)

    Caldas, Carlos H.; Haas, Carl T.; Liapi, Katherine A.; Teizer, Jochen

    2006-03-01

    Real-time three-dimensional (3D) modeling of work zones has received increasing interest as a way to perform equipment operations faster, more safely, and more precisely. In addition, hazardous job site environments, such as those on construction sites, call for new devices that can rapidly and actively model static and dynamic objects. Flash LADAR (Laser Detection and Ranging) cameras are one of the recent technology developments that allow rapid spatial data acquisition of scenes. Algorithms that can process and interpret the output of such enabling technologies into three-dimensional models have the potential to significantly improve work processes. One particularly important application is modeling the location and path of objects in the trajectory of heavy construction equipment navigation. Detecting and mapping people, materials, and equipment into a three-dimensional computer model allows their locations and paths to be analyzed and can limit or restrict access to hazardous areas. This paper presents experiments and results of a real-time three-dimensional modeling technique that detects static and moving objects within the field of view of a high-frame-update-rate laser range scanning device. Applications related to heavy equipment operations on transportation and construction job sites are specified.

  8. Application of statistical distribution theory to launch-on-time for space construction logistic support

    NASA Technical Reports Server (NTRS)

    Morgenthaler, George W.

    1989-01-01

    The ability to launch-on-time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction and the subsequent planned operation of space stations, large unmanned space structures, lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches, have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center for Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis will include development of a better understanding of launch-on-time capability and simulation of required support systems for vehicle assembly and launch which are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models that describe these historical data and that can be used for several purposes, such as providing inputs to broader simulations of launch vehicle logistic space construction support processes and determining which launch operations sources cause the majority of the unscheduled 'holds', and hence suggesting changes which might improve launch-on-time. In particular, the paper investigates the ability of a compound distribution probability model to fit actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
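As a minimal illustration of fitting a candidate probability model to hold data, the sketch below computes the maximum-likelihood exponential fit to a set of hold durations. The record above investigates richer compound distributions; the data here are invented, not actual launch data.

```python
import math

def fit_exponential(holds):
    """Maximum-likelihood fit of an exponential hold-duration model:
    the rate estimate is the reciprocal of the sample mean."""
    return len(holds) / sum(holds)

def log_likelihood(rate, holds):
    """Exponential log-likelihood, usable to compare candidate models."""
    return len(holds) * math.log(rate) - rate * sum(holds)

# Hypothetical unscheduled hold durations in minutes (invented).
holds = [5.0, 12.0, 3.5, 40.0, 7.5, 22.0, 1.0, 9.0]
rate = fit_exponential(holds)
print(round(rate, 3))  # 0.08: one hold event per 12.5 minutes on average
```

A compound model (e.g. a mixture over hold sources) would be fitted the same way, by maximizing its own log-likelihood over the data.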

  9. A novel methodology to estimate the evolution of construction waste in construction sites.

    PubMed

    Katz, Amnon; Baum, Hadassa

    2011-02-01

    This paper focuses on the accumulation of construction waste generated throughout the erection of new residential buildings. A special methodology was developed in order to provide a model that will predict the flow of construction waste. The amount of waste and its constituents, produced on 10 relatively large construction sites (7000-32,000 m² of built area) was monitored periodically for a limited time. A model that predicts the accumulation of construction waste was developed based on these field observations. According to the model, waste accumulates in an exponential manner, i.e. smaller amounts are generated during the early stages of construction and increasing amounts are generated towards the end of the project. The total amount of waste from these sites was estimated at 0.2 m³ per 1 m² of floor area. A good correlation was found between the model predictions and actual data from the field survey. Copyright © 2010 Elsevier Ltd. All rights reserved.
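The exponential accumulation the model describes can be illustrated with a hypothetical parameterization; the shape constant k and the normalization below are assumptions for illustration, not the paper's fitted form.

```python
import math

def cumulative_waste(t, total, k=3.0):
    """Hypothetical exponential accumulation curve: waste generated by
    normalized project time t in [0, 1]; larger k shifts generation
    toward the end of the project."""
    return total * (math.exp(k * t) - 1.0) / (math.exp(k) - 1.0)

# 10,000 m2 of built area at the reported 0.2 m3 of waste per m2 floor area.
total = 0.2 * 10_000  # 2000 m3
halfway = cumulative_waste(0.5, total)
print(halfway < total / 2)  # True: most waste arrives late in the project
```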

  10. Order reduction for a model of marine bacteriophage evolution

    NASA Astrophysics Data System (ADS)

    Pagliarini, Silvia; Korobeinikov, Andrei

    2017-02-01

    A typical mechanistic model of viral evolution necessarily includes several time scales which can differ by orders of magnitude. Such a diversity of time scales makes analysis of these models difficult. Reducing the order of a model is highly desirable when handling such a model. A typical approach applied to such slow-fast (or singularly perturbed) systems is the time scales separation technique. Constructing the so-called quasi-steady-state approximation is the usual first step in applying the technique. While this technique is commonly applied, in some cases its straightforward application can lead to unsatisfactory results. In this paper we construct the quasi-steady-state approximation for a model of evolution of marine bacteriophages based on the Beretta-Kuang model. We show that for this particular model the quasi-steady-state approximation is able to produce only a qualitative but not a quantitative fit.

  11. Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes

    PubMed Central

    Zhang, Hong; Pei, Yun

    2016-01-01

    Quantitative prediction of construction noise is crucial to evaluate construction plans to help make decisions to address noise levels. Considering limitations of existing methods for measuring or predicting the construction noise and particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes to provision of a simulation methodology to quantitatively predict the equivalent continuous noise of construction by considering the relevant uncertainties, dynamics and interactions. PMID:27529266

  12. Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes.

    PubMed

    Zhang, Hong; Pei, Yun

    2016-08-12

    Quantitative prediction of construction noise is crucial to evaluate construction plans to help make decisions to address noise levels. Considering limitations of existing methods for measuring or predicting the construction noise and particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes to provision of a simulation methodology to quantitatively predict the equivalent continuous noise of construction by considering the relevant uncertainties, dynamics and interactions.
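The equivalent continuous sound level referred to here is conventionally computed as an energy average over time; a minimal sketch with invented interval data follows (the simulation framework in the record would supply the per-activity levels and durations).

```python
import math

def equivalent_continuous_level(levels_db, durations):
    """Equivalent continuous sound level over a period:
    Leq = 10*log10( (1/T) * sum_i t_i * 10^(L_i/10) ),
    i.e. an energy average of per-interval levels L_i held for times t_i."""
    total_time = sum(durations)
    energy = sum(t * 10.0 ** (level / 10.0)
                 for level, t in zip(levels_db, durations))
    return 10.0 * math.log10(energy / total_time)

# Hypothetical activity profile: 30 min at 85 dB(A), then 30 min at 70 dB(A).
leq = equivalent_continuous_level([85.0, 70.0], [30.0, 30.0])
print(round(leq, 1))  # 82.1: the louder interval dominates the energy average
```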

  13. Airfoil Shape Optimization based on Surrogate Model

    NASA Astrophysics Data System (ADS)

    Mukesh, R.; Lingadurai, K.; Selvakumar, U.

    2018-02-01

    Engineering design problems always require an enormous number of real-time experiments and computational simulations in order to assess and ensure the design objectives of the problem subject to various constraints. In most cases, the computational resources and time required per simulation are large. In certain cases, such as sensitivity analysis and design optimisation, where thousands or millions of simulations have to be carried out, this creates serious difficulty for designers. Nowadays approximation models, otherwise called surrogate models (SMs), are widely employed to reduce the computational resources and time required to analyse various engineering systems. Various approaches such as Kriging, neural networks, polynomials, and Gaussian processes are used to construct the approximation models. The primary intention of this work is to employ the k-fold cross-validation approach to study and evaluate the influence of various theoretical variogram models on the accuracy of the surrogate model construction. Ordinary Kriging and design of experiments (DOE) approaches are used to construct the SMs by approximating panel and viscous solution algorithms, which are primarily used to solve the flow around airfoils and aircraft wings. The method of coupling the SMs with a suitable optimisation scheme to carry out an aerodynamic design optimisation process for airfoil shapes is also discussed.
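A minimal sketch of the k-fold cross-validation loop used to score a surrogate follows. The surrogate here is a deliberately trivial nearest-neighbour predictor standing in for Kriging, and the "expensive simulation" is a made-up one-parameter function; all names are hypothetical.

```python
import math

def k_fold_splits(n, k):
    """Yield (train, test) index lists for k-fold cross-validation."""
    for f in range(k):
        test = [i for i in range(n) if i % k == f]
        train = [i for i in range(n) if i % k != f]
        yield train, test

def nearest_neighbour_surrogate(x_train, y_train, x):
    """A deliberately trivial surrogate: return the response of the
    nearest training sample (a Kriging predictor would go here)."""
    j = min(range(len(x_train)), key=lambda i: abs(x_train[i] - x))
    return y_train[j]

def cv_rmse(xs, ys, k=5):
    """Cross-validated RMSE of the surrogate over all held-out folds."""
    sq_errors = []
    for train, test in k_fold_splits(len(xs), k):
        xt, yt = [xs[j] for j in train], [ys[j] for j in train]
        sq_errors += [(nearest_neighbour_surrogate(xt, yt, xs[i]) - ys[i]) ** 2
                      for i in test]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

# Made-up "expensive simulation": a smooth response of one shape parameter.
xs = [i / 19 for i in range(20)]
ys = [math.sin(2 * math.pi * x) for x in xs]
print(0.0 < cv_rmse(xs, ys, k=5) < 1.0)  # True
```

Comparing `cv_rmse` across candidate variogram or kernel choices is exactly the model-selection use the record describes.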

  14. Building health behavior models to guide the development of just-in-time adaptive interventions: A pragmatic framework.

    PubMed

    Nahum-Shani, Inbal; Hekler, Eric B; Spruijt-Metz, Donna

    2015-12-01

    Advances in wireless devices and mobile technology offer many opportunities for delivering just-in-time adaptive interventions (JITAIs)-suites of interventions that adapt over time to an individual's changing status and circumstances with the goal to address the individual's need for support, whenever this need arises. A major challenge confronting behavioral scientists aiming to develop a JITAI concerns the selection and integration of existing empirical, theoretical and practical evidence into a scientific model that can inform the construction of a JITAI and help identify scientific gaps. The purpose of this paper is to establish a pragmatic framework that can be used to organize existing evidence into a useful model for JITAI construction. This framework involves clarifying the conceptual purpose of a JITAI, namely, the provision of just-in-time support via adaptation, as well as describing the components of a JITAI and articulating a list of concrete questions to guide the establishment of a useful model for JITAI construction. The proposed framework includes an organizing scheme for translating the relatively static scientific models underlying many health behavior interventions into a more dynamic model that better incorporates the element of time. This framework will help to guide the next generation of empirical work to support the creation of effective JITAIs. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  15. Incorporating Response Times in Item Response Theory Models of Reading Comprehension Fluency

    ERIC Educational Resources Information Center

    Su, Shiyang

    2017-01-01

    With the online assessment becoming mainstream and the recording of response times becoming straightforward, the importance of response times as a measure of psychological constructs has been recognized and the literature of modeling times has been growing during the last few decades. Previous studies have tried to formulate models and theories to…

  16. Simple and Hierarchical Models for Stochastic Test Misgrading.

    ERIC Educational Resources Information Center

    Wang, Jianjun

    1993-01-01

    Test misgrading is treated as a stochastic process. The expected number of misgradings, inter-occurrence time of misgradings, and waiting time for the "n"th misgrading are discussed based on a simple Poisson model and a hierarchical Beta-Poisson model. Examples of model construction are given. (SLD)
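The simple Poisson model's behaviour can be checked by simulation: inter-occurrence times are exponential, and the expected count over a horizon is rate × horizon. The rate and horizon below are invented for illustration.

```python
import random

def simulate_misgradings(rate, horizon, rng):
    """Poisson-process simulation of misgradings: exponential
    inter-occurrence times accumulated until the horizon is passed."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # inter-occurrence (waiting) time
        if t > horizon:
            return times
        times.append(t)

# Expected number of misgradings over the horizon is rate * horizon = 50.
rng = random.Random(42)
counts = [len(simulate_misgradings(0.5, 100.0, rng)) for _ in range(200)]
print(45 < sum(counts) / len(counts) < 55)  # True: close to the expected 50
```

The hierarchical Beta-Poisson variant would draw each grader's rate from a Beta-scaled prior before simulating.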

  17. On the deduction of chemical reaction pathways from measurements of time series of concentrations.

    PubMed

    Samoilov, Michael; Arkin, Adam; Ross, John

    2001-03-01

    We discuss the deduction of reaction pathways in complex chemical systems from measurements of time series of chemical concentrations of reacting species. First we review a technique called correlation metric construction (CMC) and show the construction of a reaction pathway from measurements on a part of glycolysis. Then we present two new improved methods for the analysis of time series of concentrations, entropy metric construction (EMC) and the entropy reduction method (ERM), and illustrate EMC with calculations on a model reaction system. (c) 2001 American Institute of Physics.
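A correlation-metric-style distance between two measured concentration series can be sketched as follows; the exact metric used by CMC may differ, and this uses the common sqrt(2(1-|r|)) form with toy data.

```python
import math

def correlation_distance(x, y):
    """Distance between two concentration time series based on the
    Pearson correlation r: d = sqrt(2 * (1 - |r|)), so strongly
    (anti)correlated species end up close together in the metric."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    r = cov / (sx * sy)
    return math.sqrt(2.0 * (1.0 - abs(r)))

# Toy concentrations: species B tracks A closely; species C is unrelated.
a = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
b = [2.1, 4.0, 5.9, 8.2, 10.0, 12.1]
c = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0]
print(correlation_distance(a, b) < correlation_distance(a, c))  # True
```

Embedding the resulting distance matrix (e.g. by multidimensional scaling) yields the pathway diagram CMC produces.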

  18. Persistent Monitoring of Urban Infrasound Phenomenology. Report 1: Modeling an Urban Environment for Acoustical Analyses using the 3-D Finite-Difference Time-Domain Program PSTOP3D

    DTIC Science & Technology

    2015-08-01

    a definition of the building footprints with an associated height attribute. This would be enough information to construct an extruded building ...contract was undertaken with CyberCity 3D (2012), a commercial company located in El Segundo, CA, to construct the buildings. The company specializes in...visualization of planning functions. CyberCity 3D used 6 in. stereoscopic aerial photography to construct 3-D models of the buildings. These models were quite

  19. Ecological and evolutionary consequences of niche construction for its agent.

    PubMed

    Kylafis, Grigoris; Loreau, Michel

    2008-10-01

    Niche construction can generate ecological and evolutionary feedbacks that have been underinvestigated so far. We present an eco-evolutionary model that incorporates the process of niche construction to reveal its effects on the ecology and evolution of the niche-constructing agent. We consider a simple plant-soil nutrient ecosystem in which plants have the ability to increase the input of inorganic nutrient as an example of positive niche construction. On an ecological time scale, the model shows that niche construction allows the persistence of plants under infertile soil conditions that would otherwise lead to their extinction. This expansion of plants' niche, however, requires a high enough rate of niche construction and a high enough initial plant biomass to fuel the positive ecological feedback between plants and their soil environment. On an evolutionary time scale, we consider that the rates of niche construction and nutrient uptake coevolve in plants while a trade-off constrains their values. Different evolutionary outcomes are possible depending on the shape of the trade-off. We show that niche construction results in an evolutionary feedback between plants and their soil environment such that plants partially regulate soil nutrient content. The direct benefit accruing to plants, however, plays a crucial role in the evolutionary advantage of niche construction.

  20. Construction schedules slack time minimizing

    NASA Astrophysics Data System (ADS)

    Krzemiński, Michał

    2017-07-01

    The article presents two original models for minimizing the downtime of work brigades. The models have been developed for construction schedules executed using the uniform work method. Application of flow shop models is possible and useful for the implementation of large objects that can be divided into plots. The article also presents a condition describing which model should be used, as well as a brief example of schedule optimization. The optimization results confirm the merit of the work on the newly developed models.

  1. Regenerating time series from ordinal networks.

    PubMed

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discretely sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
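The construct-and-regenerate loop can be sketched for a toy chaotic series; the pattern order, series length, and logistic-map source below are illustrative choices, not the paper's setup.

```python
import random
from collections import defaultdict

def ordinal_pattern(window):
    """Map a window of values to its ordinal (rank-order) pattern."""
    return tuple(sorted(range(len(window)), key=lambda i: window[i]))

def build_ordinal_network(series, order=3):
    """Count transitions between consecutive ordinal patterns, giving
    the weighted edges of the ordinal network."""
    patterns = [ordinal_pattern(series[i:i + order])
                for i in range(len(series) - order + 1)]
    edges = defaultdict(lambda: defaultdict(int))
    for a, b in zip(patterns, patterns[1:]):
        edges[a][b] += 1
    return edges, patterns[0]

def regenerate(edges, start, steps, rng):
    """Regenerate a surrogate symbol sequence by a weighted random walk."""
    node, walk = start, [start]
    for _ in range(steps):
        nbrs = edges[node]
        if not nbrs:  # dead end: pattern observed only at the series end
            break
        node = rng.choices(list(nbrs), weights=list(nbrs.values()))[0]
        walk.append(node)
    return walk

# Toy chaotic source: the fully chaotic logistic map x -> 4x(1-x).
x, series = 0.4, []
for _ in range(500):
    x = 4.0 * x * (1.0 - x)
    series.append(x)

edges, start = build_ordinal_network(series, order=3)
walk = regenerate(edges, start, 100, random.Random(1))
print(len(walk) > 1 and all(len(p) == 3 for p in walk))  # True
```

Mapping each pattern in the walk back to representative amplitude values would yield the surrogate time series analysed in the record.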

  2. Regenerating time series from ordinal networks

    NASA Astrophysics Data System (ADS)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discretely sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.

  3. Development of three-dimensional patient face model that enables real-time collision detection and cutting operation for a dental simulator.

    PubMed

    Yamaguchi, Satoshi; Yamada, Yuya; Yoshida, Yoshinori; Noborio, Hiroshi; Imazato, Satoshi

    2012-01-01

    The virtual reality (VR) simulator is a useful tool for developing dental hand skill. However, VR simulations that include patient reactions are constrained by the computational time required to reproduce a face model. Our aim was to develop a patient face model that enables real-time collision detection and cutting operation by using stereolithography (STL) and deterministic finite automaton (DFA) data files. We evaluated the dependence of the computational cost on the conditions for combining STL and DFA data files, constructed the patient face model using the optimum condition, and assessed the computational costs of the do-nothing, collision, cutting, and combined collision-and-cutting operations. The face model was successfully constructed with low computational costs of 11.3, 18.3, 30.3, and 33.5 ms for do-nothing, collision, cutting, and collision and cutting, respectively. The patient face model could be useful for developing dental hand skill with VR.

  4. Cloud Computing: A model Construct of Real-Time Monitoring for Big Dataset Analytics Using Apache Spark

    NASA Astrophysics Data System (ADS)

    Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer

    2018-01-01

    The volume of data being collected, analyzed, and stored has exploded in recent years, in particular in relation to activity on cloud computing platforms, and large-scale data processing, analysis, and storage on such platforms continue to grow. Today, the major challenge is how to monitor and control these massive amounts of data and perform analysis in real time at scale. Traditional methods and model systems are unable to cope with these quantities of data in real time. Here we present a new methodology for constructing a model for optimizing the performance of real-time monitoring of big datasets, which combines machine learning algorithms with Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair of big datasets. As a case study, we use the failure of Virtual Machines (VMs) to start up. The methodology ensures that the most sensible action is carried out during the procedure of fine-grained monitoring and generates the highest efficacy and cost-saving fault repair through three construction control steps: (I) data collection; (II) an analysis engine; and (III) a decision engine. We found that running this novel methodology can save a considerable amount of time compared to the Hadoop model, without sacrificing classification accuracy or performance. The accuracy of the proposed method (92.13%) is an improvement on traditional approaches.

  5. Model and Algorithm for Substantiating Solutions for Organization of High-Rise Construction Project

    NASA Astrophysics Data System (ADS)

    Anisimov, Vladimir; Anisimov, Evgeniy; Chernysh, Anatoliy

    2018-03-01

    In this paper, models and an algorithm are developed for forming the optimal plan for the organization of the material and logistical processes of a high-rise construction project and their financial support. The model is based on representing the optimization procedure as a non-linear discrete programming problem, which consists in minimizing the execution time of a set of interrelated works by a limited number of partially interchangeable performers while limiting the total cost of performing the work. The proposed model and algorithm are the basis for creating specific organization management methodologies for high-rise construction projects.

  6. The effectiveness of a new algorithm on a three-dimensional finite element model construction of bone trabeculae in implant biomechanics.

    PubMed

    Sato, Y; Teixeira, E R; Tsuga, K; Shindoi, N

    1999-08-01

    Greater validity of finite element analysis (FEA) in implant biomechanics requires element downsizing. However, excessive downsizing demands more computer memory and calculation time. To evaluate the effectiveness of a new algorithm established for more valid FEA model construction without downsizing, three-dimensional FEA bone trabeculae models with different element sizes (300, 150 and 75 micron) were constructed. Four algorithms of stepwise (1 to 4 ranks) assignment of Young's modulus according to the bone volume in each cubic element were used, and the stress distribution against vertical loading was then analysed. The model with 300 micron element size and 4 ranks of Young's moduli according to bone volume in each element presented a stress distribution similar to that of the model with the 75 micron element size. These results show that the new algorithm was effective, and the use of the 300 micron element for bone trabeculae representation is proposed, without critical changes in stress values and with possible savings in computer memory and calculation time in the laboratory.
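The rank-based modulus assignment can be illustrated as follows; both the quantization rule and the nominal modulus value are hypothetical, since the abstract does not give them.

```python
def assign_youngs_modulus(bone_volume_fraction, ranks=4, e_max=13700.0):
    """Quantize an element's bone volume fraction into `ranks` stepwise
    ranks and return the corresponding Young's modulus in MPa. Both the
    quantization rule and the nominal modulus e_max are illustrative
    assumptions, not values from the study."""
    if not 0.0 <= bone_volume_fraction <= 1.0:
        raise ValueError("bone volume fraction must lie in [0, 1]")
    rank = min(int(bone_volume_fraction * ranks) + 1, ranks)  # rank 1..ranks
    return rank * e_max / ranks

print(assign_youngs_modulus(0.10))  # rank 1 of 4 -> 3425.0
print(assign_youngs_modulus(0.90))  # rank 4 of 4 -> 13700.0
```

With `ranks=1` every partially filled element gets the same modulus, which is the coarse behaviour the 4-rank scheme improves on.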

  7. USA National Phenology Network's volunteer-contributed observations yield predictive models of phenological transitions.

    PubMed

    Crimmins, Theresa M; Crimmins, Michael A; Gerst, Katharine L; Rosemartin, Alyssa H; Weltzin, Jake F

    2017-01-01

    In support of science and society, the USA National Phenology Network (USA-NPN) maintains a rapidly growing, continental-scale, species-rich dataset of plant and animal phenology observations that with over 10 million records is the largest such database in the United States. The aim of this study was to explore the potential that exists in the broad and rich volunteer-collected dataset maintained by the USA-NPN for constructing models predicting the timing of phenological transition across species' ranges within the continental United States. Contributed voluntarily by professional and citizen scientists, these opportunistically collected observations are characterized by spatial clustering, inconsistent spatial and temporal sampling, and short temporal depth (2009-present). Whether data exhibiting such limitations can be used to develop predictive models appropriate for use across large geographic regions has not yet been explored. We constructed predictive models for phenophases that are the most abundant in the database and also relevant to management applications for all species with available data, regardless of plant growth habit, location, geographic extent, or temporal depth of the observations. We implemented a very basic model formulation-thermal time models with a fixed start date. Sufficient data were available to construct 107 individual species × phenophase models. Remarkably, given the limited temporal depth of this dataset and the simple modeling approach used, fifteen of these models (14%) met our criteria for model fit and error. The majority of these models represented the "breaking leaf buds" and "leaves" phenophases and represented shrub or tree growth forms. Accumulated growing degree day (GDD) thresholds that emerged ranged from 454 GDDs (Amelanchier canadensis-breaking leaf buds) to 1,300 GDDs (Prunus serotina-open flowers). 
Such candidate thermal time thresholds can be used to produce real-time and short-term forecast maps of the timing of these phenophase transitions. In addition, many of the candidate models that emerged were suitable for use across the majority of the species' geographic ranges. Real-time and forecast maps of phenophase transitions could support a wide range of natural resource management applications, including invasive plant management, issuing asthma and allergy alerts, and anticipating frost damage for crops in vulnerable states. Our finding that several viable thermal time threshold models that work across the majority of the species' ranges could be constructed from the USA-NPN database provides clear evidence that great potential exists in this dataset to develop enhanced predictive models for additional species and phenophases. Further, the candidate models that emerged have immediate utility for supporting a wide range of management applications.
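A fixed-start-date thermal time model of this kind reduces to accumulating daily growing degree days (GDD) until a threshold is crossed. The temperature series below is invented; the 454-GDD threshold is the one reported above for Amelanchier canadensis.

```python
def daily_gdd(t_min, t_max, base=0.0):
    """Daily growing degree days by the simple averaging method."""
    return max(0.0, (t_min + t_max) / 2.0 - base)

def day_threshold_reached(t_mins, t_maxs, threshold, base=0.0):
    """Accumulate GDDs from a fixed start date (day 1) and return the
    first day on which the running total meets the threshold, else None."""
    total = 0.0
    for day, (lo, hi) in enumerate(zip(t_mins, t_maxs), start=1):
        total += daily_gdd(lo, hi, base)
        if total >= threshold:
            return day
    return None

# Invented warm-up: a constant 10-degree daily mean, i.e. 10 GDD per day,
# against the 454-GDD "breaking leaf buds" threshold.
print(day_threshold_reached([5.0] * 60, [15.0] * 60, 454.0))  # 46
```

Running the same accumulation on gridded forecast temperatures yields the real-time transition maps the record describes.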

  8. Parent-Infant Synchrony and the Construction of Shared Timing; Physiological Precursors, Developmental Outcomes, and Risk Conditions

    ERIC Educational Resources Information Center

    Feldman, Ruth

    2007-01-01

    Synchrony, a construct used across multiple fields to denote the temporal relationship between events, is applied to the study of parent-infant interactions and suggested as a model for intersubjectivity. Three types of timed relationships between the parent and child's affective behavior are assessed: concurrent, sequential, and organized in an…

  9. Building Construction Progress Monitoring Using Unmanned Aerial System (uas), Low-Cost Photogrammetry, and Geographic Information System (gis)

    NASA Astrophysics Data System (ADS)

    Bognot, J. R.; Candido, C. G.; Blanco, A. C.; Montelibano, J. R. Y.

    2018-05-01

    Monitoring the progress of a building's construction is critical in construction management. However, measuring building construction progress is still manual, time-consuming, and error-prone, and imposes a tedious process of analysis, leading to delays, additional costs, and effort. The main goal of this research is to develop a methodology for building construction progress monitoring based on a 3D as-built model of the building from unmanned aerial system (UAS) images, a 4D as-planned model (with the construction schedule integrated), and GIS analysis. Monitoring was done by capturing videos of the building with a camera-equipped UAS. Still images were extracted, filtered, and bundle-adjusted, and a 3D as-built model was generated using open source photogrammetric software. The as-planned model was generated from digitized CAD drawings using GIS. The 3D as-built model was aligned with the 4D as-planned model of the building, formed from extrusion of building elements and integration of the construction's planned schedule. The construction progress is visualized via color-coding the building elements in the 3D model. The developed methodology was applied to data obtained from an actual construction site. Accuracy in detecting `built' or `not built' building elements ranges from 82-84 % with precision of 50-72 %. Quantified progress in terms of the number of building elements is 21.31 % (November 2016), 26.84 % (January 2017) and 44.19 % (March 2017). The results can be used as an input for monitoring the progress performance of construction projects and improving the related decision-making process.
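The accuracy and precision figures reported above can be computed from a confusion matrix of detected building elements; the counts below are hypothetical, chosen merely to fall within the reported ranges.

```python
def detection_metrics(tp, fp, tn, fn):
    """Accuracy and precision for 'built' / 'not built' element detection:
    accuracy over all elements, precision over elements flagged 'built'."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    return accuracy, precision

# Hypothetical confusion counts for one survey epoch (not the study's data).
acc, prec = detection_metrics(tp=36, fp=14, tn=48, fn=2)
print(acc, prec)  # 0.84 0.72
```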

  10. [Establishment of endometriosis subcutaneous model in immunodeficient nude mice].

    PubMed

    Ni, H J; Zhang, Z; Dai, Y D; Zhang, S Y

    2016-09-06

Objective: To establish a model of endometriosis in immunodeficient nude mice and compare the outcome of model construction between two different techniques. Methods: Eighteen nude mice were divided into 2 groups, with 9 mice in each group. All nude mice received a subcutaneous transplantation of endometrial fragments, after which the wounded skin was either sutured (sutured group) or left unsutured (non-sutured group). The success rate of model construction, inflammation of the wounds, and the animal survival rate in the two groups were then analyzed. Results: In the non-sutured group, the animal survival rate and the success rate of model construction were 9/9 and 8/9, respectively, compared with an 8/9 survival rate and a 7/9 success rate in the sutured group. No significant difference was found between the two groups, and no obvious wound inflammation was present in either group. Conclusion: Subcutaneous transplantation is an effective method for establishing an animal model of endometriosis in nude mice. Whether or not the wounded skin is sutured after transplantation does not affect the animal survival rate or the success rate of model construction. Considering the shorter operation time, subcutaneous transplantation of endometrial fragments without skin suturing is a simpler and more time-saving method for establishing endometriosis in nude mice, and this model is worthy of promotion.

  11. Equivalent model construction for a non-linear dynamic system based on an element-wise stiffness evaluation procedure and reduced analysis of the equivalent system

    NASA Astrophysics Data System (ADS)

    Kim, Euiyoung; Cho, Maenghyo

    2017-11-01

    In most non-linear analyses, the construction of a system matrix uses a large amount of computation time, comparable to the computation time required by the solving process. If the process for computing non-linear internal force matrices is substituted with an effective equivalent model that enables the bypass of numerical integrations and assembly processes used in matrix construction, efficiency can be greatly enhanced. A stiffness evaluation procedure (STEP) establishes non-linear internal force models using polynomial formulations of displacements. To efficiently identify an equivalent model, the method has evolved such that it is based on a reduced-order system. The reduction process, however, makes the equivalent model difficult to parameterize, which significantly affects the efficiency of the optimization process. In this paper, therefore, a new STEP, E-STEP, is proposed. Based on the element-wise nature of the finite element model, the stiffness evaluation is carried out element-by-element in the full domain. Since the unit of computation for the stiffness evaluation is restricted by element size, and since the computation is independent, the equivalent model can be constructed efficiently in parallel, even in the full domain. Due to the element-wise nature of the construction procedure, the equivalent E-STEP model is easily characterized by design parameters. Various reduced-order modeling techniques can be applied to the equivalent system in a manner similar to how they are applied in the original system. The reduced-order model based on E-STEP is successfully demonstrated for the dynamic analyses of non-linear structural finite element systems under varying design parameters.
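The stiffness-evaluation idea above can be illustrated with a minimal sketch: identify polynomial stiffness coefficients of a nonlinear internal force by least squares from sampled displacement/force pairs. The cubic force law and coefficient values below are invented for illustration and are not the E-STEP formulation itself:

```python
import numpy as np

# Illustrative sketch of stiffness identification by least squares:
# recover coefficients of a cubic internal force f(u) = k1*u + k3*u**3
# from prescribed displacement/force samples. Coefficient values are
# invented; this is not the E-STEP formulation itself.

k1_true, k3_true = 2.0, 0.5
u = np.linspace(-1.0, 1.0, 21)        # prescribed displacement samples
f = k1_true * u + k3_true * u**3      # corresponding internal forces

A = np.column_stack([u, u**3])        # polynomial basis evaluated at samples
k1, k3 = np.linalg.lstsq(A, f, rcond=None)[0]
```

In a full E-STEP-style workflow this fit would be repeated element by element, which is what makes the construction parallelizable and parameterizable by design variables.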

  12. Self-consistent construction of virialized wave dark matter halos

    NASA Astrophysics Data System (ADS)

    Lin, Shan-Chang; Schive, Hsi-Yu; Wong, Shing-Kwong; Chiueh, Tzihong

    2018-05-01

Wave dark matter (ψDM), which satisfies the Schrödinger-Poisson equation, has recently attracted substantial attention as a possible dark matter candidate. Numerical simulations have, in the past, provided a powerful tool to explore this new territory of possibility. Despite their successes in revealing several key features of ψDM, further progress in simulations is limited, in that cosmological simulations so far can only address formation of halos below ~2 × 10^11 M⊙, and substantially more massive halos have become computationally very challenging to obtain. For this reason, the present work adopts a different approach in assessing massive halos by constructing wave-halo solutions directly from the wave distribution function. This approach bears certain similarities with the analytical construction of the particle halo in the cold dark matter model. Instead of many collisionless particles, one deals with a single wave that has many noninteracting eigenstates. The key ingredient in the wave-halo construction is the distribution function of the wave power, and we use several halos produced by structure formation simulations as templates to determine the wave distribution function. Among different models, we find the fermionic King model presents the best fits and we use it for our wave-halo construction. We have devised an iteration method for constructing the nonlinear halo and demonstrate its stability by three-dimensional simulations. A Milky Way-sized halo has also been constructed, and the inner halo is found to be flatter than the NFW profile. These wave-halos have small-scale interferences both in space and time, producing time-dependent granules. While the spatial scale of granules varies little, the correlation time is found to increase with radius by 1 order of magnitude across the halo.

  13. Finite Element Analysis and Biomechanical Comparison of Short Posterior Spinal Instrumentation with Divergent Bridge Construct versus Parallel Tension Band Construct for Thoracolumbar Spine Fractures

    PubMed Central

    Ouellet, Jean A.; Richards, Corey; Sardar, Zeeshan M.; Giannitsios, Demetri; Noiseux, Nicholas; Strydom, Willem S.; Reindl, Rudy; Jarzem, Peter; Arlet, Vincent; Steffen, Thomas

    2013-01-01

The ideal treatment for unstable thoracolumbar fractures remains controversial, with posterior reduction and stabilization, anterior reduction and stabilization, combined posterior and anterior reduction and stabilization, and even nonoperative management advocated. Short segment posterior osteosynthesis of these fractures has fewer comorbidities compared with the other operative approaches but settles into kyphosis over time. Biomechanical comparison of the divergent bridge construct versus the parallel tension band construct was performed for anteriorly destabilized T11–L1 spine segments using three different models: (1) finite element analysis (FEA), (2) a synthetic model, and (3) a human cadaveric model. Outcomes measured were construct stiffness and ultimate failure load. Our objective was to determine if the divergent pedicle screw bridge construct would provide more resistance to kyphotic deforming forces. All three modalities showed greater stiffness with the divergent bridge construct. The FEA calculated a stiffness of 21.6 N/m for the tension band construct versus 34.1 N/m for the divergent bridge construct. The synthetic model resulted in a mean stiffness of 17.3 N/m for parallel tension band versus 20.6 N/m for the divergent bridge (p = 0.03), whereas the cadaveric model had an average stiffness of 15.2 N/m in the parallel tension band compared with 18.4 N/m for the divergent bridge (p = 0.02). Ultimate failure load with the cadaveric model was found to be 622 N for the divergent bridge construct versus 419 N (p = 0.15) for the parallel tension band construct. This study confirms our clinical experience that the short posterior divergent bridge construct provides greater stiffness for the management of unstable thoracolumbar fractures. PMID:24436856

  14. Trivariate Modeling of Interparental Conflict and Adolescent Emotional Security: An Examination of Mother-Father-Child Dynamics.

    PubMed

    Cheung, Rebecca Y M; Cummings, E Mark; Zhang, Zhiyong; Davies, Patrick T

    2016-11-01

Recognizing the significance of interacting family subsystems, the present study addresses how interparental conflict is linked to adolescent emotional security as a function of parental gender. A total of 272 families with a child at 12.60 years of age (133 boys, 139 girls) were invited to participate each year for three consecutive years. A multi-informant method was used, along with trivariate models to test the associations among mothers, fathers, and their adolescent children's behaviors. The findings from separate models of destructive and constructive interparental conflict revealed intricate linkages among family members. In the model of destructive interparental conflict, mothers and fathers predicted each other's conflict behaviors over time. Moreover, adolescents' exposure to negativity expressed by either parent dampened their emotional security. Consistent with child effects models, adolescent emotional insecurity predicted fathers' destructive conflict behaviors. As for the model of constructive interparental conflict, fathers predicted mothers' conflict behaviors over time. Adolescents' exposure to fathers' constructive conflict behaviors also enhanced their sense of emotional security. Consistent with child effects models, adolescent emotional security predicted mothers' and fathers' constructive conflict behaviors. These findings extended the family and the adolescent literature by indicating that family processes are multidirectional, involving multiple dyads in the study of parents' and adolescents' functioning. Contributions of these findings to the understanding of interparental conflict and emotional security in adolescence are discussed.

  15. A NEW METHOD OF LONGITUDINAL DIARY ASSEMBLY FOR EXPOSURE MODELING

    EPA Science Inventory

    Many stochastic human exposure models require the construction of longitudinal time-activity diaries to evaluate the time sequence of concentrations encountered, and hence, the pollutant exposure for the simulated individuals. However, most of the available data on human activiti...

  16. A higher-order Skyrme model

    NASA Astrophysics Data System (ADS)

    Gudnason, Sven Bjarke; Nitta, Muneto

    2017-09-01

We propose a higher-order Skyrme model with derivative terms of eighth, tenth and twelfth order. Our construction yields simple and easy-to-interpret higher-order Lagrangians. We first show that a Skyrmion with higher-order terms proposed by Marleau has an instability in the form of a baby-Skyrmion string, while the static energies of our construction are positive definite, implying stability against time-independent perturbations. However, we also find that the Hamiltonians of our construction possess two kinds of dynamical instabilities, which may indicate instability with respect to time-dependent perturbations. Different from the well-known Ostrogradsky instability, the instabilities that we find are intrinsically of a nonlinear nature and are also due to the fact that even powers of the inverse metric give a ghost-like higher-order kinetic-like term. The vacuum state is, however, stable. Finally, we show that at sufficiently low energies our Hamiltonians, in the simplest cases, are stable against time-dependent perturbations.

  17. An algebraic method for constructing stable and consistent autoregressive filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, University Park, PA 16802; Hong, Hoon, E-mail: hong@ncsu.edu

    2015-02-15

In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set, as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
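The classical AR stability condition the abstract refers to can be checked numerically: an AR(p) model x_t = a1·x_{t-1} + … + ap·x_{t-p} + noise is stable when all roots of its characteristic polynomial z^p − a1·z^{p-1} − … − ap lie strictly inside the unit circle. A minimal sketch, with illustrative coefficients rather than the paper's fitted models:

```python
import numpy as np

# Classical AR(p) stability check: all roots of the characteristic
# polynomial z**p - a1*z**(p-1) - ... - ap must lie inside the unit
# circle. The coefficients below are illustrative.

def is_stable_ar(coeffs):
    p_coeffs = np.concatenate([[1.0], -np.asarray(coeffs, dtype=float)])
    return bool(np.all(np.abs(np.roots(p_coeffs)) < 1.0))

stable = is_stable_ar([0.5, 0.3])      # both roots inside the unit circle
unstable = is_stable_ar([1.2, 0.3])    # explosive: one root outside
```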

  18. Predicting Condom Use Using the Information-Motivation-Behavioral Skills (IMB) Model: A Multivariate Latent Growth Curve Analysis

    PubMed Central

    Senn, Theresa E.; Scott-Sheldon, Lori A. J.; Vanable, Peter A.; Carey, Michael P.

    2011-01-01

    Background The Information-Motivation-Behavioral Skills (IMB) model often guides sexual risk reduction programs even though no studies have examined covariation in the theory’s constructs in a dynamic fashion with longitudinal data. Purpose Using new developments in latent growth modeling, we explore how changes in information, motivation, and behavioral skills over 9 months relate to changes in condom use among STD clinic patients. Methods Participants (N = 1281, 50% female, 66% African American) completed measures of IMB constructs at three time points. We used parallel process latent growth modeling to examine associations among intercepts and slopes of IMB constructs. Results Initial levels of motivation, behavioral skills, and condom use were all positively associated, with behavioral skills partially mediating associations between motivation and condom use. Changes over time in behavioral skills positively related to changes in condom use. Conclusions Results support the key role of behavioral skills in sexual risk reduction, suggesting these skills should be targeted in HIV prevention interventions. PMID:21638196

  19. Future climate data from RCP 4.5 and occurrence of malaria in Korea.

    PubMed

    Kwak, Jaewon; Noh, Huiseong; Kim, Soojun; Singh, Vijay P; Hong, Seung Jin; Kim, Duckgil; Lee, Keonhaeng; Kang, Narae; Kim, Hung Soo

    2014-10-15

    Since its reappearance at the Military Demarcation Line in 1993, malaria has been occurring annually in Korea. Malaria is regarded as a third grade nationally notifiable disease susceptible to climate change. The objective of this study is to quantify the effect of climatic factors on the occurrence of malaria in Korea and construct a malaria occurrence model for predicting the future trend of malaria under the influence of climate change. Using data from 2001-2011, the effect of time lag between malaria occurrence and mean temperature, relative humidity and total precipitation was investigated using spectral analysis. Also, a principal component regression model was constructed, considering multicollinearity. Future climate data, generated from RCP 4.5 climate change scenario and CNCM3 climate model, was applied to the constructed regression model to simulate future malaria occurrence and analyze the trend of occurrence. Results show an increase in the occurrence of malaria and the shortening of annual time of occurrence in the future.

  20. Future Climate Data from RCP 4.5 and Occurrence of Malaria in Korea

    PubMed Central

    Kwak, Jaewon; Noh, Huiseong; Kim, Soojun; Singh, Vijay P.; Hong, Seung Jin; Kim, Duckgil; Lee, Keonhaeng; Kang, Narae; Kim, Hung Soo

    2014-01-01

    Since its reappearance at the Military Demarcation Line in 1993, malaria has been occurring annually in Korea. Malaria is regarded as a third grade nationally notifiable disease susceptible to climate change. The objective of this study is to quantify the effect of climatic factors on the occurrence of malaria in Korea and construct a malaria occurrence model for predicting the future trend of malaria under the influence of climate change. Using data from 2001–2011, the effect of time lag between malaria occurrence and mean temperature, relative humidity and total precipitation was investigated using spectral analysis. Also, a principal component regression model was constructed, considering multicollinearity. Future climate data, generated from RCP 4.5 climate change scenario and CNCM3 climate model, was applied to the constructed regression model to simulate future malaria occurrence and analyze the trend of occurrence. Results show an increase in the occurrence of malaria and the shortening of annual time of occurrence in the future. PMID:25321875
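Principal component regression, the technique the study uses to handle multicollinearity among climate predictors, can be sketched as follows. The synthetic predictors below merely stand in for temperature, humidity, and precipitation; nothing here reproduces the paper's fitted model:

```python
import numpy as np

# Hedged sketch of principal component regression (PCR) under
# multicollinearity: regress on leading principal-component scores
# instead of the raw, nearly collinear predictors.

rng = np.random.default_rng(1)
n = 120
t = rng.normal(size=n)
X = np.column_stack([t,
                     t + 0.01 * rng.normal(size=n),   # nearly collinear with t
                     rng.normal(size=n)])
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=n)

Xc = X - X.mean(axis=0)                    # center predictors
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2                                      # drop the near-degenerate component
scores = Xc @ Vt[:k].T                     # principal component scores
D = np.column_stack([np.ones(n), scores])  # intercept + scores
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
y_hat = D @ beta
```

Dropping the smallest-variance component removes the ill-conditioned direction that makes ordinary least squares unstable under multicollinearity.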

  1. Vel-IO 3D: A tool for 3D velocity model construction, optimization and time-depth conversion in 3D geological modeling workflow

    NASA Astrophysics Data System (ADS)

    Maesano, Francesco E.; D'Ambrogi, Chiara

    2017-02-01

We present Vel-IO 3D, a tool for 3D velocity model creation and time-depth conversion, as part of a workflow for 3D model building. The workflow addresses the management of large subsurface datasets, mainly seismic lines and well logs, and the construction of a 3D velocity model able to describe the variation of the velocity parameters related to strong facies and thickness variability and to high structural complexity. Although it is applicable in many geological contexts (e.g. foreland basins, large intermountain basins), it is particularly suitable in wide flat regions, where subsurface structures have no surface expression. The Vel-IO 3D tool is composed of three scripts, written in Python 2.7.11, that automate i) the 3D instantaneous velocity model building, ii) the velocity model optimization, and iii) the time-depth conversion. They determine a 3D geological model that is consistent with the primary geological constraints (e.g. depth of the markers on wells). The proposed workflow and the Vel-IO 3D tool were tested, during the EU-funded project GeoMol, by the construction of the 3D geological model of a flat region, 5700 km² in area, located in the central part of the Po Plain. The final 3D model showed the efficiency of the workflow and the Vel-IO 3D tool in the management of large amounts of data in both the time and depth domains. A 4-layer-cake velocity model was applied to a succession several thousand metres thick (5000-13,000 m), with 15 horizons from the Triassic up to the Pleistocene, complicated by Mesozoic extensional tectonics and by buried thrusts related to the Southern Alps and Northern Apennines.
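Time-depth conversion with an instantaneous velocity model can be sketched with a linear velocity law v(z) = v0 + k·z, a common parameterization in tools of this kind; the v0 and k values below are invented, not GeoMol calibrations:

```python
import math

# Illustrative time-depth conversion for a linear instantaneous
# velocity law v(z) = v0 + k*z. For one-way time t = twt/2, integrating
# dz/dt = v0 + k*z gives z(t) = (v0/k) * (exp(k*t) - 1).
# v0 (m/s) and k (1/s) below are invented example values.

def twt_to_depth(twt_s, v0, k):
    """Convert two-way travel time (s) to depth (m)."""
    if k == 0.0:
        return v0 * twt_s / 2.0           # constant-velocity special case
    return (v0 / k) * (math.exp(k * twt_s / 2.0) - 1.0)

depth = twt_to_depth(2.0, v0=1800.0, k=0.3)   # roughly 2.1 km
```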

  2. A simplified method for power-law modelling of metabolic pathways from time-course data and steady-state flux profiles.

    PubMed

    Kitayama, Tomoya; Kinoshita, Ayako; Sugimoto, Masahiro; Nakayama, Yoichi; Tomita, Masaru

    2006-07-17

    In order to improve understanding of metabolic systems there have been attempts to construct S-system models from time courses. Conventionally, non-linear curve-fitting algorithms have been used for modelling, because of the non-linear properties of parameter estimation from time series. However, the huge iterative calculations required have hindered the development of large-scale metabolic pathway models. To solve this problem we propose a novel method involving power-law modelling of metabolic pathways from the Jacobian of the targeted system and the steady-state flux profiles by linearization of S-systems. The results of two case studies modelling a straight and a branched pathway, respectively, showed that our method reduced the number of unknown parameters needing to be estimated. The time-courses simulated by conventional kinetic models and those described by our method behaved similarly under a wide range of perturbations of metabolite concentrations. The proposed method reduces calculation complexity and facilitates the construction of large-scale S-system models of metabolic pathways, realizing a practical application of reverse engineering of dynamic simulation models from the Jacobian of the targeted system and steady-state flux profiles.
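The S-system rate law underlying the method has the form dX_i/dt = α_i ∏_j X_j^{g_ij} − β_i ∏_j X_j^{h_ij}. A minimal simulation sketch for a two-metabolite straight pathway, with illustrative parameters rather than the paper's:

```python
import numpy as np

# Minimal S-system sketch for a straight pathway: X1 is produced at a
# constant rate and converted to X2, which is then degraded. The flux
# degrading X1 equals the flux producing X2. Parameters are illustrative.

def s_system_step(x, alpha, g, beta, h, dt):
    prod_g = np.prod(x ** g, axis=1)   # production power-law terms
    prod_h = np.prod(x ** h, axis=1)   # degradation power-law terms
    return x + dt * (alpha * prod_g - beta * prod_h)

alpha = np.array([2.0, 1.0])
beta = np.array([1.0, 1.0])
g = np.array([[0.0, 0.0],    # X1 production is constant
              [0.5, 0.0]])   # X2 production depends on X1
h = np.array([[0.5, 0.0],    # X1 degradation (conversion to X2)
              [0.0, 0.5]])   # X2 degradation
x = np.array([1.0, 1.0])
for _ in range(30000):       # forward-Euler integration to steady state
    x = s_system_step(x, alpha, g, beta, h, 1e-3)
```

At steady state the production and degradation terms balance, giving X1 = X2 = 4 for these parameters; the paper's linearization exploits exactly this structure around steady-state flux profiles.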

  3. Simple Deterministically Constructed Recurrent Neural Networks

    NASA Astrophysics Data System (ADS)

    Rodan, Ali; Tiňo, Peter

A large number of models for time series processing, forecasting or modeling follow a state-space formulation. Models in the specific class of state-space approaches, referred to as Reservoir Computing, fix their state-transition function. The state space with the associated state-transition structure forms a reservoir, which is supposed to be sufficiently complex so as to capture a large number of features of the input stream that can be potentially exploited by the reservoir-to-output readout mapping. The largely "black box" character of reservoirs prevents us from performing a deeper theoretical investigation of the dynamical properties of successful reservoirs. Reservoir construction is largely driven by a series of (more-or-less) ad-hoc randomized model-building stages, with both researchers and practitioners having to rely on a series of trials and errors. We show that a very simple deterministically constructed reservoir with simple cycle topology gives performances comparable to those of the Echo State Network (ESN) on a number of time series benchmarks. Moreover, we argue that the memory capacity of such a model can be made arbitrarily close to the proved theoretical limit.

  4. Engineering fibrin-based tissue constructs from myofibroblasts and application of constraints and strain to induce cell and collagen reorganization.

    PubMed

    de Jonge, Nicky; Baaijens, Frank P T; Bouten, Carlijn V C

    2013-10-28

    Collagen content and organization in developing collagenous tissues can be influenced by local tissue strains and tissue constraint. Tissue engineers aim to use these principles to create tissues with predefined collagen architectures. A full understanding of the exact underlying processes of collagen remodeling to control the final tissue architecture, however, is lacking. In particular, little is known about the (re)orientation of collagen fibers in response to changes in tissue mechanical loading conditions. We developed an in vitro model system, consisting of biaxially-constrained myofibroblast-seeded fibrin constructs, to further elucidate collagen (re)orientation in response to i) reverting biaxial to uniaxial static loading conditions and ii) cyclic uniaxial loading of the biaxially-constrained constructs before and after a change in loading direction, with use of the Flexcell FX4000T loading device. Time-lapse confocal imaging is used to visualize collagen (re)orientation in a nondestructive manner. Cell and collagen organization in the constructs can be visualized in real-time, and an internal reference system allows us to relocate cells and collagen structures for time-lapse analysis. Various aspects of the model system can be adjusted, like cell source or use of healthy and diseased cells. Additives can be used to further elucidate mechanisms underlying collagen remodeling, by for example adding MMPs or blocking integrins. Shape and size of the construct can be easily adapted to specific needs, resulting in a highly tunable model system to study cell and collagen (re)organization.

  5. 3-compartment talaporfin sodium pharmacokinetic model by optimization using fluorescence measurement data from canine skin to estimate the concentration in interstitial space

    NASA Astrophysics Data System (ADS)

    Uno, Yuko; Ogawa, Emiyu; Aiyoshi, Eitaro; Arai, Tsunenori

    2018-02-01

We constructed a 3-compartment talaporfin sodium pharmacokinetic model for canines by optimization, using fluorescence measurement data from canine skin to estimate the concentration in the interstitial space. It is difficult to construct a 3-compartment model consisting of plasma, interstitial space, and cells because dynamic information is lacking. Therefore, we proposed a methodology to construct the 3-compartment model using the measured change in talaporfin sodium skin fluorescence, with the originating tissue compartments determined by histological observation. In a canine animal experiment, the time history of the talaporfin sodium concentration in plasma was measured by a spectrophotometer with a prepared calibration curve. The time history of talaporfin sodium Q-band fluorescence on the left femoral skin of a beagle dog, excited at the talaporfin sodium Soret band of 409 nm, was measured in vivo by our previously constructed measurement system. The measured skin fluorescence was attributed to its sources, that is, specific ratios of plasma, interstitial space, and cells. We formulated differential rate equations for the talaporfin sodium concentration in plasma, interstitial space, and cells, together with the specific ratios and a conversion constant to obtain the absolute skin concentration. Minimizing the squared error between the measured fluorescence data and the calculated concentration with the conjugate gradient method in MATLAB, we determined the rate constants of the 3-compartment model. The accuracy of the fitting was confirmed by a determination coefficient of 0.98. We could thus construct the 3-compartment pharmacokinetic model for canines using the measured change in talaporfin sodium fluorescence from canine skin.
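A generic 3-compartment structure of the kind described (plasma, interstitial space, cell, with first-order exchange and plasma elimination) can be sketched as follows; the rate constants are invented placeholders, not the fitted canine values:

```python
# Hedged sketch of a 3-compartment (plasma / interstitial space / cell)
# kinetic model with first-order exchange and plasma elimination,
# integrated by forward Euler. Rate constants are invented placeholders.

def step(c, k, dt):
    cp, ci, cc = c                       # plasma, interstitial, cell
    k_pi, k_ip, k_ic, k_ci, k_el = k
    dcp = -k_pi * cp + k_ip * ci - k_el * cp
    dci = k_pi * cp - k_ip * ci - k_ic * ci + k_ci * cc
    dcc = k_ic * ci - k_ci * cc
    return (cp + dt * dcp, ci + dt * dci, cc + dt * dcc)

c = (1.0, 0.0, 0.0)                      # normalized bolus dose into plasma
k = (0.5, 0.2, 0.3, 0.05, 0.1)           # exchange and elimination rates
for _ in range(10000):                   # t = 0 .. 100 in steps of 0.01
    c = step(c, k, 0.01)
```

In the paper's setting, the rate constants of equations like these are the unknowns fitted to the measured fluorescence time history.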

  6. Perfectionism, procrastination, and psychological distress.

    PubMed

    Rice, Kenneth G; Richardson, Clarissa M E; Clark, Dustin

    2012-04-01

    Using a cross-panel design and data from 2 successive cohorts of college students (N = 357), we examined the stability of maladaptive perfectionism, procrastination, and psychological distress across 3 time points within a college semester. Each construct was substantially stable over time, with procrastination being especially stable. We also tested, but failed to support, a mediational model with Time 2 (mid-semester) procrastination as a hypothesized mechanism through which Time 1 (early-semester) perfectionism would affect Time 3 (end-semester) psychological distress. An alternative model with Time 2 perfectionism as a mediator of the procrastination-distress association also was not supported. Within-time analyses revealed generally consistent strength of effects in the correlations between the 3 constructs over the course of the semester. A significant interaction effect also emerged. Time 1 procrastination had no effect on otherwise high levels of psychological distress at the end of the semester for highly perfectionistic students, but at low levels of Time 1 perfectionism, the most distressed students by the end of the term were those who were more likely to have procrastinated earlier in the semester. Implications of the stability of the constructs and their association over time, as well as the moderating effects of procrastination, are discussed in the context of maladaptive perfectionism and problematic procrastination.

  7. Meta-Modeling: A Knowledge-Based Approach to Facilitating Model Construction and Reuse

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Dungan, Jennifer L.

    1997-01-01

In this paper, we introduce a new modeling approach called meta-modeling and illustrate its practical applicability to the construction of physically-based ecosystem process models. As a critical adjunct to modeling codes, meta-modeling requires explicit specification of certain background information related to the construction and conceptual underpinnings of a model. This information formalizes the heretofore tacit relationship between the mathematical modeling code and the underlying real-world phenomena being investigated, and gives insight into the process by which the model was constructed. We show how the explicit availability of such information can make models more understandable and reusable and less subject to misinterpretation. In particular, background information enables potential users to better interpret an implemented ecosystem model without direct assistance from the model author. Additionally, we show how the discipline involved in specifying background information leads to improved management of model complexity and fewer implementation errors. We illustrate the meta-modeling approach in the context of the Scientists' Intelligent Graphical Modeling Assistant (SIGMA), a new model construction environment. As the user constructs a model using SIGMA, the system adds appropriate background information that ties the executable model to the underlying physical phenomena under investigation. Not only does this information improve the understandability of the final model, it also serves to reduce the overall time and programming expertise necessary to initially build and subsequently modify models. Furthermore, SIGMA's use of background knowledge helps eliminate coding errors resulting from scientific and dimensional inconsistencies that are otherwise difficult to avoid when building complex models. As a demonstration of SIGMA's utility, the system was used to reimplement and extend a well-known forest ecosystem dynamics model: Forest-BGC.

  8. Principal Component Analysis in Construction of 3D Human Knee Joint Models Using a Statistical Shape Model Method

    PubMed Central

    Tsai, Tsung-Yuan; Li, Jing-Sheng; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Li, Guoan

    2013-01-01

    The statistical shape model (SSM) method that uses 2D images of the knee joint to predict the 3D joint surface model has been reported in literature. In this study, we constructed a SSM database using 152 human CT knee joint models, including the femur, tibia and patella and analyzed the characteristics of each principal component of the SSM. The surface models of two in vivo knees were predicted using the SSM and their 2D bi-plane fluoroscopic images. The predicted models were compared to their CT joint models. The differences between the predicted 3D knee joint surfaces and the CT image-based surfaces were 0.30 ± 0.81 mm, 0.34 ± 0.79 mm and 0.36 ± 0.59 mm for the femur, tibia and patella, respectively (average ± standard deviation). The computational time for each bone of the knee joint was within 30 seconds using a personal computer. The analysis of this study indicated that the SSM method could be a useful tool to construct 3D surface models of the knee with sub-millimeter accuracy in real time. Thus it may have a broad application in computer assisted knee surgeries that require 3D surface models of the knee. PMID:24156375
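The SSM idea (PCA over a training set of shape vectors, then reconstruction of a new shape from a few principal modes) can be sketched on toy data; the random point sets below stand in for knee surfaces and do not reproduce the CT database:

```python
import numpy as np

# Toy sketch of a statistical shape model: PCA over training shape
# vectors, then reconstruction of a new shape from the top principal
# modes. Shapes here are synthetic vectors, not knee surfaces.

rng = np.random.default_rng(0)
n_shapes, n_points = 30, 40
mean_shape = rng.normal(size=n_points)
modes = rng.normal(size=(2, n_points))       # two underlying variation modes
b = rng.normal(size=(n_shapes, 2))           # per-shape mode weights
shapes = mean_shape + b @ modes              # training set (30 x 40)

mu = shapes.mean(axis=0)
U, s, Vt = np.linalg.svd(shapes - mu, full_matrices=False)
P = Vt[:2]                                   # top-2 principal modes

new_shape = mean_shape + np.array([1.0, -0.5]) @ modes
coeffs = P @ (new_shape - mu)                # project into the model space
recon = mu + coeffs @ P
err = float(np.abs(recon - new_shape).max())
```

In the 2D-to-3D prediction setting, the mode coefficients would instead be optimized so that the projected 3D model matches the bi-plane fluoroscopic silhouettes.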

  9. USA National Phenology Network’s volunteer-contributed observations yield predictive models of phenological transitions

    PubMed Central

    Crimmins, Michael A.; Gerst, Katharine L.; Rosemartin, Alyssa H.; Weltzin, Jake F.

    2017-01-01

    Purpose In support of science and society, the USA National Phenology Network (USA-NPN) maintains a rapidly growing, continental-scale, species-rich dataset of plant and animal phenology observations that with over 10 million records is the largest such database in the United States. The aim of this study was to explore the potential that exists in the broad and rich volunteer-collected dataset maintained by the USA-NPN for constructing models predicting the timing of phenological transition across species’ ranges within the continental United States. Contributed voluntarily by professional and citizen scientists, these opportunistically collected observations are characterized by spatial clustering, inconsistent spatial and temporal sampling, and short temporal depth (2009-present). Whether data exhibiting such limitations can be used to develop predictive models appropriate for use across large geographic regions has not yet been explored. Methods We constructed predictive models for phenophases that are the most abundant in the database and also relevant to management applications for all species with available data, regardless of plant growth habit, location, geographic extent, or temporal depth of the observations. We implemented a very basic model formulation—thermal time models with a fixed start date. Results Sufficient data were available to construct 107 individual species × phenophase models. Remarkably, given the limited temporal depth of this dataset and the simple modeling approach used, fifteen of these models (14%) met our criteria for model fit and error. The majority of these models represented the “breaking leaf buds” and “leaves” phenophases and represented shrub or tree growth forms. Accumulated growing degree day (GDD) thresholds that emerged ranged from 454 GDDs (Amelanchier canadensis-breaking leaf buds) to 1,300 GDDs (Prunus serotina-open flowers). 
    Such candidate thermal time thresholds can be used to produce real-time and short-term forecast maps of the timing of these phenophase transitions. In addition, many of the candidate models that emerged were suitable for use across the majority of the species' geographic ranges. Real-time and forecast maps of phenophase transitions could support a wide range of natural resource management applications, including invasive plant management, issuing asthma and allergy alerts, and anticipating frost damage for crops in vulnerable states. Implications Our finding that several viable thermal time threshold models that work across the majority of species' ranges could be constructed from the USA-NPN database provides clear evidence that great potential exists in this dataset to develop enhanced predictive models for additional species and phenophases. Further, the candidate models that emerged have immediate utility for supporting a wide range of management applications. PMID:28829783
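
    The fixed-start-date thermal time model described above is simple enough to sketch in a few lines. The base temperature, the January 1 start date, and the hypothetical temperature series below are illustrative assumptions; only the 454-GDD threshold for Amelanchier canadensis (breaking leaf buds) comes from the abstract.

```python
# Sketch of a thermal time model with a fixed start date (assumed Jan 1)
# and an assumed base temperature of 0 degrees C; the 454-GDD threshold for
# Amelanchier canadensis (breaking leaf buds) is the value reported above.

def accumulated_gdd(daily_mean_temps, base_temp=0.0):
    """Running sum of growing degree days from the fixed start date."""
    total, series = 0.0, []
    for t in daily_mean_temps:
        total += max(t - base_temp, 0.0)  # days below base contribute nothing
        series.append(total)
    return series

def predicted_onset_day(daily_mean_temps, threshold, base_temp=0.0):
    """First day of year on which the GDD threshold is reached, else None."""
    for day, gdd in enumerate(accumulated_gdd(daily_mean_temps, base_temp), start=1):
        if gdd >= threshold:
            return day
    return None

# Hypothetical warming spring: mean temperature rises 0.1 degrees C per day.
temps = [0.1 * d for d in range(365)]
onset = predicted_onset_day(temps, threshold=454.0)
```

    A gridded temperature forecast fed through `predicted_onset_day` cell by cell would yield exactly the kind of short-term forecast map the abstract describes.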

  10. Building Information Modelling (BIM) and Unmanned Aerial Vehicle (UAV) technologies in infrastructure construction project management and delay and disruption analysis

    NASA Astrophysics Data System (ADS)

    Vacanas, Yiannis; Themistocleous, Kyriacos; Agapiou, Athos; Hadjimitsis, Diofantos

    2015-06-01

    Time in infrastructure construction projects has always been a fundamental issue, from the inception of a project, through the construction process, and often after completion and delivery. In a typical construction contract, time-related matters such as the completion date and possible delays are among the most important issues dealt with by the contract provisions. In the event of delay, there are usually provisions for an extension-of-time award to the contractor, with possible reimbursement for the extra cost and expenses caused by this extension of the contract duration. Where the contractor is not entitled to an extension of time, the owner may be entitled to compensation for the period during which he was prevented from using his development. Even when a project is completed within the time agreed, under certain circumstances a contractor may have claims for reimbursement of extra costs incurred through acceleration measures he had to take to mitigate disruption caused to the progress of the works by the owner or his representatives. Depending on the size of the project and the agreed amount, these reimbursement sums may be extremely high. Innovative methods that exploit new technologies are therefore essential for effective project management, for the avoidance of delays, and for delay analysis and mitigation measures; moreover, methods for efficiently collecting information during the construction process are required so that disputes regarding time are avoided or resolved in a quick and fair manner. This paper explores the state of the art in the use of Building Information Modelling (BIM) and Unmanned Aerial Vehicle (UAV) technologies in the construction industry in general.
    Moreover, the paper considers the prospect of using BIM technology in conjunction with UAV technology for efficient and accurate as-built data collection and illustration of work progress during an infrastructure construction project, in order to achieve more effective project management, record keeping, and delay analysis.

  11. MODELING HOW A HURRICANE BARRIER IN NEW BEDFORD HARBOR, MASSACHUSETTS, AFFECTS THE HYDRODYNAMICS AND RESIDENCE TIMES

    EPA Science Inventory

    Two-dimensional hydrodynamic and transport models were used to simulate tidal and subtidal circulation, residence times, and the longitudinal distributions of conservative constituents in New Bedford Harbor, Massachusetts, before and after a hurricane barrier was constructed. The...

  12. Study of Collaborative Management for Transportation Construction Project Based on BIM Technology

    NASA Astrophysics Data System (ADS)

    Jianhua, Liu; Genchuan, Luo; Daiquan, Liu; Wenlei, Li; Bowen, Feng

    2018-03-01

    Building Information Modeling (BIM) is a building modeling technology based on the relevant information data of a construction project. It is an advanced technology and management concept, widely used across the whole life cycle of planning, design, construction, and operation. Based on BIM technology, collaborative management of a transportation construction project allows better communication through realistic simulation and architectural visualization, and provides basic, real-time information such as project schedule, engineering quality, cost, and environmental impact. The main services of highway construction management are integrated on a unified BIM platform for collaborative management, to realize information intercommunication and exchange, to end the isolation of information that prevailed in the past, and to improve the level of information management. The final BIM model is integrated not only for project information management and the integration of preliminary documents and design drawings, but also for the automatic generation of completion data and final accounts; it covers the whole life cycle of a traffic construction project and lays a good foundation for smart highway construction.

  13. On the mathematical analysis of Ebola hemorrhagic fever: deathly infection disease in West African countries.

    PubMed

    Atangana, Abdon; Goufo, Emile Franc Doungmo

    2014-01-01

    For a given West African country, we constructed a model describing the spread of the deadly disease called Ebola hemorrhagic fever. The model was first constructed using the classical derivative and then converted to a generalized version using the beta-derivative. We studied the endemic equilibrium points in detail and provided the associated eigenvalues using the Jacobian method. We furthered our investigation by solving the model numerically using an iteration method. The simulations were done in terms of time and beta. The study showed that, starting from a small proportion of infected individuals, the whole country could die out in a very short period of time if there is no effective prevention.
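
    The abstract does not reproduce the authors' beta-derivative equations, so as a hedged illustration of the classical-derivative starting point, the sketch below iterates a plain SIR-type compartmental model with forward-Euler steps; the compartment structure and all parameter values are assumptions, not the paper's model.

```python
# Minimal sketch (NOT the authors' beta-derivative formulation): a classical
# SIR-type model of epidemic spread solved with a forward-Euler iteration.
# Parameter values (beta_c, gamma) are illustrative assumptions only.

def simulate_sir(s0, i0, r0, beta_c=0.3, gamma=0.1, dt=0.1, steps=2000):
    s, i, r = s0, i0, r0
    n = s0 + i0 + r0
    for _ in range(steps):
        new_inf = beta_c * s * i / n      # transmission term
        ds = -new_inf
        di = new_inf - gamma * i          # removal (recovery or death) term
        dr = gamma * i
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
    return s, i, r

# One initial case in a population of 10,000: with these assumed rates the
# epidemic eventually reaches most of the population, echoing the abstract's
# point about a small infected fraction threatening the whole country.
s, i, r = simulate_sir(s0=9_999.0, i0=1.0, r0=0.0)
attack_fraction = r / 10_000.0
```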

  14. USE OF TRANS-CONTEXTUAL MODEL-BASED PHYSICAL ACTIVITY COURSE IN DEVELOPING LEISURE-TIME PHYSICAL ACTIVITY BEHAVIOR OF UNIVERSITY STUDENTS.

    PubMed

    Müftüler, Mine; İnce, Mustafa Levent

    2015-08-01

    This study examined how a physical activity course based on the Trans-Contextual Model affected the variables of perceived autonomy support, autonomous motivation, determinants of leisure-time physical activity behavior, basic psychological needs satisfaction, and leisure-time physical activity behaviors. The participants were 70 Turkish university students (M age=23.3 yr., SD=3.2). A pre-test-post-test control group design was used. Initially, the participants were randomly assigned into an experimental (n=35) and a control (n=35) group. The experimental group followed a 12 wk. Trans-Contextual Model-based intervention. The participants were pre- and post-tested in terms of Trans-Contextual Model constructs and of self-reported leisure-time physical activity behaviors. Multivariate analyses showed significant increases over the 12 wk. period in the experimental group for perceived autonomy support from instructor and peers, autonomous motivation in the leisure-time physical activity setting, positive intention and perceived behavioral control over leisure-time physical activity behavior, fulfillment of psychological needs, and engagement in leisure-time physical activity behavior. These results indicated that the intervention was effective in developing leisure-time physical activity and that the Trans-Contextual Model is a useful way to conceptualize these relationships.

  15. Mobile Learning Model and Process Optimization in the Era of Fragmentation

    ERIC Educational Resources Information Center

    Zhang, Shi-Jun; Yu, Gui-Hua

    2017-01-01

    In the context of the mobile Internet, college students' leisure time shows fragmentation characteristics; to improve the value of this time, it is of great practical significance to make full use of fragmented time to study effectively. This research focuses on the mobile learning model and its effect. Firstly, qualitative research is used to construct the…

  16. The choice of boundary conditions and mesh for scaffolding FEM model on the basis of natural vibrations measurements

    NASA Astrophysics Data System (ADS)

    Cyniak, Patrycja; Błazik-Borowa, Ewa; Szer, Jacek; Lipecki, Tomasz; Szer, Iwona

    2018-01-01

    Scaffolding is a specific type of structure with high susceptibility to low-frequency vibrations. The numerical model of scaffolding presented in this paper contains real imperfections obtained from geodetic measurements of the actual structure. Boundary conditions were verified on the basis of measured free vibrations. A simulation of a person walking on the penultimate working level, treated as a time-varying dynamic load, was performed on the verified model. The paper presents a procedure for choosing selected parameters of the scaffolding FEM model. The main aim of the analysis is the best possible projection of the real structure and correct modeling of a worker walking on the scaffolding. Different boundary conditions are considered because of their impact on the structure's vibrations. Natural vibrations obtained from FEM calculations are compared with free vibrations measured during in-situ tests. Structure accelerations caused by a walking human are then considered. The methodology for creating numerical models of scaffoldings and the analysis of dynamic effects during human walking are starting points for further considerations of dynamic loads acting on such structures and of the effects of these loads on the structure and on workers whose workplaces are situated on the scaffolding.

  17. Dynamic Factor Analysis of Nonstationary Multivariate Time Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; And Others

    1992-01-01

    The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)

  18. Degradation data analysis based on a generalized Wiener process subject to measurement error

    NASA Astrophysics Data System (ADS)

    Li, Junxing; Wang, Zhihua; Zhang, Yongbo; Fu, Huimin; Liu, Chengrui; Krishnaswamy, Sridhar

    2017-09-01

    Wiener processes have received considerable attention in degradation modeling over the last two decades. In this paper, we propose a generalized Wiener process degradation model that simultaneously takes unit-to-unit variation, time-correlated structure, and measurement error into consideration. The constructed methodology subsumes a series of models studied in the literature as limiting cases. A simple method is given to determine the transformed time scale forms of the Wiener process degradation model. Model parameters can then be estimated using maximum likelihood estimation (MLE). The cumulative distribution function (CDF) and the probability density function (PDF) of the Wiener process with measurement errors are given based on the concept of the first hitting time (FHT). The percentiles of performance degradation (PD) and the failure time distribution (FTD) are also obtained. Finally, a comprehensive simulation study demonstrates the necessity of incorporating measurement errors in the degradation model and the efficiency of the proposed model. Two illustrative real applications involving the degradation of carbon-film resistors and the wear of sliding metal are given. The comparative results show that the constructed approach derives reasonable results with enhanced inference precision.
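
    The ingredients named above (Wiener path, measurement error, unit-to-unit variation, first hitting time) can be illustrated with a small simulation. All parameter values, and the normal distribution assumed for the unit-to-unit drift variation, are illustrative; the paper's generalized time scale and MLE procedure are not reproduced.

```python
import random

# Sketch of a Wiener-process degradation path with additive measurement
# error: dX = mu*dt + sigma*dB, observed Y_k = X_k + eps_k. The first
# hitting time (FHT) of a failure threshold is then estimated empirically.

def simulate_degradation(mu=1.0, sigma=0.2, meas_sd=0.1, dt=0.01, t_max=10.0, seed=0):
    """Return the latent path xs and noisy observations ys on a regular grid."""
    rng = random.Random(seed)
    x = 0.0
    xs, ys = [x], [x + rng.gauss(0.0, meas_sd)]
    for _ in range(int(t_max / dt)):
        x += mu * dt + sigma * rng.gauss(0.0, dt ** 0.5)  # drift + diffusion
        xs.append(x)
        ys.append(x + rng.gauss(0.0, meas_sd))            # measurement error
    return xs, ys

def first_hitting_index(path, threshold):
    """Index of the first grid point where the latent path crosses the threshold."""
    for k, v in enumerate(path):
        if v >= threshold:
            return k
    return None

def empirical_fht(n_units=200, threshold=5.0, dt=0.01):
    """FHT sample over many units with assumed unit-to-unit drift variation."""
    rng = random.Random(1)
    hits = []
    for u in range(n_units):
        mu_u = rng.gauss(1.0, 0.1)          # unit-to-unit variation in drift
        xs, _ = simulate_degradation(mu=mu_u, seed=u + 2)
        k = first_hitting_index(xs, threshold)
        if k is not None:
            hits.append(k * dt)
    return hits

fht = empirical_fht()
mean_fht = sum(fht) / len(fht)   # close to threshold / mean drift = 5
```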

  19. A recipe for consistent 3D management of velocity data and time-depth conversion using Vel-IO 3D

    NASA Astrophysics Data System (ADS)

    Maesano, Francesco E.; D'Ambrogi, Chiara

    2017-04-01

    3D geological model production and related basin analyses require a large and consistent seismic dataset, and ideally well logs, to support correlation and calibration; the workflow and tools used to manage and integrate different types of data control the soundness of the final 3D model. Even though seismic interpretation is a basic early step in such a workflow, the most critical step in obtaining a comprehensive 3D model useful for further analyses is the construction of an effective 3D velocity model and a well-constrained time-depth conversion. We present a complex workflow that includes comprehensive management of a large seismic and velocity dataset, the construction of a 3D instantaneous multi-layer-cake velocity model, and the time-depth conversion of a highly heterogeneous geological framework including both depositional and structural complexities. The core of the workflow is the construction of the 3D velocity model using the Vel-IO 3D tool (Maesano and D'Ambrogi, 2017; https://github.com/framae80/Vel-IO3D), which is composed of the following three scripts, written in Python 2.7.11 under the ArcGIS ArcPy environment: i) the 3D instantaneous velocity model builder creates a preliminary 3D instantaneous velocity model using key horizons in the time domain and velocity data obtained from the analysis of well and pseudo-well logs. The script applies spatial interpolation to the velocity parameters and calculates the depth of each point on each horizon bounding the layer-cake velocity model. ii) the velocity model optimizer improves the consistency of the velocity model by adding new velocity data indirectly derived from measured depths, thus reducing the geometrical uncertainties in areas located far from the original velocity data.
    iii) the time-depth converter runs the time-depth conversion of any object located inside the 3D velocity model. The Vel-IO 3D tool allows one to create 3D geological models consistent with the primary geological constraints (e.g., depth of the markers on wells). The workflow and the Vel-IO 3D tool have been developed and tested for the construction of the 3D geological model of a flat region, 5700 km2 in area, located in the central part of the Po Plain (Northern Italy), in the frame of the European-funded project GeoMol. The study area was covered by a dense dataset of seismic lines (ca. 12000 km) and exploration wells (130 drillings), mainly deriving from oil and gas exploration activities. The interpretation of the seismic dataset led to the construction of a 3D model in the time domain that has been depth-converted using Vel-IO 3D with a 4-layer-cake 3D instantaneous velocity model. The resulting final 3D geological model, composed of 15 horizons and 150 faults, has been used for basin analysis at regional scale, for geothermal assessment, and for updating the seismotectonic knowledge of the Po Plain. Vel-IO 3D has been further used for the depth conversion of the accretionary prism of the Calabrian subduction (Southern Italy) and for a basin-scale analysis of the Plio-Pleistocene evolution of the Po Plain. Maesano F.E. and D'Ambrogi C. (2017), Computers and Geosciences, doi: 10.1016/j.cageo.2016.11.013. Vel-IO 3D is available at: https://github.com/framae80/Vel-IO3D
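
    Vel-IO 3D's internal formulation is not reproduced in the abstract; as a sketch of what the time-depth conversion inside one layer of an instantaneous-velocity model can look like, the snippet below uses the classical linear velocity function v(z) = v0 + k·z, for which the conversion has a closed form. The parameter values are illustrative assumptions.

```python
import math

# Sketch of the classical linear instantaneous-velocity function
# v(z) = v0 + k*z and its closed-form two-way-time/depth conversion.
# This is a textbook stand-in, not the formulation implemented in Vel-IO 3D.

def depth_from_twt(twt_s, v0, k):
    """Depth (m) from two-way time (s), surface velocity v0 (m/s), gradient k (1/s)."""
    if k == 0.0:
        return v0 * twt_s / 2.0          # constant-velocity limit
    return (v0 / k) * (math.exp(k * twt_s / 2.0) - 1.0)

def twt_from_depth(z, v0, k):
    """Inverse conversion, useful for checking consistency against well markers."""
    if k == 0.0:
        return 2.0 * z / v0
    return (2.0 / k) * math.log(1.0 + k * z / v0)

# A horizon picked at 2.0 s two-way time in a layer with assumed v0, k:
z = depth_from_twt(2.0, v0=1800.0, k=0.3)
```

    Checking that `twt_from_depth` inverts `depth_from_twt` mirrors the tool's consistency checks against primary geological constraints such as well markers.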

  20. Space logistics simulation: Launch-on-time

    NASA Technical Reports Server (NTRS)

    Nii, Kendall M.

    1990-01-01

    During 1989-1990 the Center for Space Construction developed the Launch-On-Time (L-O-T) Model to help assess and improve the likelihood of successfully supporting space construction requiring multiple logistics delivery flights. The model established a reference by which the L-O-T probability and improvements to it can be judged. The measure of improvement was chosen as the percent reduction in E(S(sub N)), the total expected amount of unscheduled 'hold' time. We have also previously developed an approach to determining the reduction in E(S(sub N)) by reducing some of the causes of unscheduled holds and increasing the speed at which the problems causing the holds may be 'fixed.' We provided a mathematical (binary linear programming) model for measuring the percent reduction in E(S(sub N)) given such improvements. In this presentation we exercise the model that was developed and draw conclusions about the methods used and the data available and needed, and make suggestions for areas of improvement in 'real world' application of the model.

  1. Laboratory testing and finite element modeling of precast bridge deck panel transverse connections.

    DOT National Transportation Integrated Search

    2010-08-06

    Precast bridge deck panels are increasingly used to reduce construction times and associated traffic delays as part of many DOTs' push for accelerated bridge construction. They allow a bridge deck to be built or replaced in days instead of months. ...

  2. Intelligent robots for planetary exploration and construction

    NASA Technical Reports Server (NTRS)

    Albus, James S.

    1992-01-01

    Robots capable of practical applications in planetary exploration and construction will require real-time sensory-interactive goal-directed control systems. A reference model architecture based on the NIST Real-time Control System (RCS) for real-time intelligent control systems is suggested. RCS partitions the control problem into four basic elements: behavior generation (or task decomposition), world modeling, sensory processing, and value judgment. It clusters these elements into computational nodes that have responsibility for specific subsystems, and arranges these nodes in hierarchical layers such that each layer has characteristic functionality and timing. Planetary exploration robots should have mobility systems that can safely maneuver over rough surfaces at high speeds. Walking machines and wheeled vehicles with dynamic suspensions are candidates. The technology of sensing and sensory processing has progressed to the point where real-time autonomous path planning and obstacle avoidance behavior is feasible. Map-based navigation systems will support long-range mobility goals and plans. Planetary construction robots must have high strength-to-weight ratios for lifting and positioning tools and materials in six degrees of freedom over large working volumes. A new generation of cable-suspended Stewart platform devices and inflatable structures are suggested for lifting and positioning materials and structures, as well as for excavation, grading, and manipulating a variety of tools and construction machinery.

  3. Results on a binding neuron model and their implications for modified hourglass model for neuronal network.

    PubMed

    Arunachalam, Viswanathan; Akhavan-Tabatabaei, Raha; Lopez, Cristina

    2013-01-01

    Classical single-neuron models, such as the Hodgkin-Huxley point neuron or the leaky integrate-and-fire neuron, assume that the influence of postsynaptic potentials lasts until the neuron fires. Vidybida (2008), in a refreshing departure, proposed models of binding neurons in which the trace of an input is remembered only for a finite fixed period of time, after which it is forgotten. Binding neurons conform to the behaviour of real neurons and are applicable in constructing fast recurrent networks for computer modeling. This paper explicitly develops several useful results for a binding neuron, such as the firing time distribution and other statistical characteristics. We also discuss the applicability of the developed results to constructing a modified hourglass network model with interconnected neurons receiving excitatory as well as inhibitory inputs. Limited simulation results for the hourglass network are presented.
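
    The binding-neuron mechanism summarized above (input traces remembered for a fixed time, firing when enough traces coexist) can be simulated directly. The Poisson input and all parameter values below are assumptions for illustration; the paper derives the firing time distribution analytically rather than by simulation.

```python
import random

# Minimal sketch of a binding neuron: each input impulse is stored for a
# fixed time tau and then forgotten; the neuron fires when `threshold`
# impulses coexist in memory. Poisson input with rate `rate` is assumed.

def first_firing_time(rate=5.0, tau=0.5, threshold=3, t_max=100.0, rng=None):
    rng = rng or random.Random()
    t, stored = 0.0, []
    while t < t_max:
        t += rng.expovariate(rate)                   # next input impulse
        stored = [s for s in stored if t - s < tau]  # forget expired traces
        stored.append(t)
        if len(stored) >= threshold:
            return t                                 # neuron fires
    return None                                      # no firing before t_max

# Empirical firing-time sample, the quantity whose distribution the paper
# characterizes analytically.
rng = random.Random(42)
samples = [first_firing_time(rng=rng) for _ in range(500)]
mean_ft = sum(samples) / len(samples)
```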

  4. Exploitation and Benefits of BIM in Construction Project Management

    NASA Astrophysics Data System (ADS)

    Mesároš, Peter; Mandičák, Tomáš

    2017-10-01

    BIM is attracting increasing awareness in the construction industry. BIM is the process of creating and managing data about a building during its life cycle, and it has become part of the management toolset in modern construction companies. Construction projects have many participants, which complicates construction project management and creates a serious requirement to process a huge amount of information, including design, construction, time and cost parameters, economic efficiency, and sustainability. Progressive information and communication technologies support cost management and construction project management; one of them is Building Information Modelling. The aim of this paper is to examine the impact of BIM exploitation and its benefits on construction project management in Slovak companies.

  5. A Real-Time Construction Safety Monitoring System for Hazardous Gas Integrating Wireless Sensor Network and Building Information Modeling Technologies.

    PubMed

    Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng

    2018-02-02

    In recent years, many studies have focused on the application of advanced technology to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates a database and geometry into a digital model that provides visualization throughout construction lifecycle management. This paper integrates BIM and WSN into a unique system that enables a construction site to visually monitor safety status via a spatial, colored interface and to remove hazardous gas automatically. Wireless sensor nodes were placed on an underground construction site to collect hazardous-gas levels and environmental conditions (temperature and humidity); in any region where an abnormal status is detected, the BIM model highlights the region, and an alarm and ventilator on site start automatically to warn of and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions are summarized for further applications.

  6. Non-resonant dynamic stark control of vibrational motion with optimized laser pulses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Esben F.; Henriksen, Niels E.

    2016-06-28

    The term dynamic Stark control (DSC) has been used to describe methods of quantum control related to the dynamic Stark effect, i.e., a time-dependent distortion of energy levels. Here, we employ analytical models that present clear and concise interpretations of the principles behind DSC. Within a linearly forced harmonic oscillator model of vibrational excitation, we show how the vibrational amplitude is related to the pulse envelope, and independent of the carrier frequency of the laser pulse, in the DSC regime. Furthermore, we shed light on DSC regarding the construction of optimal pulse envelopes, from a time-domain as well as a frequency-domain perspective. Finally, in a numerical study beyond the linearly forced harmonic oscillator model, we show that a pulse envelope can be constructed such that a vibrational excitation into a specific excited vibrational eigenstate is accomplished. The pulse envelope is constructed such that high intensities are avoided in order to eliminate the process of ionization.
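
    The stated relation between vibrational amplitude and pulse envelope can be checked numerically for the linearly forced harmonic oscillator: the final amplitude equals the magnitude of the Fourier transform of the forcing at the oscillator frequency. The Gaussian envelope and all parameter values below are illustrative assumptions, not the paper's pulses.

```python
import math, cmath

# Numeric check that, for a linearly forced harmonic oscillator driven from
# rest, the final excitation is fixed by the Fourier component of the force
# envelope at the oscillator frequency. With a = x' + i*w*x, the equation of
# motion becomes a' = i*w*a + f(t), so |a(T)| = |integral f(t) e^{-iwt} dt|.

def final_amplitude(w=2.0, f0=1.0, t0=10.0, s=1.0, dt=1e-3, t_end=20.0):
    """Propagate a' = i*w*a + f(t) from rest; return |a(t_end)|."""
    a, t = 0.0 + 0.0j, 0.0
    for _ in range(int(t_end / dt)):
        tm = t + 0.5 * dt                                   # midpoint force sample
        f = f0 * math.exp(-((tm - t0) ** 2) / (2 * s * s))  # Gaussian envelope
        # exact propagation of the homogeneous part, midpoint rule for the force
        a = a * cmath.exp(1j * w * dt) + f * dt * cmath.exp(1j * w * 0.5 * dt)
        t += dt
    return abs(a)

def fourier_prediction(w=2.0, f0=1.0, s=1.0):
    """|Fourier transform of the Gaussian envelope| evaluated at frequency w."""
    return f0 * s * math.sqrt(2 * math.pi) * math.exp(-(w * s) ** 2 / 2)

num = final_amplitude()
ana = fourier_prediction()   # the two agree to numerical-integration accuracy
```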

  7. Co Modeling and Co Synthesis of Safety Critical Multi threaded Embedded Software for Multi Core Embedded Platforms

    DTIC Science & Technology

    2017-03-20

    Keywords: computation, prime implicates, Boolean abstraction, real-time embedded software, software synthesis, correct-by-construction software design, model...types for time-dependent data-flow networks". J.-P. Talpin, P. Jouvelot, S. Shukla. ACM-IEEE Conference on Methods and Models for System Design.

  8. Electron Induced Discharge Modeling, Testing, and Analysis for Scatha. Volume I. Phenomenology Study and Model Testing.

    DTIC Science & Technology

    1978-12-31

    [Contents fragment: Dielectric Discharge; 3.2.1 Total Emitted Charge; 3.2.2 Emission Time History] ...taken to be a rise time of 10 ns and a fall time of 10 to 100 ns. In addition, a physical model of the discharge mechanism has been developed in which...scale model of the P78-2, dubbed the SCATSAT, was constructed whose design was chosen to simulate the basic structure of the real satellite, including the

  9. A multi-element cosmological model with a complex space-time topology

    NASA Astrophysics Data System (ADS)

    Kardashev, N. S.; Lipatova, L. N.; Novikov, I. D.; Shatskiy, A. A.

    2015-02-01

    Wormhole models with a complex topology having one entrance and two exits into the same space-time of another universe are considered, as well as models with two entrances from the same space-time and one exit to another universe. These models are used to build a model of a multi-sheeted universe (a multi-element model of the "Multiverse") with a complex topology. Spherical symmetry is assumed in all the models. A Reissner-Nordström black-hole model having no singularity beyond the horizon is constructed. The strength of the central singularity of the black hole is analyzed.

  10. Using Simulation to Interpret a Discrete Time Survival Model in a Complex Biological System: Fertility and Lameness in Dairy Cows

    PubMed Central

    Hudson, Christopher D.; Huxley, Jonathan N.; Green, Martin J.

    2014-01-01

    The ever-growing volume of data routinely collected and stored in everyday life presents researchers with a number of opportunities to gain insight and make predictions. This study aimed to demonstrate the usefulness in a specific clinical context of a simulation-based technique called probabilistic sensitivity analysis (PSA) in interpreting the results of a discrete time survival model based on a large dataset of routinely collected dairy herd management data. Data from 12,515 dairy cows (from 39 herds) were used to construct a multilevel discrete time survival model in which the outcome was the probability of a cow becoming pregnant during a given two day period of risk, and presence or absence of a recorded lameness event during various time frames relative to the risk period amongst the potential explanatory variables. A separate simulation model was then constructed to evaluate the wider clinical implications of the model results (i.e. the potential for a herd’s incidence rate of lameness to influence its overall reproductive performance) using PSA. Although the discrete time survival analysis revealed some relatively large associations between lameness events and risk of pregnancy (for example, occurrence of a lameness case within 14 days of a risk period was associated with a 25% reduction in the risk of the cow becoming pregnant during that risk period), PSA revealed that, when viewed in the context of a realistic clinical situation, a herd’s lameness incidence rate is highly unlikely to influence its overall reproductive performance to a meaningful extent in the vast majority of situations. Construction of a simulation model within a PSA framework proved to be a very useful additional step to aid contextualisation of the results from a discrete time survival model, especially where the research is designed to guide on-farm management decisions at population (i.e. herd) rather than individual level. PMID:25101997
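
    The discrete-time survival structure described above (each two-day risk period a Bernoulli trial for conception, with a recent lameness event scaling the risk) can be sketched as a simulation. Only the 25% risk reduction comes from the abstract; the baseline per-period risk and the lameness process are assumptions, and the sketch echoes the PSA finding that the herd-level effect is modest.

```python
import random

# Sketch of a discrete-time survival simulation: each two-day risk period is
# a Bernoulli trial for conception, and a "recent lameness" event multiplies
# the per-period risk by 0.75 (the 25% reduction reported). The baseline
# risk and the lameness probability per period are illustrative assumptions.

def periods_to_pregnancy(base_risk=0.05, lame_prob=0.0, rr=0.75,
                         max_periods=200, rng=None):
    rng = rng or random.Random()
    for period in range(1, max_periods + 1):
        risk = base_risk * (rr if rng.random() < lame_prob else 1.0)
        if rng.random() < risk:
            return period
    return max_periods                       # censored at max_periods

def mean_periods(lame_prob, n=20_000, seed=0):
    rng = random.Random(seed)
    return sum(periods_to_pregnancy(lame_prob=lame_prob, rng=rng)
               for _ in range(n)) / n

healthy = mean_periods(lame_prob=0.0)        # no lameness exposure
lame_herd = mean_periods(lame_prob=0.3, seed=1)  # high assumed lameness rate
```

    Even with 30% of risk periods affected, the mean time to pregnancy shifts only slightly, which is the herd-level intuition the PSA formalizes.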

  11. Using simulation to interpret a discrete time survival model in a complex biological system: fertility and lameness in dairy cows.

    PubMed

    Hudson, Christopher D; Huxley, Jonathan N; Green, Martin J

    2014-01-01

    The ever-growing volume of data routinely collected and stored in everyday life presents researchers with a number of opportunities to gain insight and make predictions. This study aimed to demonstrate the usefulness in a specific clinical context of a simulation-based technique called probabilistic sensitivity analysis (PSA) in interpreting the results of a discrete time survival model based on a large dataset of routinely collected dairy herd management data. Data from 12,515 dairy cows (from 39 herds) were used to construct a multilevel discrete time survival model in which the outcome was the probability of a cow becoming pregnant during a given two day period of risk, and presence or absence of a recorded lameness event during various time frames relative to the risk period amongst the potential explanatory variables. A separate simulation model was then constructed to evaluate the wider clinical implications of the model results (i.e. the potential for a herd's incidence rate of lameness to influence its overall reproductive performance) using PSA. Although the discrete time survival analysis revealed some relatively large associations between lameness events and risk of pregnancy (for example, occurrence of a lameness case within 14 days of a risk period was associated with a 25% reduction in the risk of the cow becoming pregnant during that risk period), PSA revealed that, when viewed in the context of a realistic clinical situation, a herd's lameness incidence rate is highly unlikely to influence its overall reproductive performance to a meaningful extent in the vast majority of situations. Construction of a simulation model within a PSA framework proved to be a very useful additional step to aid contextualisation of the results from a discrete time survival model, especially where the research is designed to guide on-farm management decisions at population (i.e. herd) rather than individual level.

  12. Modeling of water treatment plant using timed continuous Petri nets

    NASA Astrophysics Data System (ADS)

    Nurul Fuady Adhalia, H.; Subiono; Adzkiya, Dieky

    2017-08-01

    Petri nets graphically represent certain conditions and rules. In this paper, we construct a model of a Water Treatment Plant (WTP) using timed continuous Petri nets. Specifically, we assume that (1) the water pump is always active and (2) the water source is always available. After obtaining the model, the flows through the transitions and the token conservation laws are calculated.
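
    Under the usual infinite-server semantics for timed continuous Petri nets, each transition fires at its rate times the minimum scaled marking of its input places, and the marking evolves as dm/dt = C·f(m). The tiny pump-and-treat net below is an illustrative stand-in with assumed rates, not the paper's WTP model; it shows the flow computation and checks token conservation.

```python
# Sketch of timed continuous Petri net dynamics (infinite-server semantics).
# Places: 0 = raw water, 1 = treatment basin, 2 = clean water.
# Transitions: t0 = pump (rate 2), t1 = treat (rate 1). Rates are assumed.

PRE  = [[1, 0],   # raw water feeds t0
        [0, 1],   # basin feeds t1
        [0, 0]]
POST = [[0, 0],
        [1, 0],   # t0 fills the basin
        [0, 1]]   # t1 fills the clean tank
RATES = [2.0, 1.0]

def flows(m):
    """Firing speed of each transition: rate times min scaled input marking."""
    return [RATES[j] * min(m[p] / PRE[p][j] for p in range(3) if PRE[p][j] > 0)
            for j in range(2)]

def step(m, dt):
    """One explicit-Euler step of dm/dt = C @ f(m)."""
    f = flows(m)
    return [m[p] + dt * sum((POST[p][j] - PRE[p][j]) * f[j] for j in range(2))
            for p in range(3)]

m = [10.0, 0.0, 0.0]          # start with all tokens as raw water
for _ in range(20_000):       # integrate to t = 20
    m = step(m, 1e-3)
total = sum(m)                # token conservation: each firing moves one token
```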

  13. Estimating Multi-Level Discrete-Time Hazard Models Using Cross-Sectional Data: Neighborhood Effects on the Onset of Adolescent Cigarette Use.

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Brennan, Robert T.; Buka, Stephen L.

    2002-01-01

    Developed procedures for constructing a retrospective person-period data set from cross-sectional data and discusses modeling strategies for estimating multilevel discrete-time event history models. Applied the methods to the analysis of cigarette use by 1,979 urban adolescents. Results show the influence of the racial composition of the…
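
    The person-period expansion at the heart of the procedure can be sketched directly: each subject contributes one row per time unit at risk, with the event indicator set only in the row where onset occurred. Field names and the assumed earliest age at risk are illustrative, not the study's coding.

```python
# Sketch of building a retrospective person-period dataset from
# cross-sectional records: one row per subject per year at risk, the
# standard expansion for discrete-time event-history (hazard) models.

def person_period_rows(subjects, first_age_at_risk=10):
    """subjects: dicts with id, current_age, onset_age (None if no onset)."""
    rows = []
    for s in subjects:
        # subjects are observed up to onset, or censored at their current age
        last = s["onset_age"] if s["onset_age"] is not None else s["current_age"]
        for age in range(first_age_at_risk, last + 1):
            rows.append({
                "id": s["id"],
                "age": age,
                "event": int(s["onset_age"] == age),  # 1 only in the onset row
            })
    return rows

# Hypothetical respondents recalling age at first cigarette use:
data = [
    {"id": 1, "current_age": 16, "onset_age": 13},    # onset at age 13
    {"id": 2, "current_age": 15, "onset_age": None},  # never used: censored
]
rows = person_period_rows(data)
```

    A discrete-time hazard model is then just a (multilevel) logistic regression of `event` on `age` and covariates over these rows.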

  14. Restoration of STORM images from sparse subset of localizations (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Moiseev, Alexander A.; Gelikonov, Grigory V.; Gelikonov, Valentine M.

    2016-02-01

    To construct a Stochastic Optical Reconstruction Microscopy (STORM) image, one should collect a sufficient number of localized fluorophores to satisfy the Nyquist criterion. This requirement limits the time resolution of the method. In this work we propose a probabilistic approach to constructing STORM images from a subset of localized fluorophores 3-4 times sparser than the Nyquist criterion requires. Using a set of STORM images constructed from a number of localizations sufficient for the Nyquist criterion, we derive a model that allows us to predict the probability of every location being occupied by a fluorophore at the end of a hypothetical acquisition, taking as input the distribution of already-localized fluorophores in the proximity of that location. We show that a probability map obtained from a number of fluorophores 3-4 times smaller than required by the Nyquist criterion may be used as a superresolution image itself. Thus we are able to construct a STORM image from a subset of localized fluorophores 3-4 times sparser than the Nyquist criterion requires, proportionally decreasing STORM data acquisition time. This method may be used in combination with other approaches designed to increase STORM time resolution.

  15. Principal component analysis in construction of 3D human knee joint models using a statistical shape model method.

    PubMed

    Tsai, Tsung-Yuan; Li, Jing-Sheng; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Li, Guoan

    2015-01-01

    The statistical shape model (SSM) method that uses 2D images of the knee joint to predict the three-dimensional (3D) joint surface model has been reported in the literature. In this study, we constructed a SSM database using 152 human computed tomography (CT) knee joint models, including the femur, tibia and patella and analysed the characteristics of each principal component of the SSM. The surface models of two in vivo knees were predicted using the SSM and their 2D bi-plane fluoroscopic images. The predicted models were compared to their CT joint models. The differences between the predicted 3D knee joint surfaces and the CT image-based surfaces were 0.30 ± 0.81 mm, 0.34 ± 0.79 mm and 0.36 ± 0.59 mm for the femur, tibia and patella, respectively (average ± standard deviation). The computational time for each bone of the knee joint was within 30 s using a personal computer. The analysis of this study indicated that the SSM method could be a useful tool to construct 3D surface models of the knee with sub-millimeter accuracy in real time. Thus, it may have a broad application in computer-assisted knee surgeries that require 3D surface models of the knee.
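
    The backbone of an SSM of this kind — mean shape plus principal components of corresponding surface points — can be sketched with plain SVD-based PCA. The shapes below are synthetic random vectors, standing in for aligned knee surface models with point correspondence.

```python
import numpy as np

# Minimal statistical-shape-model sketch: stack corresponding surface points
# of training shapes into vectors, run PCA, reconstruct from top-k modes.
rng = np.random.default_rng(0)
n_shapes, n_points = 20, 50
base = rng.normal(size=n_points * 3)                    # synthetic "mean" shape
shapes = base + 0.1 * rng.normal(size=(n_shapes, n_points * 3))

mean = shapes.mean(axis=0)
X = shapes - mean
U, S, Vt = np.linalg.svd(X, full_matrices=False)        # PCA via SVD
k = 5
components = Vt[:k]                                     # top-k shape modes
weights = X @ components.T                              # per-shape mode weights
recon = mean + weights @ components                     # rank-k reconstruction
err = np.abs(recon - shapes).mean()
```

    In the 2D-to-3D setting, the mode weights are instead optimized so the projected model matches the bi-plane fluoroscopic silhouettes; that fitting step is omitted here.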

  16. Real-Time Safety Risk Assessment Based on a Real-Time Location System for Hydropower Construction Sites

    PubMed Central

    Fan, Qixiang; Qiang, Maoshan

    2014-01-01

    The concern for workers' safety in the construction industry is reflected in many studies focusing on static safety risk identification and assessment. However, studies on real-time safety risk assessment aimed at reducing uncertainty and supporting quick response are rare. A method for real-time safety risk assessment (RTSRA) to implement a dynamic evaluation of worker safety states on construction sites has been proposed in this paper. The method provides construction managers who are in charge of safety with more abundant information to reduce the uncertainty of the site. A quantitative calculation formula, integrating the influence of static and dynamic hazards and that of safety supervisors, is established to link the safety risk of workers with the locations of on-site assets. By employing the hidden Markov model (HMM), the RTSRA provides a mechanism for processing location data provided by the real-time location system (RTLS) and analyzing the probability distributions of different states in terms of false positives and negatives. Simulation analysis demonstrated the logic of the proposed method and how it works. An application case shows that the proposed RTSRA is both feasible and effective in managing construction project safety concerns. PMID:25114958

  17. Real-time safety risk assessment based on a real-time location system for hydropower construction sites.

    PubMed

    Jiang, Hanchen; Lin, Peng; Fan, Qixiang; Qiang, Maoshan

    2014-01-01

    The concern for workers' safety in the construction industry is reflected in many studies focusing on static safety risk identification and assessment. However, studies on real-time safety risk assessment aimed at reducing uncertainty and supporting quick response are rare. A method for real-time safety risk assessment (RTSRA) to implement a dynamic evaluation of worker safety states on construction sites has been proposed in this paper. The method provides construction managers who are in charge of safety with more abundant information to reduce the uncertainty of the site. A quantitative calculation formula, integrating the influence of static and dynamic hazards and that of safety supervisors, is established to link the safety risk of workers with the locations of on-site assets. By employing the hidden Markov model (HMM), the RTSRA provides a mechanism for processing location data provided by the real-time location system (RTLS) and analyzing the probability distributions of different states in terms of false positives and negatives. Simulation analysis demonstrated the logic of the proposed method and how it works. An application case shows that the proposed RTSRA is both feasible and effective in managing construction project safety concerns.
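
    The HMM mechanism described above — inferring a hidden safety state from noisy location observations — can be sketched with a standard normalized forward pass. The two states, two observation symbols, and all matrix values below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

# Toy HMM: hidden states {safe, at-risk}; observations {outside, inside}
# a hazard zone, as might be derived from RTLS position fixes.
A = np.array([[0.9, 0.1],    # transitions: safe -> {safe, at-risk}
              [0.3, 0.7]])   #              at-risk -> {safe, at-risk}
B = np.array([[0.8, 0.2],    # emissions: P(obs | state), rows = states
              [0.1, 0.9]])   # at-risk workers are mostly observed inside
pi = np.array([0.95, 0.05])  # initial state distribution

def forward_filter(obs):
    """Return P(state_t | obs_1..t) for each t (normalized forward pass)."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    out = [alpha]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        alpha /= alpha.sum()
        out.append(alpha)
    return np.array(out)

# 0 = outside hazard zone, 1 = inside; repeated "inside" fixes raise risk.
posterior = forward_filter([0, 1, 1, 1])
```

    The filtered posterior is exactly the kind of real-time state estimate a supervisor dashboard would threshold for alerts.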

  18. Integration of Five Health Behaviour Models: Common Strengths and Unique Contributions to Understanding Condom Use

    PubMed Central

    Reid, Allecia E.; Aiken, Leona S.

    2011-01-01

    The purpose of this research was to select from the health belief model (HBM), theories of reasoned action (TRA) and planned behaviour (TPB), information-motivation-behavioural skills model (IMB), and social cognitive theory (SCT) the strongest longitudinal predictors of women’s condom use and to combine these constructs into a single integrated model of condom use. The integrated model was evaluated for prediction of condom use among young women who had steady versus casual partners. At Time 1, all constructs of the five models and condom use were assessed in an initial and a replication sample (n= 193, n= 161). Condom use reassessed 8 weeks later (Time 2) served as the main outcome. Information from IMB, perceived susceptibility, benefits, and barriers from HBM, self-efficacy and self-evaluative expectancies from SCT, and partner norm and attitudes from TPB served as indirect or direct predictors of condom use. All paths replicated across samples. Direct predictors of behaviour varied with relationship status: self-efficacy significantly predicted condom use for women with casual partners, while attitude and partner norm predicted for those with steady partners. Integrated psychosocial models, rich in constructs and relationships drawn from multiple theories of behaviour, may provide a more complete characterization of health protective behaviour. PMID:21678166

  19. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  20. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  1. a Quadtree Organization Construction and Scheduling Method for Urban 3d Model Based on Weight

    NASA Astrophysics Data System (ADS)

    Yao, C.; Peng, G.; Song, Y.; Duan, M.

    2017-09-01

    The increase in urban 3D model precision and data quantity puts forward higher requirements for real-time rendering of digital city models. Improving the organization, management and scheduling of 3D model data in a 3D digital city can improve rendering effect and efficiency. Taking the complexity of urban models into account, this paper proposes a quadtree construction and scheduled-rendering method for urban 3D models based on weight. Urban 3D models are divided into different rendering weights according to certain rules, and quadtree construction and scheduled rendering are performed according to these weights. An algorithm is also proposed for extracting bounding boxes from model drawing primitives to generate LOD models automatically. Using the algorithm proposed in this paper, a 3D urban planning and management software package was developed; practice has shown the algorithm to be efficient and feasible, with the render frame rate of both big and small scenes stable at around 25 frames per second.
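
    One plausible reading of the weight-based organization is a quadtree in which a model's rendering weight determines its insertion depth, so heavier (more important) models sit nearer the root and are scheduled first. The sketch below is a hedged illustration of that idea only; the depth rule, extents, and names are assumptions, not the paper's algorithm.

```python
# Toy weighted quadtree: insert a model at a depth derived from its weight.
class QuadNode:
    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size
        self.models = []       # models stored at this node's depth
        self.children = None   # four sub-quadrants, created lazily

    def insert(self, mx, my, model, depth_left):
        if depth_left == 0:
            self.models.append(model)
            return
        if self.children is None:
            h = self.size / 2
            self.children = [QuadNode(self.x + dx * h, self.y + dy * h, h)
                             for dy in (0, 1) for dx in (0, 1)]
        # Pick the quadrant containing (mx, my): index = 2*row + col.
        q = (2 if my >= self.y + self.size / 2 else 0) + \
            (1 if mx >= self.x + self.size / 2 else 0)
        self.children[q].insert(mx, my, model, depth_left - 1)

root = QuadNode(0, 0, 1024)
# Higher weight -> shallower assumed depth -> scheduled for rendering earlier.
root.insert(100, 100, "landmark", depth_left=1)
root.insert(100, 100, "street_furniture", depth_left=3)
```

    A renderer would then traverse the tree breadth-first, drawing shallow (heavy-weight) nodes before descending to detail.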

  2. A real-time prediction model for post-irradiation malignant cervical lymph nodes.

    PubMed

    Lo, W-C; Cheng, P-W; Shueng, P-W; Hsieh, C-H; Chang, Y-L; Liao, L-J

    2018-04-01

    To establish a real-time predictive scoring model based on sonographic characteristics for identifying malignant cervical lymph nodes (LNs) in cancer patients after neck irradiation. One-hundred forty-four irradiation-treated patients underwent ultrasonography and ultrasound-guided fine-needle aspirations (USgFNAs), and the resultant data were used to construct a real-time and computerised predictive scoring model. This scoring system was further compared with our previously proposed prediction model. A predictive scoring model, 1.35 × (L axis) + 2.03 × (S axis) + 2.27 × (margin) + 1.48 × (echogenic hilum) + 3.7, was generated by stepwise multivariate logistic regression analysis. Neck LNs were considered to be malignant when the score was ≥ 7, corresponding to a sensitivity of 85.5%, specificity of 79.4%, positive predictive value (PPV) of 82.3%, negative predictive value (NPV) of 83.1%, and overall accuracy of 82.6%. When this new model and the original model were compared, the areas under the receiver operating characteristic curve (c-statistic) were 0.89 and 0.81, respectively (P < .05). A real-time sonographic predictive scoring model was constructed to provide prompt and reliable guidance for USgFNA biopsies to manage cervical LNs after neck irradiation. © 2017 John Wiley & Sons Ltd.
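
    The published formula codes directly. Note that how each sonographic feature is scored (binary indicators vs. raw measurements in cm) is not specified in this excerpt, so the example inputs below are assumed to be pre-coded 0/1 indicator values; the coefficients, intercept, and the ≥ 7 malignancy cut-off are taken verbatim from the abstract.

```python
# Scoring model from the abstract:
# score = 1.35*(L axis) + 2.03*(S axis) + 2.27*(margin)
#         + 1.48*(echogenic hilum) + 3.7,  malignant when score >= 7.
def ln_score(l_axis, s_axis, margin, echogenic_hilum):
    return (1.35 * l_axis + 2.03 * s_axis + 2.27 * margin
            + 1.48 * echogenic_hilum + 3.7)

def is_malignant(score, threshold=7.0):
    return score >= threshold

s_high = ln_score(1, 1, 1, 0)  # several suspicious features (assumed coding)
s_low = ln_score(0, 0, 0, 0)   # no suspicious features
```

    With the assumed 0/1 coding, three suspicious features already push the score past the threshold, while a feature-free node stays at the 3.7 intercept.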

  3. Pre-launch Optical Characteristics of the Oculus-ASR Nanosatellite for Attitude and Shape Recognition Experiments

    DTIC Science & Technology

    2011-12-02

    construction and validation of predictive computer models such as those used in Time-domain Analysis Simulation for Advanced Tracking (TASAT), a…characterization data, successful construction and validation of predictive computer models was accomplished. And an investigation in pose determination from…

  4. Experience of Time Passage:. Phenomenology, Psychophysics, and Biophysical Modelling

    NASA Astrophysics Data System (ADS)

    Wackermann, Jiří

    2005-10-01

    The experience of time's passing appears, from the 1st person perspective, to be a primordial subjective experience, seemingly inaccessible to the 3rd person accounts of time perception (psychophysics, cognitive psychology). In our analysis of the `dual klepsydra' model of reproduction of temporal durations, time passage occurs as a cognitive construct, based upon more elementary (`proto-cognitive') function of the psychophysical organism. This conclusion contradicts the common concepts of `subjective' or `psychological' time as readings of an `internal clock'. Our study shows how phenomenological, experimental and modelling approaches can be fruitfully combined.

  5. Typology of State Types: Persistence and Transition

    DTIC Science & Technology

    2015-04-28

    is the lack of positive transition among the weakest states. Our findings are derived from a minimalist construct of a refined time series dataset…states based on a "minimalist" construct of the Country Indicators for Foreign Policy (CIFP) fragile states project and its core structural…begin with the rationale for developing a minimalist construct of a state typology model (STM), similar to the approach taken by Gravingholt, Ziaja

  6. Programming in a proposed 9X distributed Ada

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Goldsack, Stephen J.; Holzbach-Valero, A. A.

    1991-01-01

    The studies of the proposed Ada 9X constructs for distribution, now referred to as AdaPT, are reported. The goals for this time period were to revise the chosen example scenario and to begin studying how the proposed constructs might be implemented. The example scenario chosen is the Submarine Combat Information Center (CIC) developed by IBM for the Navy. The specification provided by IBM was preliminary and had several deficiencies. To address these problems, some changes to the scenario specification were made. Some of the more important changes include: (1) addition of a system database management function; (2) addition of a fourth processing unit to the standard resources; (3) addition of an operator console interface function; and (4) removal of the time synchronization function. To implement the CIC scenario in AdaPT, the strategy decided upon was to use publics, partitions, and nodes. The principal purpose of implementing the CIC scenario was to demonstrate how the AdaPT constructs interact with the program structure. While considering ways that the AdaPT constructs might be translated to Ada 83, it was observed that the partition construct could reasonably be modeled as an abstract data type. Although this gives a useful method of modeling partitions, it does not address the configuration aspects of the node construct.

  7. A survival tree method for the analysis of discrete event times in clinical and epidemiological studies.

    PubMed

    Schmid, Matthias; Küchenhoff, Helmut; Hoerauf, Achim; Tutz, Gerhard

    2016-02-28

    Survival trees are a popular alternative to parametric survival modeling when there are interactions between the predictor variables or when the aim is to stratify patients into prognostic subgroups. A limitation of classical survival tree methodology is that most algorithms for tree construction are designed for continuous outcome variables. Hence, classical methods might not be appropriate if failure time data are measured on a discrete time scale (as is often the case in longitudinal studies where data are collected, e.g., quarterly or yearly). To address this issue, we develop a method for discrete survival tree construction. The proposed technique is based on the result that the likelihood of a discrete survival model is equivalent to the likelihood of a regression model for binary outcome data. Hence, we modify tree construction methods for binary outcomes such that they result in optimized partitions for the estimation of discrete hazard functions. By applying the proposed method to data from a randomized trial in patients with filarial lymphedema, we demonstrate how discrete survival trees can be used to identify clinically relevant patient groups with similar survival behavior. Copyright © 2015 John Wiley & Sons, Ltd.
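
    The equivalence the method rests on — the likelihood of a discrete survival model equals that of a binary regression on expanded data — comes down to a simple transformation: each subject contributes one binary row per time period at risk. The sketch below shows that expansion and the resulting hazard estimate; the sample values are illustrative.

```python
# Expand discrete survival samples (observed time, event indicator) into
# binary person-period rows, so any binary-outcome learner estimates the
# discrete hazard h(t) = P(T = t | T >= t).
def expand(samples):
    rows = []
    for t_obs, event in samples:
        for t in range(1, t_obs + 1):
            y = int(event == 1 and t == t_obs)  # 1 only at the failure period
            rows.append((t, y))
    return rows

# (time, event): event=1 is failure, event=0 is right-censoring.
samples = [(3, 1), (2, 0), (3, 1), (1, 1), (3, 0)]
rows = expand(samples)

# Hazard at t=3: failures at 3 divided by subjects still at risk at 3.
at_risk_3 = [y for t, y in rows if t == 3]
hazard_3 = sum(at_risk_3) / len(at_risk_3)
```

    A survival tree for discrete data then partitions these binary rows (on covariates plus the period index) instead of the raw failure times.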

  8. The Feeding Practices and Structure Questionnaire (FPSQ-28): A parsimonious version validated for longitudinal use from 2 to 5 years.

    PubMed

    Jansen, Elena; Williams, Kate E; Mallan, Kimberley M; Nicholson, Jan M; Daniels, Lynne A

    2016-05-01

    Prospective studies and intervention evaluations that examine change over time assume that measurement tools measure the same construct at each occasion. In the area of parent-child feeding practices, longitudinal measurement properties of the questionnaires used are rarely verified. To ascertain that measured change in feeding practices reflects true change rather than change in the assessment, structure, or conceptualisation of the constructs over time, this study examined longitudinal measurement invariance of the Feeding Practices and Structure Questionnaire (FPSQ) subscales (9 constructs; 40 items) across 3 time points. Mothers participating in the NOURISH trial reported their feeding practices when children were aged 2, 3.7, and 5 years (N = 404). Confirmatory Factor Analysis (CFA) within a structural equation modelling framework was used. Comparisons of initial cross-sectional models followed by longitudinal modelling of subscales, resulted in the removal of 12 items, including two redundant or poorly performing subscales. The resulting 28-item FPSQ-28 comprised 7 multi-item subscales: Reward for Behaviour, Reward for Eating, Persuasive Feeding, Overt Restriction, Covert Restriction, Structured Meal Setting and Structured Meal Timing. All subscales showed good fit over 3 time points and each displayed at least partial scalar (thresholds equal) longitudinal measurement invariance. We recommend the use of a separate single item indicator to assess the family meal setting. This is the first study to examine longitudinal measurement invariance in a feeding practices questionnaire. Invariance was established, indicating that the subscales of the shortened FPSQ-28 can be used with mothers to validly assess change in 7 feeding constructs in samples of children aged 2-5 years of age. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. A Coupled model for ERT monitoring of contaminated sites

    NASA Astrophysics Data System (ADS)

    Wang, Yuling; Zhang, Bo; Gong, Shulan; Xu, Ya

    2018-02-01

    The performance of an electrical resistivity tomography (ERT) system is usually investigated using a fixed resistivity distribution model in numerical simulation studies. In this paper, a method to construct a time-varying resistivity model by coupling water transport, solute transport and a constant current field is proposed for ERT monitoring of contaminated sites. Using the proposed method, a monitoring model is constructed for a contaminated site with a pollution region on the surface, and ERT monitoring results at different times are calculated by the finite element method. The results show that ERT monitoring profiles can effectively reflect the growth of the pollution area caused by the diffusion of pollutants, although the inferred extent of the pollution is not exactly the same as the actual situation. The model can be extended to any other case and can be used for scheme design and result analysis in ERT monitoring.

  10. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
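
    As a small companion to the autoregressive model construction discussed above (the paper's FORTRAN implementation is not reproduced here), an AR(2) process can be simulated and its coefficients recovered from the sample autocovariances via the Yule-Walker equations. Parameter values and series length are illustrative.

```python
import numpy as np

# Simulate an AR(2) process x_t = a1*x_{t-1} + a2*x_{t-2} + noise.
rng = np.random.default_rng(1)
a1, a2 = 0.6, -0.3
n = 20000
x = np.zeros(n)
for t in range(2, n):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.normal()

def autocov(x, lag):
    xc = x - x.mean()
    return np.dot(xc[:len(xc) - lag], xc[lag:]) / len(xc)

r0, r1, r2 = (autocov(x, k) for k in range(3))
# Yule-Walker system: [[r0, r1], [r1, r0]] @ [a1, a2] = [r1, r2]
a_hat = np.linalg.solve(np.array([[r0, r1], [r1, r0]]),
                        np.array([r1, r2]))
```

    The recovered coefficients converge on the true values as the series length grows, which is the basic consistency property time-domain model construction relies on.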

  11. Cultural safety as an ethic of care: a praxiological process.

    PubMed

    McEldowney, Rose; Connor, Margaret J

    2011-10-01

    New writings broadening the construct of cultural safety, a construct initiated in Aotearoa New Zealand, are beginning to appear in the literature. Therefore, it is considered timely to integrate these writings and advance the construct into a new theoretical model. The new model reconfigures the constructs of cultural safety and cultural competence as an ethic of care informed by a postmodern perspective. Central to the new model are three interwoven, co-occurring components: an ethic of care, which unfolds within a praxiological process shaped by the context. Context is expanded through identifying the three concepts of relationality, generic competence, and collectivity, which are integral to each client-nurse encounter. The competence associated with cultural safety as an ethic of care is always in the process of development. Clients and nurses engage in a dialogue to establish the level of cultural safety achieved at given points in a care trajectory.

  12. A voxel-based finite element model for the prediction of bladder deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai Xiangfei; Herk, Marcel van; Hulshof, Maarten C. C. M.

    2012-01-15

    Purpose: A finite element (FE) bladder model was previously developed to predict bladder deformation caused by bladder filling change. However, two factors prevent a wide application of FE models: (1) the labor required to construct a FE model with high quality mesh and (2) long computation time needed to construct the FE model and solve the FE equations. In this work, we address these issues by constructing a low-resolution voxel-based FE bladder model directly from the binary segmentation images and compare the accuracy and computational efficiency of the voxel-based model used to simulate bladder deformation with those of a classical FE model with a tetrahedral mesh. Methods: For ten healthy volunteers, a series of MRI scans of the pelvic region was recorded at regular intervals of 10 min over 1 h. For this series of scans, the bladder volume gradually increased while rectal volume remained constant. All pelvic structures were defined from a reference image for each volunteer, including bladder wall, small bowel, prostate (male), uterus (female), rectum, pelvic bone, spine, and the rest of the body. Four separate FE models were constructed from these structures: one with a tetrahedral mesh (used in previous study), one with a uniform hexahedral mesh, one with a nonuniform hexahedral mesh, and one with a low-resolution nonuniform hexahedral mesh. Appropriate material properties were assigned to all structures and uniform pressure was applied to the inner bladder wall to simulate bladder deformation from urine inflow. Performance of the hexahedral meshes was evaluated against the performance of the standard tetrahedral mesh by comparing the accuracy of bladder shape prediction and computational efficiency. Results: The FE model with a hexahedral mesh can be quickly and automatically constructed. No substantial differences were observed between the simulation results of the tetrahedral mesh and hexahedral meshes (<1% difference in mean dice similarity coefficient to manual contours and <0.02 cm difference in mean standard deviation of residual errors). The average equation solving time (without manual intervention) for the first two types of hexahedral meshes increased to 2.3 h and 2.6 h compared to the 1.1 h needed for the tetrahedral mesh; however, the low-resolution nonuniform hexahedral mesh dramatically decreased the equation solving time to 3 min without reducing accuracy. Conclusions: Voxel-based mesh generation allows fast, automatic, and robust creation of finite element bladder models directly from binary segmentation images without user intervention. Even the low-resolution voxel-based hexahedral mesh yields comparable accuracy in bladder shape prediction and is more than 20 times faster in computational speed compared to the tetrahedral mesh. This approach makes it more feasible and accessible to apply the FE method to model bladder deformation in adaptive radiotherapy.
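
    The voxel-to-hexahedron step is conceptually simple: every nonzero voxel of the binary segmentation becomes one 8-node brick element, with corner nodes shared between neighboring voxels. The sketch below shows just that bookkeeping (node deduplication and element connectivity); material assignment, boundary conditions, and solving are omitted, and the function name is an assumption.

```python
import numpy as np

# Turn a binary 3-D mask into hexahedral elements: one brick per voxel,
# corner nodes shared between adjacent voxels.
def voxel_hex_mesh(mask):
    nodes = {}        # (i, j, k) corner coordinate -> node index
    elements = []
    def node(ijk):
        if ijk not in nodes:
            nodes[ijk] = len(nodes)
        return nodes[ijk]
    for i, j, k in zip(*np.nonzero(mask)):
        corners = [(i + di, j + dj, k + dk)
                   for dk in (0, 1) for dj in (0, 1) for di in (0, 1)]
        elements.append([node(c) for c in corners])
    return nodes, elements

mask = np.zeros((3, 3, 3), dtype=bool)
mask[0, 0, 0] = mask[1, 0, 0] = True   # two voxels sharing a face
nodes, elements = voxel_hex_mesh(mask)
```

    Two face-adjacent voxels share four corner nodes, so the mesh has 12 unique nodes for 2 elements; at scale this sharing is what makes the voxel mesh both compact and automatic to build.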

  13. Gravitational Radiation Characteristics of Nonspinning Black-Hole Binaries

    NASA Technical Reports Server (NTRS)

    Kelly, B. J.; Baker, J. G.; Boggs, W. D.; Centrella, J. M.; vanMeter, J. R.; McWilliams, S. T.

    2008-01-01

    We present a detailed descriptive analysis of the gravitational radiation from binary mergers of non-spinning black holes, based on numerical relativity simulations of systems varying from equal-mass to a 6:1 mass ratio. Our analysis covers amplitude and phase characteristics of the radiation, suggesting a unified picture of the waveforms' dominant features in terms of an implicit rotating source, applying uniformly to the full wavetrain, from inspiral through ringdown. We construct a model of the late-stage frequency evolution that fits the l = m modes, and identify late-time relationships between waveform frequency and amplitude. These relationships allow us to construct a predictive model for the late-time waveforms, an alternative to the common practice of modelling by a sum of quasinormal mode overtones. We demonstrate an application of this in a new effective-one-body-based analytic waveform model.

  14. Programmable logic construction kits for hyper-real-time neuronal modeling.

    PubMed

    Guerrero-Rivera, Ruben; Morrison, Abigail; Diesmann, Markus; Pearce, Tim C

    2006-11-01

    Programmable logic designs are presented that achieve exact integration of leaky integrate-and-fire soma and dynamical synapse neuronal models and incorporate spike-time dependent plasticity and axonal delays. Highly accurate numerical performance has been achieved by modifying simpler forward-Euler-based circuitry requiring minimal circuit allocation, which, as we show, behaves equivalently to exact integration. These designs have been implemented and simulated at the behavioral and physical device levels, demonstrating close agreement with both numerical and analytical results. By exploiting finely grained parallelism and single clock cycle numerical iteration, these designs achieve simulation speeds at least five orders of magnitude faster than the nervous system, termed here hyper-real-time operation, when deployed on commercially available field-programmable gate array (FPGA) devices. Taken together, our designs form a programmable logic construction kit of commonly used neuronal model elements that supports the building of large and complex architectures of spiking neuron networks for real-time neuromorphic implementation, neurophysiological interfacing, or efficient parameter space investigations.
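
    The exact-integration idea the designs rely on can be shown in a few lines: between spikes a leaky membrane obeys dV/dt = -V/tau, so advancing by a fixed step dt is an exact multiplication by exp(-dt/tau) (one precomputed constant, well suited to a single clock cycle), whereas forward Euler multiplies by (1 - dt/tau) and accumulates error. Values below are illustrative.

```python
import math

# Exact vs. forward-Euler decay of a leaky integrate-and-fire membrane.
tau, dt, v0 = 20.0, 1.0, 1.0
steps = 50

decay = math.exp(-dt / tau)   # exact one-step factor, precomputed once
v_exact = v0
v_euler = v0
for _ in range(steps):
    v_exact *= decay              # exact integration
    v_euler *= (1.0 - dt / tau)   # forward Euler

v_true = v0 * math.exp(-steps * dt / tau)  # analytical solution
```

    The exact update matches the analytical solution to machine precision at any step size, which is why modifying Euler-based circuitry to behave equivalently to exact integration costs so little hardware.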

  15. Engineering a Functional Small RNA Negative Autoregulation Network with Model-Guided Design.

    PubMed

    Hu, Chelsea Y; Takahashi, Melissa K; Zhang, Yan; Lucks, Julius B

    2018-05-22

    RNA regulators are powerful components of the synthetic biology toolbox. Here, we expand the repertoire of synthetic gene networks built from these regulators by constructing a transcriptional negative autoregulation (NAR) network out of small RNAs (sRNAs). NAR network motifs are core motifs of natural genetic networks, and are known for reducing network response time and steady state signal. Here we use cell-free transcription-translation (TX-TL) reactions and a computational model to design and prototype sRNA NAR constructs. Using parameter sensitivity analysis, we design a simple set of experiments that allow us to accurately predict NAR function in TX-TL. We transfer successful network designs into Escherichia coli and show that our sRNA transcriptional network reduces both network response time and steady-state gene expression. This work broadens our ability to construct increasingly sophisticated RNA genetic networks with predictable function.
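
    Why an NAR motif reduces both response time and steady state can be seen in a minimal ODE comparison: constant production beta versus repressed production beta/(1 + (x/K)^n). All parameter values, and the use of simple Euler integration, are illustrative assumptions, not the paper's TX-TL model.

```python
# Compare an unregulated gene (dx/dt = beta - gamma*x) with a negatively
# autoregulated one (production repressed by a Hill function of x).
def simulate(repressed, beta=10.0, gamma=0.1, K=2.0, n=2,
             dt=0.01, t_end=100.0):
    x, t, traj = 0.0, 0.0, []
    while t < t_end:
        prod = beta / (1 + (x / K) ** n) if repressed else beta
        x += dt * (prod - gamma * x)   # Euler step
        t += dt
        traj.append(x)
    return traj

plain = simulate(False)
nar = simulate(True)
ss_plain, ss_nar = plain[-1], nar[-1]

def rise_time(traj, ss, dt=0.01):
    # First time the trajectory reaches half its own steady state.
    return next(i for i, x in enumerate(traj) if x >= 0.5 * ss) * dt

t_plain = rise_time(plain, ss_plain)
t_nar = rise_time(nar, ss_nar)
```

    The repressed circuit starts at full production speed but throttles itself as x rises, so it reaches half of its (much lower) steady state far sooner — the two signatures the paper measures for its sRNA NAR network.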

  16. Data Dependent Peak Model Based Spectrum Deconvolution for Analysis of High Resolution LC-MS Data

    PubMed Central

    2015-01-01

    A data dependent peak model (DDPM) based spectrum deconvolution method was developed for analysis of high resolution LC-MS data. To construct the selected ion chromatogram (XIC), a clustering method, the density based spatial clustering of applications with noise (DBSCAN), is applied to all m/z values of an LC-MS data set to group the m/z values into each XIC. The DBSCAN constructs XICs without the need for a user defined m/z variation window. After the XIC construction, the peaks of molecular ions in each XIC are detected using both the first and the second derivative tests, followed by an optimized chromatographic peak model selection method for peak deconvolution. A total of six chromatographic peak models are considered, including Gaussian, log-normal, Poisson, gamma, exponentially modified Gaussian, and hybrid of exponential and Gaussian models. The abundant nonoverlapping peaks are chosen to find the optimal peak models that are both data- and retention-time-dependent. Analysis of 18 spiked-in LC-MS data demonstrates that the proposed DDPM spectrum deconvolution method outperforms the traditional method. On average, the DDPM approach not only detected 58 more chromatographic peaks from each of the testing LC-MS data but also improved the retention time and peak area 3% and 6%, respectively. PMID:24533635
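
    The XIC-grouping step lends itself to a one-dimensional toy version of DBSCAN: after sorting the m/z values, a maximal run of points whose consecutive gaps stay within eps forms a cluster when it contains at least min_samples points, and everything else is noise. This is a simplification of full DBSCAN (which counts neighborhood density per point), and the eps/min_samples values below are illustrative.

```python
# Toy 1-D DBSCAN-style grouping of m/z values into XIC clusters.
def dbscan_1d(values, eps, min_samples):
    values = sorted(values)
    labels = [-1] * len(values)   # -1 = noise
    cluster = -1
    i = 0
    while i < len(values):
        # Extend a maximal run with consecutive gaps <= eps.
        j = i
        while j + 1 < len(values) and values[j + 1] - values[j] <= eps:
            j += 1
        if j - i + 1 >= min_samples:  # dense enough: label as a cluster
            cluster += 1
            for k in range(i, j + 1):
                labels[k] = cluster
        i = j + 1
    return values, labels

mz = [100.001, 100.002, 100.003, 100.004, 250.500, 400.100, 400.101, 400.102]
vals, labels = dbscan_1d(mz, eps=0.01, min_samples=3)
```

    Two dense m/z runs become two XICs while the isolated value at 250.5 stays unassigned noise; the real pipeline then peak-picks and fits the candidate peak models within each XIC.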

  17. Myocardial electrical conduction blockade time dominated by irradiance on photodynamic reaction: in vitro and in silico study

    NASA Astrophysics Data System (ADS)

    Ogawa, Emiyu; Arai, Tsunenori

    2018-02-01

    The time for electrical conduction blockade induced by a photodynamic reaction was studied on a myocardial cell wire in vitro, and an in silico simulation model was constructed to understand the time necessary for electrical conduction blockade of the wire. The vulnerable state of cells under laser interaction is an unstable and undesirable state, since such cells might progress to complete damage or be repaired, significantly changing the therapeutic effect; an in silico model that can calculate the vulnerable cell state is therefore needed. Understanding immediate electrical conduction blockade is needed for our proposed new methodology for tachyarrhythmia catheter ablation applying a photodynamic reaction. We studied the occurrence of electrical conduction blockade on an electrical conduction wire made of cultured myocardial cells in a line shape and constructed an in silico model based on these experimental data. The intracellular Ca2+ ion concentrations were obtained using Fluo-4 AM dye under a confocal laser microscope. A cross-correlation function was used for the electrical conduction blockade judgment. The photodynamic reaction was performed under confocal microscopy at 3-120 mW/cm2 irradiance with a 663 nm diode laser. We found that the time to electrical conduction blockade decreased with increasing irradiance. We constructed a simulation model composed of three states (living cells, vulnerable cells, and blocked cells) using the obtained experimental data, and we found the rate constants by optimization using a conjugate gradient method.
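
    The three-state picture (living → vulnerable → blocked) can be sketched as a sequential first-order kinetic chain with irradiance-dependent rates, which reproduces the qualitative finding that blockade time falls as irradiance rises. The rate values and the assumed linear dependence on irradiance are illustrative, not the study's fitted constants.

```python
import math

# living -k1-> vulnerable -k2-> blocked, with rates scaled by irradiance.
def blockade_fraction(irradiance, t, k1_per_mw=0.02, k2_per_mw=0.05):
    k1 = k1_per_mw * irradiance
    k2 = k2_per_mw * irradiance
    # Closed-form solution of the sequential chain (valid for k1 != k2).
    living = math.exp(-k1 * t)
    vulnerable = k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))
    return 1.0 - living - vulnerable

def time_to_blockade(irradiance, threshold=0.9, dt=0.1):
    t = 0.0
    while blockade_fraction(irradiance, t) < threshold:
        t += dt
    return t

t_low = time_to_blockade(10.0)    # low irradiance (mW/cm^2)
t_high = time_to_blockade(100.0)  # high irradiance
```

    In the study the three states were fitted to the measured Ca2+ conduction data by conjugate-gradient optimization; here the chain is solved analytically just to show the irradiance/blockade-time trend.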

  18. The Earth's magnetosphere modeling and ISO standard

    NASA Astrophysics Data System (ADS)

    Alexeev, I.

    The empirical model developed by Tsyganenko (T96) is constructed by minimizing the rms deviation from the large magnetospheric data base (Fairfield et al., 1994), which contains Earth's magnetospheric magnetic field measurements accumulated during many years. The applicability of the T96 model is limited mainly to quiet conditions in the solar wind along the Earth orbit. But contrary to the internal planet's field, the external magnetospheric magnetic field sources are much more time-dependent. A reliable representation of the magnetic field is crucial in the framework of radiation belt modelling, especially for disturbed conditions. The last version of the Tsyganenko model has been constructed for a geomagnetic storm time interval. This version is based on a more accurate and physically consistent approach, in which each source of the magnetic field has its own relaxation timescale and a driving function based on an individual best-fit combination of the solar wind and IMF parameters. The same method has been used previously for paraboloid model construction. This method is based on a priori information about the structure of the global magnetospheric current systems. Each current system is included as a separate block (module) in the magnetospheric model. As was shown by spacecraft magnetometer data, there are three current systems which are the main contributors to the external magnetospheric magnetic field: magnetopause currents, the ring current, and the tail current sheet. The paraboloid model is based on an analytical solution of the Laplace…

  19. Cultural shift towards sustainability in the construction industry of Hong Kong.

    PubMed

    Yip Robin, C P; Poon, C S

    2009-08-01

    Sustainable development is forward-looking; it is a continuous mission for future developments of human society. A genuinely sustainable society is one that initiates developments in sustainable ways. The development of a genuinely sustainable society is supported by its citizens, who think and act according to a recognized code of conduct - the sustainable culture. Similar to other forms of culture, the sustainable culture of a society is not static, but changes over time. The changes found in a sustainable culture reflect the status of sustainability in a society, and these changes should be measured from time to time. The resulting measurement gives very important information for decision-makers, in government and in the private sector, to examine the magnitude of changes that have taken place in a given period of time. The results also enable them to review and adjust policies in order to better accommodate changes according to the trends of society. This paper provides a method, the T-model, to investigate and measure the extent of change of sustainable culture through two extensive surveys among participants in the construction industry of Hong Kong. The change in sustainable culture is reflected by the change in attitude and practice among construction participants, which can be seen in their performance in project development, design, and construction operations. The data on these changes were collected and converted to numerical scores; the T-model synthesized these scores and revealed the change of sustainable culture within the specific study time frame.

  20. Diffusion maps, clustering and fuzzy Markov modeling in peptide folding transitions

    NASA Astrophysics Data System (ADS)

    Nedialkova, Lilia V.; Amat, Miguel A.; Kevrekidis, Ioannis G.; Hummer, Gerhard

    2014-09-01

    Using the helix-coil transitions of alanine pentapeptide as an illustrative example, we demonstrate the use of diffusion maps in the analysis of molecular dynamics simulation trajectories. Diffusion maps and other nonlinear data-mining techniques provide powerful tools to visualize the distribution of structures in conformation space. The resulting low-dimensional representations help in partitioning conformation space, and in constructing Markov state models that capture the conformational dynamics. In an initial step, we use diffusion maps to reduce the dimensionality of the conformational dynamics of Ala5. The resulting pretreated data are then used in a clustering step. The identified clusters show excellent overlap with clusters obtained previously by using the backbone dihedral angles as input, with small—but nontrivial—differences reflecting torsional degrees of freedom ignored in the earlier approach. We then construct a Markov state model describing the conformational dynamics in terms of a discrete-time random walk between the clusters. We show that by combining fuzzy C-means clustering with a transition-based assignment of states, we can construct robust Markov state models. This state-assignment procedure suppresses short-time memory effects that result from the non-Markovianity of the dynamics projected onto the space of clusters. In a comparison with previous work, we demonstrate how manifold learning techniques may complement and enhance informed intuition commonly used to construct reduced descriptions of the dynamics in molecular conformation space.
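
    The discrete-time random walk between clusters can be estimated by counting lagged transitions and row-normalizing. A minimal sketch with crisp labels (the paper uses fuzzy C-means memberships with transition-based assignment, which this toy omits):

```python
import numpy as np

def transition_matrix(labels, n_states, lag=1):
    """Estimate a row-stochastic transition matrix from a discrete
    trajectory of cluster labels at the given lag time."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(labels[:-lag], labels[lag:]):
        counts[a, b] += 1.0
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0.0] = 1.0      # leave never-visited states as zero rows
    return counts / rows

traj = [0, 0, 1, 1, 2, 2, 0, 1, 2, 0]   # toy label sequence
T = transition_matrix(traj, n_states=3)
```

    Choosing the lag long enough that the projected dynamics are approximately Markovian is exactly the memory-effect issue the transition-based state assignment is designed to suppress.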

  1. Diffusion maps, clustering and fuzzy Markov modeling in peptide folding transitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nedialkova, Lilia V.; Amat, Miguel A.; Kevrekidis, Ioannis G., E-mail: yannis@princeton.edu, E-mail: gerhard.hummer@biophys.mpg.de

    Using the helix-coil transitions of alanine pentapeptide as an illustrative example, we demonstrate the use of diffusion maps in the analysis of molecular dynamics simulation trajectories. Diffusion maps and other nonlinear data-mining techniques provide powerful tools to visualize the distribution of structures in conformation space. The resulting low-dimensional representations help in partitioning conformation space, and in constructing Markov state models that capture the conformational dynamics. In an initial step, we use diffusion maps to reduce the dimensionality of the conformational dynamics of Ala5. The resulting pretreated data are then used in a clustering step. The identified clusters show excellent overlap with clusters obtained previously by using the backbone dihedral angles as input, with small—but nontrivial—differences reflecting torsional degrees of freedom ignored in the earlier approach. We then construct a Markov state model describing the conformational dynamics in terms of a discrete-time random walk between the clusters. We show that by combining fuzzy C-means clustering with a transition-based assignment of states, we can construct robust Markov state models. This state-assignment procedure suppresses short-time memory effects that result from the non-Markovianity of the dynamics projected onto the space of clusters. In a comparison with previous work, we demonstrate how manifold learning techniques may complement and enhance informed intuition commonly used to construct reduced descriptions of the dynamics in molecular conformation space.

  2. Diffusion maps, clustering and fuzzy Markov modeling in peptide folding transitions

    PubMed Central

    Nedialkova, Lilia V.; Amat, Miguel A.; Kevrekidis, Ioannis G.; Hummer, Gerhard

    2014-01-01

    Using the helix-coil transitions of alanine pentapeptide as an illustrative example, we demonstrate the use of diffusion maps in the analysis of molecular dynamics simulation trajectories. Diffusion maps and other nonlinear data-mining techniques provide powerful tools to visualize the distribution of structures in conformation space. The resulting low-dimensional representations help in partitioning conformation space, and in constructing Markov state models that capture the conformational dynamics. In an initial step, we use diffusion maps to reduce the dimensionality of the conformational dynamics of Ala5. The resulting pretreated data are then used in a clustering step. The identified clusters show excellent overlap with clusters obtained previously by using the backbone dihedral angles as input, with small—but nontrivial—differences reflecting torsional degrees of freedom ignored in the earlier approach. We then construct a Markov state model describing the conformational dynamics in terms of a discrete-time random walk between the clusters. We show that by combining fuzzy C-means clustering with a transition-based assignment of states, we can construct robust Markov state models. This state-assignment procedure suppresses short-time memory effects that result from the non-Markovianity of the dynamics projected onto the space of clusters. In a comparison with previous work, we demonstrate how manifold learning techniques may complement and enhance informed intuition commonly used to construct reduced descriptions of the dynamics in molecular conformation space. PMID:25240340

  3. A Real-Time Construction Safety Monitoring System for Hazardous Gas Integrating Wireless Sensor Network and Building Information Modeling Technologies

    PubMed Central

    Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng

    2018-01-01

    In recent years, many studies have focused on the application of advanced technology to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates databases and geometry into a digital model that provides visualization throughout the construction lifecycle. This paper integrates BIM and WSN into a unique system that enables the construction site to visually monitor safety status via a spatial, colored interface and remove any hazardous gas automatically. Many wireless sensor nodes were placed on an underground construction site to collect hazardous gas levels and environmental conditions (temperature and humidity); in any region where an abnormal status is detected, the BIM model highlights the region, and an alarm and ventilator on site start automatically to warn of and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions are summarized for further applications.
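
    The zone-alerting logic can be illustrated with a small sketch; the gas names and threshold values below are hypothetical placeholders, not those of the deployed system:

```python
# Hypothetical alarm thresholds in ppm; real limits depend on the gas
# and on local safety regulations.
THRESHOLDS = {"CO": 35.0, "CH4": 1000.0}

def classify_zone(readings, thresholds=THRESHOLDS):
    """Return the alarm status of a zone from its latest sensor readings.
    'alert' would trigger the on-site ventilator and color the BIM zone."""
    for gas, ppm in readings.items():
        limit = thresholds.get(gas)
        if limit is not None and ppm >= limit:
            return "alert"
    return "normal"
```

    In the described system this classification would run per fire-propagation zone, with the BIM model providing the spatial mapping from sensor node to zone.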

  4. Investigate attractiveness of toll roads.

    DOT National Transportation Integrated Search

    2015-03-01

    HOT facilities are used as a solution for congestion mitigation instead of constructing or expanding the capacity of : existing roadways. Although toll roads modeling has been researched for a long time, High Occupancy Toll (HOT) : modeling is relati...

  5. An inexpensive, easily constructed, reusable task trainer for simulating ultrasound-guided pericardiocentesis.

    PubMed

    Zerth, Herb; Harwood, Robert; Tommaso, Laura; Girzadas, Daniel V

    2012-12-01

    Pericardiocentesis is a low-frequency, high-risk procedure integral to the practice of emergency medicine. Ultrasound-guided pericardiocentesis is the preferred technique for providing this critical intervention. Traditionally, emergency physicians learned pericardiocentesis in real time, at the bedside, on critically ill patients. Medical education is moving toward simulation for training and assessment of procedures such as pericardiocentesis because it allows learners to practice time-sensitive skills without risk to patient or learner. The retail market for pericardiocentesis practice models is limited and expensive. We have developed an ultrasound-guided pericardiocentesis task trainer that allows the physician to insert a needle under ultrasound guidance, pierce the "pericardial sac," and aspirate "blood." Our model can be simply constructed in a home kitchen, with an overall preparation time of 1 h, and costs $20.00 (US, 2008). Materials needed for construction include 16 ounces of plain gelatin, one large balloon, one golf ball, food coloring, non-stick cooking spray, one wooden cooking skewer, surgical iodine solution, and a 4-quart plastic food storage container. Refrigeration and a heat source for cooking are also required. Once prepared, the model is usable for 2 weeks at room temperature and may be preserved an additional week if refrigerated. When the model shows signs of wear, it can easily be remade by recycling the existing materials. The self-made model was well liked by training staff, owing to the ready availability of a simulation model, and by learners of the technique, who felt more at ease performing pericardiocentesis on a live patient. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. A hydroelastic model of hydrocephalus

    NASA Astrophysics Data System (ADS)

    Smillie, Alan; Sobey, Ian; Molnar, Zoltan

    2005-09-01

    We combine elements of poroelasticity and of fluid mechanics to construct a mathematical model of the human brain and ventricular system. The model is used to study hydrocephalus, a pathological condition in which the normal flow of the cerebrospinal fluid is disturbed, causing the brain to become deformed. Our model extends recent work in this area by including flow through the aqueduct, by incorporating boundary conditions that we believe accurately represent the anatomy of the brain and by including time dependence. This enables us to construct a quantitative model of the onset, development and treatment of this condition. We formulate and solve the governing equations and boundary conditions for this model and give results that are relevant to clinical observations.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, D. I.; Han, S. H.

    A PSA analyst has traditionally determined fire-induced component failure modes manually and modeled them into the PSA logic. These are difficult and time-consuming tasks, as they require much information and many events must be modeled. KAERI has been developing the IPRO-ZONE (interface program for constructing zone effect table) to facilitate fire PSA work in identifying and modeling fire-induced component failure modes, and to construct a one-top fire event PSA model. With the output of the IPRO-ZONE, the AIMS-PSA, and an internal event one-top PSA model, a one-top fire events PSA model is automatically constructed. The outputs of the IPRO-ZONE include information on fire zones/fire scenarios, fire propagation areas, equipment failure modes affected by a fire, internal PSA basic events corresponding to fire-induced equipment failure modes, and fire events to be modeled. This paper introduces the IPRO-ZONE and its application to the fire PSA of Ulchin Unit 3 and SMART (System-integrated Modular Advanced Reactor). (authors)

  8. Development of a physiologically based pharmacokinetic model for bisphenol A in pregnant mice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawamoto, Yuko; Matsuyama, Wakoto; Wada, Masahiro

    Bisphenol A (BPA) is a weakly estrogenic monomer used to produce polymers for food contact and other applications, so there is potential for oral exposure of humans to trace amounts via ingestion. To date, no physiologically based pharmacokinetic (PBPK) model has been located for BPA in pregnant mice with or without fetuses. An estimate by a mathematical model is essential, since information on humans is difficult to obtain experimentally. The PBPK model was constructed based on the pharmacokinetic data of our experiment following single oral administration of BPA to pregnant mice. The risk assessment of BPA effects on the development of human offspring is an important issue. There have been limited data on the exposure level of human fetuses to BPA (e.g., BPA concentration in cord blood), and no information is available on the pharmacokinetics of BPA in humans with or without fetuses. In the present study, we developed a PBPK model describing the pharmacokinetics of BPA in a pregnant mouse with the prospect of future extrapolation to humans. The model was constructed based on the pharmacokinetic data of an experiment we executed on pregnant mice following single oral administration of BPA. The model could describe the rapid transfer of BPA through the placenta to the fetus and the slow disappearance from fetuses. The simulated time courses after three repeated oral administrations of BPA fitted well with the experimental data, and the simulation for a 10 times lower dose was also consistent with the experiment. This suggests that the PBPK model for BPA in pregnant mice was successfully verified and is highly promising for extrapolation to humans, who are expected to be exposed more chronically to lower doses.
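
    In caricature, a PBPK model of this kind reduces to coupled first-order compartments. A minimal maternal/fetal sketch with invented rate constants (the actual model has many more physiological compartments and fitted parameters):

```python
def simulate_pbpk(dose, ka, k_mf, k_fm, ke, t_end=24.0, dt=0.001):
    """Minimal maternal/fetal two-compartment sketch (amounts; rates in 1/h):
    first-order absorption from the gut (ka), bidirectional placental
    transfer (k_mf maternal->fetal, k_fm fetal->maternal), and maternal
    elimination (ke). All parameter values here are illustrative only."""
    gut, mother, fetus = dose, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        absorbed = ka * gut * dt
        to_fetus = k_mf * mother * dt
        back = k_fm * fetus * dt
        eliminated = ke * mother * dt
        gut -= absorbed
        mother += absorbed + back - to_fetus - eliminated
        fetus += to_fetus - back
    return gut, mother, fetus
```

    In this caricature, the reported slow disappearance from the fetal compartment would correspond to a back-transfer rate k_fm that is small relative to the other rates.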

  9. The Full Scale Seal Experiment - A Seal Industrial Prototype for Cigeo - 13106

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lebon, P.; Bosgiraud, J.M.; Foin, R.

    2013-07-01

    The Full Scale Seal (FSS) Experiment is one of various experiments implemented by Andra, within the frame of the Cigeo (the French Deep Geological Repository) Project development, to demonstrate the technical construction feasibility and performance of the seals to be constructed at the time of progressive closure of Repository components (shafts, ramps, drifts, disposal vaults). FSS is built inside a drift model fabricated on the surface for the purpose. Prior to the scale 1:1 seal construction test, various design tasks are scheduled. They include engineering work on the drift model to make it fit the experimental needs; on the various work sequences anticipated for the swelling clay core emplacement and the concrete containment plugs construction; on the specialized handling tools (and installation equipment) manufactured and delivered for the purpose; and of course on the various swelling clay materials and low-pH (below 11) concrete formulations developed for the application. The engineering of the 'seal-as-built' commissioning means (tools and methodology) must also be dealt with. The FSS construction experiment is a technological demonstrator and is thus not focused on phenomenological survey (nor, by consequence, on performance and behaviour forecast); as such, no hydration (forced or natural) is planned. However, the FSS implementation (in particular via the construction and commissioning activities carried out) is a key milestone in view of supporting phenomenological extrapolation in time and scale. The FSS experiment also allows for qualifying the commissioning methods of a real sealing system in the Repository, as built, at the time of industrial operations. (authors)

  10. Constructing Alternate Assessment Cohorts: An Oregon Perspective. Research Brief 3

    ERIC Educational Resources Information Center

    Saven, Jessica L.; Farley, Dan; Tindal, Gerald

    2013-01-01

    Longitudinally modeling the growth of students with significant cognitive disabilities (SWSCDs) on alternate assessments based on alternate achievement standards (AA-AAS) presents many challenges for states. The number of students in Grades 3-8 who remain in a cohort group varies over time, depending on the methods used to construct the…

  11. Reliable results from stochastic simulation models

    Treesearch

    Donald L., Jr. Gochenour; Leonard R. Johnson

    1973-01-01

    Development of a computer simulation model is usually done without fully considering how long the model should run (e.g., computer time) before the results are reliable. However, construction of confidence intervals (CI) about critical output parameters from the simulation model makes it possible to determine the point at which model results are reliable. If the results are...
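
    The CI-based stopping idea can be sketched as: keep adding independent replications until the confidence half-width is small relative to the estimated mean. The tolerance, z-value, and caps below are illustrative choices, not the report's procedure:

```python
import math
import random

def run_until_precise(sample, rel_halfwidth=0.05, z=1.96, min_n=30, max_n=100_000):
    """Draw replications from `sample` until the approximate 95% CI
    half-width drops below rel_halfwidth * |mean| (or max_n is hit).
    Returns (estimated mean, number of replications used). Note the
    relative criterion is unsuitable when the true mean is near zero."""
    xs = []
    while len(xs) < max_n:
        xs.append(sample())
        n = len(xs)
        if n >= min_n:
            mean = sum(xs) / n
            var = sum((x - mean) ** 2 for x in xs) / (n - 1)
            half = z * math.sqrt(var / n)
            if half <= rel_halfwidth * abs(mean):
                return mean, n
    return sum(xs) / len(xs), len(xs)

random.seed(0)
# Stand-in for one replication of a stochastic simulation model:
mean, n = run_until_precise(lambda: random.gauss(10.0, 2.0))
```

    For strongly autocorrelated within-run output, batch means or independent replications with different seeds would be needed to justify the independence assumption behind the interval.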

  12. A Multivariate Multilevel Approach to the Modeling of Accuracy and Speed of Test Takers

    ERIC Educational Resources Information Center

    Klein Entink, R. H.; Fox, J. P.; van der Linden, W. J.

    2009-01-01

    Response times on test items are easily collected in modern computerized testing. When collecting both (binary) responses and (continuous) response times on test items, it is possible to measure the accuracy and speed of test takers. To study the relationships between these two constructs, the model is extended with a multivariate multilevel…

  13. Scalability of Semi-Implicit Time Integrators for Nonhydrostatic Galerkin-based Atmospheric Models on Large Scale Cluster

    DTIC Science & Technology

    2011-01-01

    present performance statistics to explain the scalability behavior. Keywords: atmospheric models, time integrators, MPI, scalability, performance. ...across inter-element boundaries. Basis functions are constructed as tensor products of Lagrange polynomials, ψ_i(x) = h_α(ξ) ⊗ h_β(η) ⊗ h_γ(ζ), where h_α

  14. The role of logistic constraints in termite construction of chambers and tunnels.

    PubMed

    Ladley, Dan; Bullock, Seth

    2005-06-21

    In previous models of the building behaviour of termites, physical and logistic constraints that limit the movement of termites and pheromones have been neglected. Here, we present an individual-based model of termite construction that includes idealized constraints on the diffusion of pheromones, the movement of termites, and the integrity of the architecture that they construct. The model allows us to explore the extent to which the results of previous idealized models (typically realised in one or two dimensions via a set of coupled partial differential equations) generalize to a physical, 3-D environment. Moreover we are able to investigate new processes and architectures that rely upon these features. We explore the role of stigmergic recruitment in pillar formation, wall building, and the construction of royal chambers, tunnels and intersections. In addition, for the first time, we demonstrate the way in which the physicality of partially built structures can help termites to achieve efficient tunnel structures and to establish and maintain entrances in royal chambers. As such we show that, in at least some cases, logistic constraints can be important or even necessary in order for termites to achieve efficient, effective constructions.
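
    The stigmergic positive feedback at the heart of such models can be caricatured in one dimension: pheromone diffuses and decays, and each deposit is biased toward sites that already carry pheromone. This is only an illustrative toy, not the 3-D individual-based model with physical constraints described in the paper:

```python
import random

def step(pheromone, D=0.1, decay=0.05, deposit=1.0, rng=random):
    """One update of a toy 1-D stigmergy lattice: pheromone diffuses to
    neighbours (rate D) and decays, then one termite deposits material
    (with fresh pheromone) at a site chosen with probability proportional
    to the local pheromone level, reinforcing already-busy sites."""
    n = len(pheromone)
    new = [p + D * (pheromone[(i - 1) % n] + pheromone[(i + 1) % n] - 2.0 * p)
           - decay * p
           for i, p in enumerate(pheromone)]
    r = rng.random() * sum(new)          # roulette-wheel site selection
    acc, site = 0.0, n - 1
    for i, p in enumerate(new):
        acc += p
        if r <= acc:
            site = i
            break
    new[site] += deposit
    return new

random.seed(1)
field = [1.0] * 20
field[5] = 5.0                           # an initial cue seeds a "pillar"
for _ in range(200):
    field = step(field)
```

    The paper's point is precisely that such idealized dynamics can change qualitatively once termite movement, pheromone diffusion, and structural integrity are physically constrained in three dimensions.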

  15. Predicting language outcomes for children learning AAC: Child and environmental factors

    PubMed Central

    Brady, Nancy C.; Thiemann-Bourque, Kathy; Fleming, Kandace; Matthews, Kris

    2014-01-01

    Purpose To investigate a model of language development for nonverbal preschool age children learning to communicate with AAC. Method Ninety-three preschool children with intellectual disabilities were assessed at Time 1, and 82 of these children were assessed one year later at Time 2. The outcome variable was the number of different words the children produced (with speech, sign or SGD). Children’s intrinsic predictor for language was modeled as a latent variable consisting of cognitive development, comprehension, play, and nonverbal communication complexity. Adult input at school and home, and amount of AAC instruction were proposed mediators of vocabulary acquisition. Results A confirmatory factor analysis revealed that measures converged as a coherent construct and an SEM model indicated that the intrinsic child predictor construct predicted different words children produced. The amount of input received at home but not at school was a significant mediator. Conclusions Our hypothesized model accurately reflected a latent construct of Intrinsic Symbolic Factor (ISF). Children who evidenced higher initial levels of ISF and more adult input at home produced more words one year later. Findings support the need to assess multiple child variables, and suggest interventions directed to the indicators of ISF and input. PMID:23785187

  16. Scanning of speechless comics changes spatial biases in mental model construction.

    PubMed

    Román, Antonio; Flumini, Andrea; Santiago, Julio

    2018-08-05

    The mental representation of both time and number shows lateral spatial biases, which can be affected by habitual reading and writing direction. However, this effect is in place before children begin to read. One potential early cause is the experiences of looking at picture books together with a carer, as those images also follow the directionality of the script. What is the underlying mechanism for this effect? In the present study, we test the possibility that such experiences induce spatial biases in mental model construction, a mechanism which is a good candidate to induce the biases observed with numbers and times. We presented a speechless comic in either standard (left-to-right) or mirror-reversed (right-to-left) form to adult Spanish participants. We then asked them to draw the scene depicted by sentences like 'the square is between the cross and the circle'. The position of the lateral objects in these drawings reveals the spatial biases at work when building mental models in working memory. Under conditions of highly consistent directionality, the mirror comic changed pre-existing lateral biases. Processes of mental model construction in working memory stand as a potential mechanism for the generation of spatial biases for time and number. This article is part of the theme issue 'Varieties of abstract concepts: development, use and representation in the brain'. © 2018 The Author(s).

  17. A model for methane production in sewers.

    PubMed

    Chaosakul, Thitirat; Koottatep, Thammarat; Polprasert, Chongrak

    2014-09-19

    Most sewers in developing countries are combined sewers that receive stormwater and effluent from septic tanks or cesspools of households and buildings. Although the wastewater strength in these sewers is usually lower than in developed countries, due to improper construction and maintenance the hydraulic retention time (HRT) can be relatively long, resulting in considerable greenhouse gas (GHG) production. This study proposed an empirical model to predict the quantity of methane production in gravity-flow sewers based on relevant parameters such as the surface-area-to-volume ratio (A/V) of the sewer, hydraulic retention time (HRT), and wastewater temperature. The model was developed from field survey data of gravity-flow sewers located in a peri-urban area of central Thailand and validated with field data from a sewer system of the Gold Coast area, Queensland, Australia. Application of this model to improve the construction and maintenance of gravity-flow sewers, so as to minimize GHG production and reduce global warming, is presented.
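
    An empirical model over these predictors can be fitted by ordinary least squares on the logarithm of methane production. The functional form and all coefficients below are invented solely to generate synthetic data for the sketch; they are not the paper's fitted values:

```python
import numpy as np

def fit_methane_model(av, hrt, temp, ch4):
    """Ordinary least squares on ln(CH4) against the predictors used by
    the empirical sewer model: A/V ratio, HRT, and temperature."""
    X = np.column_stack([np.ones_like(av), av, hrt, temp])
    coef, *_ = np.linalg.lstsq(X, np.log(ch4), rcond=None)
    return coef

# Synthetic data generated from invented "true" coefficients:
rng = np.random.default_rng(0)
av = rng.uniform(2.0, 8.0, 50)        # surface-area-to-volume ratio
hrt = rng.uniform(0.5, 6.0, 50)       # hydraulic retention time, h
temp = rng.uniform(20.0, 32.0, 50)    # wastewater temperature, deg C
true = np.array([-3.0, 0.15, 0.25, 0.06])
ch4 = np.exp(true[0] + true[1] * av + true[2] * hrt + true[3] * temp)

coef = fit_methane_model(av, hrt, temp, ch4)
```

    With noise-free synthetic data the fit recovers the generating coefficients exactly; with field data, the residuals would drive the validation step described in the abstract.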

  18. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  19. Integrable Floquet dynamics, generalized exclusion processes and "fused" matrix ansatz

    NASA Astrophysics Data System (ADS)

    Vanicat, Matthieu

    2018-04-01

    We present a general method for constructing integrable stochastic processes, with two-step discrete time Floquet dynamics, from the transfer matrix formalism. The models can be interpreted as a discrete time parallel update. The method can be applied for both periodic and open boundary conditions. We also show how the stationary distribution can be built as a matrix product state. As an illustration we construct parallel discrete time dynamics associated with the R-matrix of the SSEP and of the ASEP, and provide the associated stationary distributions in a matrix product form. We use this general framework to introduce new integrable generalized exclusion processes, where a fixed number of particles is allowed on each lattice site, in contrast to the (single particle) exclusion process models. They are constructed using the fusion procedure of R-matrices (and K-matrices for open boundary conditions) for the SSEP and ASEP. We develop a new method, which we call the "fused" matrix ansatz, to build explicitly the stationary distribution in a matrix product form. We use this algebraic structure to compute physical observables such as the correlation functions and the mean particle current.

  20. Modeling biological pathway dynamics with timed automata.

    PubMed

    Schivo, Stefano; Scholma, Jetse; Wanders, Brend; Urquidi Camacho, Ricardo A; van der Vet, Paul E; Karperien, Marcel; Langerak, Rom; van de Pol, Jaco; Post, Janine N

    2014-05-01

    Living cells are constantly subjected to a plethora of environmental stimuli that require integration into an appropriate cellular response. This integration takes place through signal transduction events that form tightly interconnected networks. The understanding of these networks requires capturing their dynamics through computational support and models. ANIMO (Analysis of Networks with Interactive Modeling) is a tool that enables the construction and exploration of executable models of biological networks, helping to derive hypotheses and to plan wet-lab experiments. The tool is based on the formalism of Timed Automata, which can be analyzed via the UPPAAL model checker. Thanks to Timed Automata, we can provide a formal semantics for the domain-specific language used to represent signaling networks. This enforces precision and uniformity in the definition of signaling pathways, contributing to the integration of isolated signaling events into complex network models. We propose an approach to discretization of reaction kinetics that allows us to efficiently use UPPAAL as the computational engine to explore the dynamic behavior of the network of interest. A user-friendly interface hides the use of Timed Automata from the user, while keeping the expressive power intact. Abstraction to single-parameter kinetics speeds up construction of models that remain faithful enough to provide meaningful insight. The resulting dynamic behavior of the network components is displayed graphically, allowing for an intuitive and interactive modeling experience.
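
    The discretization of activity into a small number of levels can be caricatured as a synchronous update rule. This stand-in deliberately ignores ANIMO's actual timed-automata semantics (per-reaction clocks checked in UPPAAL); it only illustrates the idea of coarse activity levels driven by signed interactions:

```python
def step_network(levels, interactions, n_levels=4):
    """One synchronous update of a crudely discretized signaling network:
    each node's activity level (0 .. n_levels-1) moves one step up or
    down with the summed sign of its currently active inputs.
    `interactions` is a list of (source, target, sign) tuples."""
    new = dict(levels)
    for target in levels:
        drive = sum(sign for src, tgt, sign in interactions
                    if tgt == target and levels[src] > 0)
        if drive > 0:
            new[target] = min(n_levels - 1, levels[target] + 1)
        elif drive < 0:
            new[target] = max(0, levels[target] - 1)
    return new

# Hypothetical two-step activation cascade A -> B -> C:
levels = {"A": 3, "B": 0, "C": 0}
interactions = [("A", "B", +1), ("B", "C", +1)]
levels = step_network(step_network(levels, interactions), interactions)
```

    The signal visibly takes one update per cascade step to propagate, which is the kind of timing behavior the Timed Automata formalism captures rigorously with clocks and kinetic parameters.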

  1. Nonlinear techniques for forecasting solar activity directly from its time series

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.; Roszman, L.; Cooley, J.

    1992-01-01

    Numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series are presented. This approach makes it possible to extract dynamical invariants of the system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
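
    The phase-space reconstruction step can be sketched with a time-delay embedding plus a nearest-neighbor predictor. This is an assumption-laden toy in the same spirit, not the authors' specific method; embedding dimension and delay are arbitrary choices here:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: row r is the state vector
    [x[r], x[r+tau], ..., x[r+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def nn_forecast(x, dim=3, tau=1):
    """Predict the next value by locating the past embedded state nearest
    to the current one and returning that state's observed successor."""
    emb = delay_embed(np.asarray(x, dtype=float), dim, tau)
    current, history = emb[-1], emb[:-1]
    j = int(np.argmin(np.linalg.norm(history - current, axis=1)))
    return x[j + (dim - 1) * tau + 1]
```

    The same embedded point cloud is the raw material for estimating Lyapunov exponents and attractor dimension, the dynamical invariants discussed in the abstract.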

  2. Nonlinear techniques for forecasting solar activity directly from its time series

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.; Roszman, L.; Cooley, J.

    1993-01-01

    This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of the system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.

  3. An empirical model of L-band scintillation S4 index constructed by using FORMOSAT-3/COSMIC data

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Ping; Bilitza, Dieter; Liu, Jann-Yenq; Caton, Ronald; Chang, Loren C.; Yeh, Wen-Hao

    2017-09-01

    Modern society relies heavily on the Global Navigation Satellite System (GNSS) technology for applications such as satellite communication, navigation, and positioning on the ground and/or aviation in the troposphere/stratosphere. However, ionospheric scintillations can severely impact GNSS systems and their related applications. In this study, a global empirical ionospheric scintillation model is constructed with S4-index data obtained by the FORMOSAT-3/COSMIC (F3/C) satellites during 2007-2014 (hereafter referred to as the F3CGS4 model). This model describes the S4-index as a function of local time, day of year, dip-latitude, and solar activity using the index PF10.7. The model reproduces the F3/C S4-index observations well, and yields good agreement with ground-based reception of satellite signals. This confirms that the constructed model can be used to forecast global L-band scintillations on the ground and in the near surface atmosphere.
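
    A parameterization over these four drivers (local time, day of year, dip latitude, PF10.7) can be sketched with diurnal and annual harmonics, a Gaussian dip-latitude envelope, and linear solar-activity scaling. Both the functional form and the coefficient values below are hypothetical illustrations, not the F3CGS4 model:

```python
import math

# Illustrative coefficient values only; the real F3CGS4 coefficients differ.
COEF = {"a_lt": 0.5, "lt0": 21.0, "a_doy": 0.2, "doy0": 80.0,
        "lat_w": 15.0, "s0": 0.1, "s1": 0.002}

def s4_model(lt_hours, doy, dip_lat_deg, pf107, coef=COEF):
    """Hypothetical S4 parameterization: diurnal and annual harmonics,
    a Gaussian dip-latitude envelope, and linear PF10.7 scaling."""
    diurnal = 1.0 + coef["a_lt"] * math.cos(
        2.0 * math.pi * (lt_hours - coef["lt0"]) / 24.0)
    annual = 1.0 + coef["a_doy"] * math.cos(
        2.0 * math.pi * (doy - coef["doy0"]) / 365.25)
    lat_env = math.exp(-((dip_lat_deg / coef["lat_w"]) ** 2))
    solar = coef["s0"] + coef["s1"] * pf107
    return max(0.0, solar * diurnal * annual * lat_env)
```

    With these placeholder coefficients the sketch reproduces the expected qualitative behavior: a post-sunset peak near the dip equator that strengthens with solar activity.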

  4. Research on strategy marine noise map based on i4ocean platform: Constructing flow and key approach

    NASA Astrophysics Data System (ADS)

    Huang, Baoxiang; Chen, Ge; Han, Yong

    2016-02-01

    Noise levels in the marine environment have raised extensive concern in the scientific community. The research is carried out on the i4Ocean platform following a workflow of ocean noise model integration; noise data extraction, processing, visualization, and interpretation; and ocean noise map construction and publication. For the convenience of numerical computation, and based on the characteristics of the ocean noise field, a hybrid propagation model tied to spatial location is suggested: the normal-mode K/I model is used for the far field and the ray-method CANARY model for the near field. Visualizing marine ambient noise data is critical to understanding and predicting marine noise for relevant decision making. The marine noise map is constructed on a virtual ocean scene. The systematic marine noise visualization framework includes preprocessing, coordinate transformation, interpolation, and rendering. The simulation of ocean noise depends on a realistic sea surface, so the dynamic water simulation grid was improved with GPU fusion to achieve a seamless combination with the ocean noise visualization results. Profile and spherical visualizations covering the space and time dimensions are also provided for the vertical field characteristics of ocean ambient noise. Finally, the marine noise map can be published with grid pre-processing and multistage cache technology to better serve the public.

  5. Real-Time MENTAT programming language and architecture

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.

    1989-01-01

    Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.

  6. Validation of in vitro assays in three-dimensional human dermal constructs.

    PubMed

    Idrees, Ayesha; Chiono, Valeria; Ciardelli, Gianluca; Shah, Siegfried; Viebahn, Richard; Zhang, Xiang; Salber, Jochen

    2018-05-01

    Three-dimensional cell culture systems are urgently needed for cytocompatibility testing of biomaterials. This work aimed at the development of three-dimensional in vitro dermal skin models and their optimization for cytocompatibility evaluation. Initially, a "murine in vitro dermal construct" based on L929 cells was generated, leading to the development of a "human in vitro dermal construct" consisting of normal human dermal fibroblasts in rat tail tendon collagen type I. To assess cell viability, different assays (CellTiter-Blue®, RealTime-Glo™ MT, and CellTiter-Glo®; Promega) were evaluated to identify the assay best suited to the respective cell type and three-dimensional system. Z-stack imaging (Live/Dead and Phalloidin/DAPI, Promokine) was performed to visualize normal human dermal fibroblasts inside the matrix, revealing filopodia-like morphology and a uniform distribution of the cells within the matrix. CellTiter-Glo was found to be the optimal cell viability assay among those analyzed. The CellTiter-Blue reagent affected the cell morphology of normal human dermal fibroblasts (unlike L929), suggesting an interference with cell biological activity and resulting in less reliable viability data. On the other hand, RealTime-Glo provided a linear signal only at a very low cell density, which made this assay unsuitable for this system. CellTiter-Glo was adapted to the three-dimensional dermal construct by optimizing the "shaking time" to enhance reagent penetration and maximize adenosine triphosphate release; viability values were 2.4 times higher after shaking for 60 min than for 5 min. In addition, viability results showed that cells were viable inside the matrix. This model could be further advanced with more layers of skin to give a full-thickness model.

  7. Architectural Large Constructed Environment. Modeling and Interaction Using Dynamic Simulations

    NASA Astrophysics Data System (ADS)

    Fiamma, P.

    2011-09-01

    How can the simulation coming from a large-size data model be used for architectural design? The topic relates to the phase that usually comes after the acquisition of the data, during the construction of the model and especially after, when designers must interact with the simulation in order to develop and verify their ideas. In this case study, the concept of interaction includes the concept of real-time "flows". The work develops contents and results that can be part of the larger debate about the current connection between "architecture" and "movement". The focus of the work is to realize a collaborative and participative virtual environment in which different specialist actors, the client, and the final users can share knowledge, targets, and constraints to better achieve the aimed result. The goal is to use a dynamic micro-simulation digital resource that allows all the actors to explore the model in a powerful and realistic way and to have a new type of interaction in a complex architectural scenario. On the one hand, the work represents a base of knowledge that can be implemented more and more; on the other hand, it represents an attempt to understand large constructed architecture simulation as a way of life, a way of being in time and space. The architectural design before, and the architectural fact after, both happen in a sort of "Spatial Analysis System". The way is open to offer to this "system" knowledge and theories that can support architectural design work at every application and scale. Architecture is thus a spatial configuration, one that can also be reconfigured through designing.

  8. Construction of Optimally Reduced Empirical Model by Spatially Distributed Climate Data

    NASA Astrophysics Data System (ADS)

    Gavrilov, A.; Mukhin, D.; Loskutov, E.; Feigin, A.

    2016-12-01

    We present an approach to empirical reconstruction of the evolution operator in stochastic form from space-distributed time series. The main problem in empirical modeling lies in choosing appropriate phase variables which can efficiently reduce the dimension of the model at minimal loss of information about the system's dynamics, which consequently leads to a more robust model and better reconstruction quality. For this purpose we incorporate two key steps in the model. The first step is a standard preliminary reduction of the observed time series dimension by decomposition via a certain empirical basis (e.g., an empirical orthogonal function basis or its nonlinear or spatio-temporal generalizations). The second step is construction of an evolution operator over the principal components (PCs), the time series obtained by the decomposition. In this step we introduce a new way of reducing the dimension of the embedding in which the evolution operator is constructed. It is based on choosing proper combinations of delayed PCs to take into account the most significant spatio-temporal couplings. The evolution operator is sought as a nonlinear random mapping parameterized using artificial neural networks (ANNs). A Bayesian approach is used to learn the model and to find the optimal hyperparameters: the number of PCs, the dimension of the embedding, and the degree of nonlinearity of the ANN. The results of applying the method to climate data (sea surface temperature, sea level pressure), and a comparison with the same method based on a non-reduced embedding, are presented. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS).
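The first (reduction) step is conventionally an EOF decomposition, which can be sketched as an SVD of the anomaly field; the data below are random placeholders, not the climate fields used in the study.

```python
import numpy as np

def eof_decompose(field, n_modes):
    """Decompose a (time, space) data matrix into EOFs (spatial patterns)
    and principal components (their time series) via SVD of the anomalies."""
    anomaly = field - field.mean(axis=0)        # remove the time mean at each point
    U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
    pcs = U[:, :n_modes] * s[:n_modes]          # PC time series of each mode
    eofs = Vt[:n_modes]                         # spatial patterns
    return pcs, eofs

rng = np.random.default_rng(0)
field = rng.standard_normal((120, 50))          # 120 time steps, 50 grid points
pcs, eofs = eof_decompose(field, n_modes=5)
print(pcs.shape, eofs.shape)                    # (120, 5) (5, 50)
```

The evolution operator is then trained on (combinations of delayed) columns of `pcs` rather than on the full field.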

  9. Exploring Causal Models of Educational Achievement.

    ERIC Educational Resources Information Center

    Parkerson, Jo Ann; And Others

    1984-01-01

    This article evaluates five causal models of educational productivity applied to learning science in a sample of 882 fifth through eighth graders. Each model explores the relationship between achievement and a combination of eight constructs: home environment, peer group, media, ability, social environment, time on task, motivation, and…

  10. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for the complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
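A local model as described - states, an event alphabet, a partial transition function, and per-event durations - can be sketched as follows; the submachine, its states, and its timings are invented for illustration.

```python
# A minimal sketch of a DEDS "local model": states, an event alphabet,
# a partial transition function, and per-event durations. All names here
# are illustrative, not from the paper.

class LocalModel:
    def __init__(self, initial, transitions, durations):
        self.state = initial
        self.transitions = transitions   # (state, event) -> next state (partial)
        self.durations = durations       # event -> time required for the event
        self.clock = 0.0

    def fire(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(f"event {event!r} not enabled in state {self.state!r}")
        self.state = self.transitions[key]
        self.clock += self.durations[event]

robot = LocalModel(
    initial="idle",
    transitions={("idle", "grasp"): "holding", ("holding", "place"): "idle"},
    durations={"grasp": 2.0, "place": 3.0},
)
robot.fire("grasp")
robot.fire("place")
print(robot.state, robot.clock)   # idle 5.0
```

A global model would compose several such automata, restricting which events may fire in parallel according to the interaction constraints.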

  11. Topological BF Theories

    NASA Astrophysics Data System (ADS)

    Sǎraru, Silviu-Constantin

    Topological field theories originate in the papers of Schwarz and Witten. Initially, Schwarz showed that one of the topological invariants, namely the Ray-Singer torsion, can be represented as the partition function of a certain quantum field theory. Subsequently, Witten constructed a framework for understanding Morse theory in terms of supersymmetric quantum mechanics. These two constructions represent the prototypes of all topological field theories. The model used by Witten has been applied to classical index theorems and, moreover, suggested some generalizations that led to new mathematical results on holomorphic Morse inequalities. Starting with these results, further developments in the domain of topological field theories have been achieved. The Becchi-Rouet-Stora-Tyutin (BRST) symmetry allowed for a new definition of topological field theories as theories whose BRST-invariant Hamiltonian is also BRST-exact. An important class of topological theories of Schwarz type is the class of BF models. This type of model describes three-dimensional quantum gravity and is useful in the study of four-dimensional quantum gravity in the Ashtekar-Rovelli-Smolin formulation. Two-dimensional BF models are correlated to the Poisson sigma models of various two-dimensional gravities. Poisson sigma models, including their relationship to two-dimensional gravity and their classical solutions, have been intensively studied in the literature. In this thesis we approach the problem of constructing some classes of interacting BF models in the context of the BRST formalism. In view of this, we use the method of deformation of the BRST charge and of the BRST-invariant Hamiltonian. Both methods rely on specific techniques of local BRST cohomology.
The main hypotheses under which we construct the above-mentioned interactions are: space-time locality, Poincare invariance, smoothness of the deformations in the coupling constant, and preservation of the number of derivatives on each field. The first two hypotheses imply that the resulting interacting theory must be local in space-time and Poincare invariant. The smoothness of the deformations means that the deformed objects that contribute to the construction of interactions must be smooth in the coupling constant and reduce to the objects corresponding to the free theory in the zero limit of the coupling constant. The preservation of the number of derivatives on each field implies two aspects that must be simultaneously fulfilled: (i) the differential order of each free field equation must coincide with that of the corresponding interacting field equation; (ii) the maximum number of space-time derivatives in the interaction vertices cannot exceed the maximum number of derivatives in the free Lagrangian. The main results obtained can be synthesized as: obtaining self-interactions for certain classes of BF models; generation of couplings between some classes of BF theories and matter theories; and construction of interactions between a class of BF models and a system of massless vector fields.

  12. Constructing Optimal Coarse-Grained Sites of Huge Biomolecules by Fluctuation Maximization.

    PubMed

    Li, Min; Zhang, John Zenghui; Xia, Fei

    2016-04-12

    Coarse-grained (CG) models are valuable tools for the study of functions of large biomolecules on large length and time scales. The definition of CG representations for huge biomolecules is always a formidable challenge. In this work, we propose a new method called fluctuation maximization coarse-graining (FM-CG) to construct the CG sites of biomolecules. The defined residual in FM-CG converges to a maximal value as the number of CG sites increases, allowing an optimal CG model to be rigorously defined on the basis of the maximum. More importantly, we developed a robust algorithm called stepwise local iterative optimization (SLIO) to accelerate the process of coarse-graining large biomolecules. By means of the efficient SLIO algorithm, the computational cost of coarse-graining large biomolecules is reduced to within the time scale of seconds, which is far lower than that of conventional simulated annealing. The coarse-graining of two huge systems, chaperonin GroEL and lengsin, indicates that our new methods can coarse-grain huge biomolecular systems with up to 10,000 residues within the time scale of minutes. The further parametrization of CG sites derived from FM-CG allows us to construct the corresponding CG models for studies of the functions of huge biomolecular systems.

  13. Effects of low-temperature hydrogen peroxide gas plasma sterilization on in vitro cytotoxicity of poly(ϵ-caprolactone) (PCL).

    PubMed

    Franklin, Samuel Patrick; Stoker, Aaron M; Cockrell, Mary K; Pfeiffer, Ferris M; Sonny Bal, B; Cook, James L

    2012-01-01

    Our objective was to determine whether low-temperature hydrogen peroxide (H2O2) gas plasma sterilization of porous three-dimensional poly(ϵ-caprolactone) (PCL) constructs significantly inhibits cellular metabolism of canine chondrocytes. Porous cylindrical constructs were fabricated using fused deposition modeling and divided into four sterilization groups. Two groups were sterilized with low-temperature H2O2 gas plasma (LTGP), and constructs from one of those groups were subsequently rinsed with Dulbecco's Modified Essential Media (LTGPDM). Constructs in the other two groups were disinfected with either 70% isopropyl alcohol or exposure to UV light. Canine chondrocytes were seeded in 6-well tissue-culture plates and allowed to adhere prior to addition of PCL. Cellular metabolism was assessed by adding resazurin to the tissue-culture wells and measuring conversion of this substrate by viable cells to the fluorescent dye resorufin. This process was performed at three time points prior to addition of PCL and at four time points after addition of PCL to the tissue-culture wells. Metabolism was not significantly different among the tissue-culture wells at any of the three time points prior to addition of PCL. Metabolism was significantly different among the treatment groups at three of the four time points after addition of PCL to the tissue-culture wells, and was significantly lower with constructs sterilized by LTGP than in all other treatment groups at all three of these time points. We conclude that LTGP sterilization of PCL constructs resulted in significant cytotoxicity to canine chondrocytes when compared to PCL constructs disinfected with either UV light exposure or 70% isopropyl alcohol.

  14. An efficient two-stage approach for image-based FSI analysis of atherosclerotic arteries

    PubMed Central

    Rayz, Vitaliy L.; Mofrad, Mohammad R. K.; Saloner, David

    2010-01-01

    Patient-specific biomechanical modeling of atherosclerotic arteries has the potential to aid clinicians in characterizing lesions and determining optimal treatment plans. To attain high levels of accuracy, recent models use medical imaging data to determine plaque component boundaries in three dimensions, and fluid–structure interaction is used to capture mechanical loading of the diseased vessel. As the plaque components and vessel wall are often highly complex in shape, constructing a suitable structured computational mesh is very challenging and can require a great deal of time. Models based on unstructured computational meshes require relatively less time to construct and are capable of accurately representing plaque components in three dimensions. These models unfortunately require additional computational resources and computing time for accurate and meaningful results. A two-stage modeling strategy based on unstructured computational meshes is proposed to achieve a reasonable balance between meshing difficulty and computational resource and time demand. In this method, a coarse-grained simulation of the full arterial domain is used to guide and constrain a fine-scale simulation of a smaller region of interest within the full domain. Results for a patient-specific carotid bifurcation model demonstrate that the two-stage approach can afford a large savings in both time for mesh generation and time and resources needed for computation. The effects of solid and fluid domain truncation were explored, and were shown to minimally affect accuracy of the stress fields predicted with the two-stage approach. PMID:19756798

  15. Comparing Within-Person Effects from Multivariate Longitudinal Models

    ERIC Educational Resources Information Center

    Bainter, Sierra A.; Howard, Andrea L.

    2016-01-01

    Several multivariate models are motivated to answer similar developmental questions regarding within-person (intraindividual) effects between 2 or more constructs over time, yet the within-person effects tested by each model are distinct. In this article, the authors clarify the types of within-person inferences that can be made from each model.…

  16. Compilation of a near-infrared library for the construction of quantitative models of amoxicillin and potassium clavulanate oral dosage forms

    NASA Astrophysics Data System (ADS)

    Zou, Wen-bo; Chong, Xiao-meng; Wang, Yan; Hu, Chang-qin

    2018-05-01

    The accuracy of NIR quantitative models depends on calibration samples with concentration variability. Conventional sample-collection methods have shortcomings, especially being time-consuming, which remains a bottleneck in the application of NIR models for Process Analytical Technology (PAT) control. A study was performed to solve the problem of sample collection for the construction of NIR quantitative models. Amoxicillin and potassium clavulanate oral dosage forms were used as examples. The aim was to find a general approach to rapidly construct NIR quantitative models using an NIR spectral library, based on the idea of a universal model [2021]. The NIR spectral library of amoxicillin and potassium clavulanate oral dosage forms consisted of spectra of 377 batches of samples produced by 26 domestic pharmaceutical companies, including tablets, dispersible tablets, chewable tablets, oral suspensions, and granules. The correlation coefficient (rT) was used to indicate the similarity of the spectra. The calibration sets were selected from the spectral library according to the median rT of the samples to be analyzed; the rT values of the selected samples were close to the median rT, differing from it by 1.0% to 1.5%. We concluded that sample selection is not a problem when constructing NIR quantitative models using a spectral library, in contrast to conventional methods of determining universal models. Sample spectra with a suitable concentration range for the NIR models were collected quickly. In addition, the models constructed through this method were more easily targeted.
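The selection rule described - keeping library spectra whose correlation with the target is close to the median rT - can be sketched as follows; the spectra and the tolerance window are synthetic placeholders, not the paper's data.

```python
import numpy as np

def select_calibration_set(library, target, window=0.015):
    """Pick library spectra whose correlation with the target spectrum
    lies within `window` of the median correlation (a stand-in for rT)."""
    r = np.array([np.corrcoef(spec, target)[0, 1] for spec in library])
    median_r = np.median(r)
    idx = np.where(np.abs(r - median_r) <= window)[0]
    return idx, r

# Synthetic "spectra": noisy variants of one base curve
rng = np.random.default_rng(1)
base = rng.standard_normal(200)
library = base + 0.1 * rng.standard_normal((50, 200))
target = base + 0.1 * rng.standard_normal(200)

idx, r = select_calibration_set(library, target)
print(len(idx), "spectra selected near the median correlation")
```

The selected rows of `library` would then form the calibration set for the quantitative model.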

  17. A pharmacokinetic model of filgrastim and pegfilgrastim application in normal mice and those with cyclophosphamide-induced granulocytopaenia.

    PubMed

    Scholz, M; Ackermann, M; Engel, C; Emmrich, F; Loeffler, M; Kamprad, M

    2009-12-01

    Recombinant human granulocyte colony-stimulating factor (rhG-CSF) is widely used as treatment for granulocytopaenia during cytotoxic chemotherapy; however, optimal scheduling of this pharmaceutical is unknown. Biomathematical models can help to pre-select optimal application schedules, but precise pharmacokinetic properties of the pharmaceuticals are required first. In this study, we have aimed to construct a pharmacokinetic model of the G-CSF derivatives filgrastim and pegfilgrastim in mice. Healthy CD-1 mice and those with cyclophosphamide-induced granulocytopaenia were studied after administration of filgrastim and pegfilgrastim in different dosing and timing schedules. Closely meshed time series of granulocytes and G-CSF plasma concentrations were determined. An ordinary differential equations model of pharmacokinetics was constructed on the basis of known mechanisms of drug distribution and degradation. Predictions of the model fit well with all experimental data for both filgrastim and pegfilgrastim. We obtained a unique parameter setting for all experimental scenarios. Differences in pharmacokinetics between filgrastim and pegfilgrastim can be explained by different estimates of model parameters rather than by different model mechanisms. Parameter estimates with respect to distribution and clearance of the drug derivatives are in agreement with qualitative experimental results. Dynamics of filgrastim and pegfilgrastim plasma levels can be explained by the same pharmacokinetic model but different model parameters. Because of a strong clearance mechanism mediated by granulocytes, granulocytotic and granulocytopaenic conditions must be studied simultaneously to construct a reliable model. The pharmacokinetic model will be extended to a murine model of granulopoiesis under chemotherapy and G-CSF application.
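The general shape of such an ODE model can be illustrated with a generic one-compartment pharmacokinetic sketch (first-order absorption and elimination, forward-Euler integration). This is not the authors' G-CSF model, which additionally includes granulocyte-mediated clearance; all rate constants below are invented.

```python
import numpy as np

def simulate_pk(dose, k_abs, k_elim, t_end=48.0, dt=0.01):
    """Forward-Euler integration of a generic one-compartment model:
    drug moves from a depot (absorption, rate k_abs) into plasma,
    from which it is eliminated (rate k_elim)."""
    steps = int(t_end / dt)
    depot, plasma = dose, 0.0
    conc = np.empty(steps)
    for i in range(steps):
        absorbed = k_abs * depot * dt
        depot -= absorbed
        plasma += absorbed - k_elim * plasma * dt
        conc[i] = plasma
    return conc

conc = simulate_pk(dose=100.0, k_abs=0.8, k_elim=0.15)
print("peak plasma level:", round(conc.max(), 1))
```

Fitting such a model means adjusting the rate constants until the simulated `conc` curve matches the measured plasma time series.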

  18. Different or Similar: Constructions of Leadership by Senior Managers in Irish and Portuguese Universities

    ERIC Educational Resources Information Center

    O'Connor, Pat; Carvalho, Teresa

    2015-01-01

    Despite over 60 years of research on leadership, few attempts have been made to ensure that the models of leadership are inclusive of women or other "outsiders". This paper explores variation in the constructions of leadership at a time of institutional change in higher education. Drawing on a purposive sample, including those at…

  19. An accurate computational method for an order parameter with a Markov state model constructed using a manifold-learning technique

    NASA Astrophysics Data System (ADS)

    Ito, Reika; Yoshidome, Takashi

    2018-01-01

    Markov state models (MSMs) are a powerful approach for analyzing the long-time behavior of protein motion using molecular dynamics simulation data. However, their quantitative performance with respect to physical quantities is poor. We believe that this poor performance is caused by the failure to appropriately classify protein conformations into states when constructing MSMs. Herein, we show that the quantitative performance of an order parameter is improved when a manifold-learning technique is employed for the classification in the MSM. The MSM constructed using the K-center method, which has previously been used for classification, shows poor quantitative performance.
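Once conformations have been assigned to states (the classification step this record focuses on), MSM estimation itself reduces to counting lagged transitions and row-normalizing; a minimal sketch with an invented state sequence:

```python
import numpy as np

def transition_matrix(states, n_states, lag=1):
    """Estimate an MSM transition matrix by counting transitions at the
    given lag time and row-normalizing the count matrix."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(states[:-lag], states[lag:]):
        counts[i, j] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0                 # leave empty rows as zeros
    return counts / rows

# Invented discrete state trajectory (e.g., output of a clustering step)
traj = [0, 0, 1, 1, 2, 2, 0, 1, 2, 0, 0, 1]
T = transition_matrix(traj, n_states=3)
print(T.round(2))
```

The quality of `T`, and of any order parameter computed from it, hinges on how well the clustering step grouped conformations - which is the point the record makes about manifold learning versus K-center classification.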

  20. IoGET: Internet of Geophysical and Environmental Things

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mudunuru, Maruti Kumar

    The objective of this project is to provide novel and fast reduced-order models for onboard computation at sensor nodes for real-time analysis. The approach will require that LANL perform high-fidelity numerical simulations, construct simple reduced-order models (ROMs) using machine learning and signal processing algorithms, and use real-time data analysis for ROMs and compressive sensing at sensor nodes.

  1. Modeling and forecasting of KLCI weekly return using WT-ANN integrated model

    NASA Astrophysics Data System (ADS)

    Liew, Wei-Thong; Liong, Choong-Yeun; Hussain, Saiful Izzuan; Isa, Zaidi

    2013-04-01

    The forecasting of weekly returns is one of the most challenging tasks in investment since the time series are volatile and non-stationary. In this study, an integrated model of the wavelet transform and an artificial neural network, WT-ANN, is studied for modeling and forecasting of the KLCI weekly return. First, the WT is applied to decompose the weekly return time series in order to eliminate noise. Then, a mathematical model of the time series is constructed using the ANN. The performance of the suggested model is evaluated by root mean squared error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE). The results show that the WT-ANN model can be considered a feasible and powerful model for time series modeling and prediction.
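The three evaluation metrics named above are standard and easy to state in code; the series below are toy values, not KLCI returns.

```python
import numpy as np

def forecast_errors(actual, predicted):
    """Compute RMSE, MAE, and MAPE for a forecast against actual values."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    err = actual - predicted
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    mape = 100.0 * np.mean(np.abs(err / actual))  # actual values must be nonzero
    return rmse, mae, mape

rmse, mae, mape = forecast_errors([1.0, 2.0, 4.0], [1.1, 1.8, 4.4])
print(round(rmse, 3), round(mae, 3), round(mape, 2))  # 0.265 0.233 10.0
```

Note that MAPE is undefined when an actual value is zero, which matters for return series that cross zero; in that case RMSE and MAE are the safer of the three.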

  2. A new class of enhanced kinetic sampling methods for building Markov state models

    NASA Astrophysics Data System (ADS)

    Bhoutekar, Arti; Ghosh, Susmita; Bhattacharya, Swati; Chatterjee, Abhijit

    2017-10-01

    Markov state models (MSMs) and other related kinetic network models are frequently used to study the long-timescale dynamical behavior of biomolecular and materials systems. MSMs are often constructed bottom-up using brute-force molecular dynamics (MD) simulations when the model contains a large number of states and kinetic pathways that are not known a priori. However, the resulting network generally encompasses only parts of the configurational space, and regardless of any additional MD performed, several states and pathways will still remain missing. This implies that the duration for which the MSM can faithfully capture the true dynamics, which we term as the validity time for the MSM, is always finite and unfortunately much shorter than the MD time invested to construct the model. A general framework that relates the kinetic uncertainty in the model to the validity time, missing states and pathways, network topology, and statistical sampling is presented. Performing additional calculations for frequently-sampled states/pathways may not alter the MSM validity time. A new class of enhanced kinetic sampling techniques is introduced that aims at targeting rare states/pathways that contribute most to the uncertainty so that the validity time is boosted in an effective manner. Examples including straightforward 1D energy landscapes, lattice models, and biomolecular systems are provided to illustrate the application of the method. Developments presented here will be of interest to the kinetic Monte Carlo community as well.

  3. Bayesian state space models for dynamic genetic network construction across multiple tissues.

    PubMed

    Liang, Yulan; Kelemen, Arpad

    2016-08-01

    Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research for complex diseases, and estimating the dynamic changes of the temporal correlations and non-stationarity is key in this process. In this paper, we develop dynamic state space models with hierarchical Bayesian settings to tackle this challenge for inferring the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix as time-variant and include temporal correlation structures in the covariance matrix estimations in the multivariate Bayesian state space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models, with Markov chain Monte Carlo and Gibbs sampling algorithms, are used to estimate the model parameters and the hidden state variables. We apply the proposed hierarchical Bayesian state space models to multiple-tissue (liver, skeletal muscle, and kidney) Affymetrix time course data sets following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and gene-gene interactions in response to CS treatment can be well captured by the proposed models. The proposed dynamic hierarchical Bayesian state space modeling approaches could be expanded and applied to other large-scale genomic data, such as next-generation sequencing (NGS) data combined with real-time and time-varying electronic health record (EHR) data, for more comprehensive and robust systematic and network-based analysis, in order to transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare with better decision making and patient outcomes.
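A minimal linear-Gaussian special case of such state space filtering is the Kalman filter; the 1-D sketch below is a toy stand-in for the paper's multivariate hierarchical Bayesian models, with invented noise parameters.

```python
import numpy as np

def kalman_filter(y, a, c, q, r, x0=0.0, p0=1.0):
    """Minimal 1-D linear-Gaussian state space filter for
    x_t = a*x_{t-1} + w (var q),  y_t = c*x_t + v (var r)."""
    x, p = x0, p0
    estimates = []
    for obs in y:
        # predict step
        x, p = a * x, a * a * p + q
        # update step with observation
        k = p * c / (c * c * p + r)
        x = x + k * (obs - c * x)
        p = (1 - k * c) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(3)
truth = np.cumsum(rng.normal(0, 0.1, 200))   # slowly drifting hidden state
y = truth + rng.normal(0, 0.5, 200)          # noisy observations
est = kalman_filter(y, a=1.0, c=1.0, q=0.01, r=0.25)
print(np.mean((est - truth) ** 2) < np.mean((y - truth) ** 2))
```

The paper's models generalize this picture with time-variant transition and observation matrices and MCMC/Gibbs sampling in place of the closed-form Gaussian update.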

  4. Using subject-specific three-dimensional (3D) anthropometry data in digital human modelling: case study in hand motion simulation.

    PubMed

    Tsao, Liuxing; Ma, Liang

    2016-11-01

    Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.
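The forward-kinematics step - driving segments about estimated centres of rotation - can be sketched in 2D as chained planar rotations; the segment lengths and joint angles below are illustrative, not measured hand data.

```python
import numpy as np

def forward_kinematics(lengths, angles):
    """Chain planar segments: each joint rotates by its angle relative to
    the previous segment; returns all joint positions, ending at the tip."""
    points = [np.zeros(2)]
    heading = 0.0
    for L, a in zip(lengths, angles):
        heading += a                       # accumulate joint rotations
        points.append(points[-1] + L * np.array([np.cos(heading), np.sin(heading)]))
    return np.array(points)

# A three-segment "finger" with illustrative lengths (cm) and joint angles (rad)
pts = forward_kinematics([4.0, 2.5, 2.0], [0.0, np.pi / 6, np.pi / 6])
print(pts.round(3))
```

A subject-specific model would replace these illustrative segments with lengths and rotation centres estimated from the 3D scan data, and extend the same chaining to three dimensions.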

  5. Bivariate analysis of floods in climate impact assessments.

    PubMed

    Brunner, Manuela Irene; Sikorska, Anna E; Seibert, Jan

    2018-03-01

    Climate impact studies regarding floods usually focus on peak discharges and a bivariate assessment of peak discharges and hydrograph volumes is not commonly included. A joint consideration of peak discharges and hydrograph volumes, however, is crucial when assessing flood risks for current and future climate conditions. Here, we present a methodology to develop synthetic design hydrographs for future climate conditions that jointly consider peak discharges and hydrograph volumes. First, change factors are derived based on a regional climate model and are applied to observed precipitation and temperature time series. Second, the modified time series are fed into a calibrated hydrological model to simulate runoff time series for future conditions. Third, these time series are used to construct synthetic design hydrographs. The bivariate flood frequency analysis used in the construction of synthetic design hydrographs takes into account the dependence between peak discharges and hydrograph volumes, and represents the shape of the hydrograph. The latter is modeled using a probability density function while the dependence between the design variables peak discharge and hydrograph volume is modeled using a copula. We applied this approach to a set of eight mountainous catchments in Switzerland to construct catchment-specific and season-specific design hydrographs for a control and three scenario climates. Our work demonstrates that projected climate changes have an impact not only on peak discharges but also on hydrograph volumes and on hydrograph shapes both at an annual and at a seasonal scale. These changes are not necessarily proportional which implies that climate impact assessments on future floods should consider more flood characteristics than just flood peaks. Copyright © 2017. Published by Elsevier B.V.
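    The copula-based dependence modelling described above can be sketched as follows. The abstract does not name the copula family, so this example uses a Gaussian copula linking two Gumbel marginals; the copula choice, marginal parameters, and units are illustrative assumptions, not values from the study.

```python
import math
import random

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def sample_flood_pairs(n, rho=0.7, seed=0):
    """Draw (peak discharge, hydrograph volume) pairs whose dependence
    comes from a Gaussian copula with correlation rho and whose
    marginals are illustrative Gumbel distributions."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        # correlated standard normals -> uniforms on the copula scale
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho ** 2) * z2
        u1, u2 = phi(z1), phi(z2)
        # Gumbel inverse CDF: x = loc - scale * ln(-ln(u))
        peak = 100.0 - 30.0 * math.log(-math.log(u1))     # m^3/s
        volume = 50.0 - 15.0 * math.log(-math.log(u2))    # 10^6 m^3
        pairs.append((peak, volume))
    return pairs
```

    Sampled pairs preserve the marginal flood-frequency behaviour of each variable while reproducing their joint dependence, which is what makes the synthetic design hydrographs bivariate rather than peak-only.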

  6. Atomistic and coarse-grained computer simulations of raft-like lipid mixtures.

    PubMed

    Pandit, Sagar A; Scott, H Larry

    2007-01-01

    Computer modeling can provide insights into the existence, structure, size, and thermodynamic stability of localized raft-like regions in membranes. However, the challenges in the construction and simulation of accurate models of heterogeneous membranes are great. The primary obstacle in modeling the lateral organization within a membrane is the relatively slow lateral diffusion rate for lipid molecules. Microsecond or longer time-scales are needed to fully model the formation and stability of a raft in a membrane. Atomistic simulations currently are not able to reach this scale, but they do provide quantitative information on the intermolecular forces and correlations that are involved in lateral organization. In this chapter, the steps needed to carry out and analyze atomistic simulations of hydrated lipid bilayers having heterogeneous composition are outlined. It is then shown how the data from a molecular dynamics simulation can be used to construct a coarse-grained model for the heterogeneous bilayer that can predict the lateral organization and stability of rafts at up to millisecond time-scales.

  7. Cash transportation vehicle routing and scheduling under stochastic travel times

    NASA Astrophysics Data System (ADS)

    Yan, Shangyao; Wang, Sin-Siang; Chang, Yu-Hsuan

    2014-03-01

    Stochastic disturbances occurring in real-world operations could have a significant influence on the planned routing and scheduling results of cash transportation vehicles. In this study, a time-space network flow technique is utilized to construct a cash transportation vehicle routing and scheduling model incorporating stochastic travel times. In addition, to help security carriers to formulate more flexible routes and schedules, a concept of the similarity of time and space for vehicle routing and scheduling is incorporated into the model. The test results show that the model could be useful for security carriers in actual practice.

  8. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    PubMed

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  9. Modeling multiple time series annotations as noisy distortions of the ground truth: An Expectation-Maximization approach.

    PubMed

    Gupta, Rahul; Audhkhasi, Kartik; Jacokes, Zach; Rozga, Agata; Narayanan, Shrikanth

    2018-01-01

    Studies of time-continuous human behavioral phenomena often rely on ratings from multiple annotators. Since the ground truth of the target construct is often latent, the standard practice is to use ad-hoc metrics (such as averaging annotator ratings). Despite being easy to compute, such metrics may not provide accurate representations of the underlying construct. In this paper, we present a novel method for modeling multiple time series annotations over a continuous variable that computes the ground truth by modeling annotator-specific distortions. We condition the ground truth on a set of features extracted from the data and further assume that the annotators provide their ratings as modifications of the ground truth, with each annotator having specific distortion tendencies. We train the model using an Expectation-Maximization based algorithm and evaluate it on a study involving natural interaction between a child and a psychologist, to predict confidence ratings of the children's smiles. We compare and analyze the model against two baselines where: (i) the ground truth is considered to be the framewise mean of ratings from various annotators and (ii) each annotator is assumed to bear a distinct time delay in annotation and their annotations are aligned before computing the framewise mean.
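    A drastically simplified version of the idea — each annotator distorts a shared latent signal, here by just a constant offset — can be written as an EM-style alternation. The paper's distortion model and feature conditioning are much richer; this sketch only shows the latent-truth/annotator-parameter alternation.

```python
def estimate_ground_truth(ratings, iters=50):
    """EM-style alternation for multiple annotator time series, assuming
    annotator k reports x(t) + b_k + noise for a shared latent signal x.
    ratings[k][t] is annotator k's rating at time t.  The offsets are
    identifiable only up to a shared constant, so they are anchored to
    sum to zero."""
    K, T = len(ratings), len(ratings[0])
    biases = [0.0] * K
    x = []
    for _ in range(iters):
        # E-step: latent signal as the bias-corrected annotator mean
        x = [sum(ratings[k][t] - biases[k] for k in range(K)) / K
             for t in range(T)]
        # M-step: re-estimate each annotator's constant offset
        biases = [sum(ratings[k][t] - x[t] for t in range(T)) / T
                  for k in range(K)]
        mean_b = sum(biases) / K      # anchor: offsets sum to zero
        biases = [b - mean_b for b in biases]
    return x, biases
```

    With noiseless offset-only distortions the alternation recovers the latent signal exactly; with noise it converges to the maximum-likelihood estimate under this toy model.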

  10. Brownian motion with adaptive drift for remaining useful life prediction: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Tsui, Kwok-Leung

    2018-01-01

    Linear Brownian motion with constant drift is widely used in remaining useful life predictions because its first hitting time follows the inverse Gaussian distribution. State space modelling of linear Brownian motion was proposed to make the drift coefficient adaptive and incorporate on-line measurements into the first hitting time distribution. Here, the drift coefficient followed the Gaussian distribution, and it was iteratively estimated by using Kalman filtering once a new measurement was available. Then, to model nonlinear degradation, linear Brownian motion with adaptive drift was extended to nonlinear Brownian motion with adaptive drift. However, in previous studies, an underlying assumption used in the state space modelling was that in the update phase of Kalman filtering, the predicted drift coefficient at the current time exactly equalled the posterior drift coefficient estimated at the previous time, which caused a contradiction with the predicted drift coefficient evolution driven by an additive Gaussian process noise. In this paper, to alleviate such an underlying assumption, a new state space model is constructed. As a result, in the update phase of Kalman filtering, the predicted drift coefficient at the current time evolves from the posterior drift coefficient at the previous time. Moreover, the optimal Kalman filtering gain for iteratively estimating the posterior drift coefficient at any time is mathematically derived. A discussion that theoretically explains the main reasons why the constructed state space model can result in high remaining useful life prediction accuracies is provided. Finally, the proposed state space model and its associated Kalman filtering gain are applied to battery prognostics.
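    The adaptive-drift idea above — a drift coefficient that evolves by additive process noise and is re-estimated by Kalman filtering as degradation increments arrive — reduces in the scalar linear case to the following sketch. The noise variances and measurement model here are illustrative, not the paper's derivation.

```python
import random

def kalman_drift(increments, dt=1.0, sigma2=1.0, q=0.01,
                 lam0=0.0, p0=1.0):
    """Scalar Kalman filter for an adaptive drift coefficient lam.
    State equation: lam evolves by a random walk with variance q.
    Observation: each degradation increment dx = lam*dt + Brownian
    noise with variance sigma2*dt.  Returns the drift estimates."""
    lam, p = lam0, p0
    history = []
    for dx in increments:
        p = p + q                         # predict: add process noise
        s = dt * dt * p + sigma2 * dt     # innovation variance
        k = p * dt / s                    # Kalman gain
        lam = lam + k * (dx - lam * dt)   # update with the new increment
        p = (1.0 - k * dt) * p
        history.append(lam)
    return history

# usage: recover a constant true drift of 0.5 from noisy increments
rng = random.Random(0)
inc = [0.5 + rng.gauss(0.0, 0.1) for _ in range(300)]
estimates = kalman_drift(inc, dt=1.0, sigma2=0.01, q=1e-4)
```

    Note how the predicted drift at the current time evolves from the posterior at the previous time via the process-noise term, which is exactly the consistency issue the paper's revised state space model addresses.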

  11. 3D-printed soft-tissue physical models of renal malignancies for individualized surgical simulation: a feasibility study.

    PubMed

    Maddox, Michael M; Feibus, Allison; Liu, James; Wang, Julie; Thomas, Raju; Silberstein, Jonathan L

    2018-03-01

    To construct patient-specific physical three-dimensional (3D) models of renal units with materials that approximate the properties of renal tissue, allowing pre-operative and robotic-training surgical simulation, 3D physical kidney models were created (3DSystems, Rock Hill, SC) using computerized tomography to segment structures of interest (parenchyma, vasculature, collection system, and tumor). Images were converted to a 3D surface mesh file for fabrication using a multi-jet 3D printer. A novel construction technique was employed to approximate normal renal tissue texture: the printers selectively deposited photopolymer material to form the outer shell of the kidney, and an agarose gel solution was then injected into the inner cavity to recreate the spongier renal parenchyma. We constructed seven models of renal units with suspected malignancies. Partial nephrectomy and renorrhaphy were performed on each of the replicas, and all patients subsequently underwent successful robotic partial nephrectomy. Average tumor diameter was 4.4 cm, warm ischemia time was 25 min, RENAL nephrometry score was 7.4, and surgical margins were negative. A comparison was made between the seven cases and the Tulane Urology prospectively maintained robotic partial nephrectomy database. Patients with surgical models had larger tumors, higher nephrometry scores, longer warm ischemia times, fewer positive surgical margins, shorter hospitalizations, and fewer post-operative complications; however, the only significant finding was lower estimated blood loss (186 cc vs 236; p = 0.01). In this feasibility study, pre-operative resectable physical 3D models could be constructed and used as patient-specific surgical simulation tools; further study will need to demonstrate whether this results in improved surgical outcomes and robotic simulation education.

  12. BioNetSim: a Petri net-based modeling tool for simulations of biochemical processes.

    PubMed

    Gao, Junhui; Li, Li; Wu, Xiaolin; Wei, Dong-Qing

    2012-03-01

    BioNetSim, a Petri net-based software tool for modeling and simulating biochemical processes, is developed; its design and implementation are presented in this paper, including the logic construction, real-time access to KEGG (Kyoto Encyclopedia of Genes and Genomes), and the BioModel database. Furthermore, glycolysis is simulated as an example of its application. BioNetSim is a helpful tool for researchers to download data, model biological networks, and simulate complicated biochemical processes. Gene regulatory networks, metabolic pathways, signaling pathways, and kinetics of cell interaction are all available in BioNetSim, which makes modeling more efficient and effective. Like other Petri net-based software, BioNetSim does well in graphical application and mathematical construction. Moreover, it offers several powerful advantages: (1) it creates models in a database; (2) it realizes real-time access to KEGG and BioModel and transfers the data to Petri nets; (3) it provides qualitative analysis, such as computation of constants; and (4) it generates graphs for tracing the concentration of every molecule during the simulation.
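    The Petri net formalism underlying such tools can be illustrated with a minimal token game. This is a generic sketch, not BioNetSim's API; the toy reaction and its stoichiometry are illustrative (a compressed, glycolysis-flavored example).

```python
def step(marking, transitions):
    """One step of a Petri net token game.  marking maps place -> token
    count; each transition is a pair (inputs, outputs) of dicts mapping
    place -> arc weight.  Fires the first enabled transition and returns
    the new marking, or None if no transition is enabled."""
    for inputs, outputs in transitions:
        if all(marking.get(p, 0) >= w for p, w in inputs.items()):
            new = dict(marking)
            for p, w in inputs.items():
                new[p] -= w               # consume input tokens
            for p, w in outputs.items():
                new[p] = new.get(p, 0) + w  # produce output tokens
            return new
    return None

# toy net: glucose + 2 ADP -> 2 pyruvate + 2 ATP
glyco = [({"glucose": 1, "ADP": 2}, {"pyruvate": 2, "ATP": 2})]
m = {"glucose": 2, "ADP": 4}
while m is not None:
    last, m = m, step(m, glyco)   # fire until no transition is enabled
```

    Starting from two glucose and four ADP tokens, the net fires twice and halts with four pyruvate and four ATP tokens; quantitative tools layer rate laws on top of this same firing rule.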

  13. The role of production and teamwork practices in construction safety: a cognitive model and an empirical case study.

    PubMed

    Mitropoulos, Panagiotis Takis; Cupido, Gerardo

    2009-01-01

    In construction, the challenge for researchers and practitioners is to develop work systems (production processes and teams) that can achieve high productivity and high safety at the same time. However, construction accident causation models ignore the role of work practices and teamwork. This study investigates the mechanisms by which production and teamwork practices affect the likelihood of accidents. The paper synthesizes a new model for construction safety based on the cognitive perspective (Fuller's Task-Demand-Capability Interface model, 2005) and then presents an exploratory case study. The case study investigates and compares the work practices of two residential framing crews: a 'High Reliability Crew' (HRC)--that is, a crew with exceptional productivity and safety over several years--and an average performing crew from the same company. The model explains how the production and teamwork practices generate the work situations that workers face (the task demands) and affect the workers' ability to cope (capabilities). The case study indicates that the work practices of the HRC directly influence the task demands and match them with the applied capabilities. These practices were guided by the 'principle' of avoiding errors and rework and included work planning and preparation, work distribution, managing the production pressures, and quality and behavior monitoring. The Task Demand-Capability model links construction research to a cognitive model of accident causation and provides a new way to conceptualize safety as an emergent property of production practices and teamwork processes. The empirical evidence indicates that the crews' work practices and team processes strongly affect the task demands, the applied capabilities, and the match between demands and capabilities. The proposed model and the exploratory case study will guide further discovery of work practices and teamwork processes that can increase both productivity and safety in construction operations. Such understanding will enable training of construction foremen and crews in these practices to systematically develop high reliability crews.

  14. A modular method for evaluating the performance of picture archiving and communication systems.

    PubMed

    Sanders, W H; Kant, L A; Kudrimoti, A

    1993-08-01

    Modeling can be used to predict the performance of picture archiving and communication system (PACS) configurations under various load conditions at an early design stage. This is important because choices made early in the design of a system can have a significant impact on the performance of the resulting implementation. Because PACS consist of many types of components, it is important to do such evaluations in a modular manner, so that alternative configurations and designs can be easily investigated. Stochastic activity networks (SANs) and reduced base model construction methods can aid in doing this. SANs are a model type particularly suited to the evaluation of systems in which several activities may be in progress concurrently, and each activity may affect the others through the results of its completion. Together with SANs, reduced base model construction methods provide a means to build highly modular models, in which models of particular components can be easily reused. In this article, we investigate the use of SANs and reduced base model construction techniques in evaluating PACS. Construction and solution of the models is done using UltraSAN, a graphic-oriented software tool for model specification, analysis, and simulation. The method is illustrated via the evaluation of a realistically sized PACS for a typical United States hospital of 300 to 400 beds, and the derivation of system response times and component utilizations.

  15. VE at Scope Time (VEST): Three construction examples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sperling, R.B.

    1991-04-01

    "Value Engineering at Scope Time (VEST)" was published in Value World, January-February-March 1991. That article describes VEST as a four-phase process utilizing the "heart" of VE methodology, designed to be used with members of construction design teams to help them focus on the scope of work through cost modeling, function analysis, brainstorming, and evaluation of ideas. With minimal training, designers, architects, and engineers can become energized to find creative design solutions and learn an effective, synergistic team approach to facilities design projects using VEST. If time is available, the team can begin to develop some of the higher-ranked ideas into preliminary proposals. This paper is an expansion of that article, adding a brief section on training and presenting three examples of VEST on construction projects at a federally funded research laboratory.

  16. Atomic density functional and diagram of structures in the phase field crystal model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ankudinov, V. E., E-mail: vladimir@ankudinov.org; Galenko, P. K.; Kropotin, N. V.

    2016-02-15

    The phase field crystal model provides a continual description of the atomic density over the diffusion time of reactions. We consider a homogeneous structure (liquid) and a perfect periodic crystal, which are constructed from the one-mode approximation of the phase field crystal model. A diagram of 2D structures is constructed from the analytic solutions of the model using atomic density functionals. The diagram predicts equilibrium atomic configurations for transitions from the metastable state and includes the domains of existence of homogeneous, triangular, and striped structures corresponding to a liquid, a body-centered cubic crystal, and a longitudinal cross section of cylindrical tubes. The method developed here is employed for constructing the diagram for the homogeneous liquid phase and the body-centered iron lattice. The expression for the free energy is derived analytically from density functional theory. The specific features of approximating the phase field crystal model are compared with the approximations and conclusions of the weak crystallization and 2D melting theories.
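    For context, one-mode phase field crystal analyses of this kind are usually built on the standard dimensionless Swift-Hohenberg-type free energy functional, quoted here as general background rather than as this paper's exact expression:

```latex
F[\varphi] \;=\; \int \mathrm{d}\mathbf{r}\,
  \left[ \frac{\varphi}{2}\left(-\varepsilon + \left(1+\nabla^{2}\right)^{2}\right)\varphi
       \;+\; \frac{\varphi^{4}}{4} \right]
```

    Minimizing F over one-mode ansätze for the homogeneous, triangular, and striped phases at fixed mean density yields the kind of structure diagram the authors construct.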

  17. An optimal generic model for multi-parameters and big data optimizing: a laboratory experimental study

    NASA Astrophysics Data System (ADS)

    Utama, D. N.; Ani, N.; Iqbal, M. M.

    2018-03-01

    Optimization is the process of finding the parameter (or parameters) that delivers an optimal value of an objective function. Seeking an optimal generic model for optimization is a computer science problem that has been pursued by numerous researchers; a generic model is one that can be operated to solve any variety of optimization problem. Using an object-oriented method, the generic model for optimization was constructed. Two optimization methods, simulated annealing and hill climbing, were used in constructing the model and then compared to find the more optimal one. The results showed that both methods gave the same value of the objective function, and the hill-climbing-based model consumed the shorter running time.
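    The two methods compared above differ only in their acceptance rule: hill climbing accepts only improving moves, while simulated annealing also accepts worsening moves with a temperature-controlled probability. The sketch below contrasts them on a hypothetical 1-D objective (not the paper's problem or parameters).

```python
import math
import random

def objective(x):
    """Illustrative 1-D objective to maximize, with local optima."""
    return -(x ** 2) + 10.0 * math.cos(x)

def hill_climb(x, steps=2000, rng=None):
    """Accept a random neighbor only if it improves the objective."""
    rng = rng or random.Random(0)
    for _ in range(steps):
        cand = x + rng.uniform(-0.5, 0.5)
        if objective(cand) > objective(x):
            x = cand
    return x

def anneal(x, steps=2000, t0=5.0, rng=None):
    """Also accept worsening moves with probability exp(delta / t),
    where the temperature t cools linearly to (almost) zero."""
    rng = rng or random.Random(0)
    for i in range(steps):
        t = t0 * (1.0 - i / steps) + 1e-9
        cand = x + rng.uniform(-0.5, 0.5)
        delta = objective(cand) - objective(x)
        if delta > 0 or rng.random() < math.exp(delta / t):
            x = cand
    return x
```

    Started inside the global basin both converge to the same maximum, matching the paper's observation that the methods reach equal objective values; the annealer simply spends extra evaluations on its exploratory acceptances.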

  18. A novel track-before-detect algorithm based on optimal nonlinear filtering for detecting and tracking infrared dim target

    NASA Astrophysics Data System (ADS)

    Tian, Yuexin; Gao, Kun; Liu, Ying; Han, Lu

    2015-08-01

    Aiming at the nonlinear, non-Gaussian features of real infrared scenes, an optimal nonlinear filtering based algorithm for the infrared dim target track-before-detect application is proposed. It uses nonlinear theory to construct the state and observation models and uses a Wiener chaos expansion method with a spectral separation scheme to solve the stochastic differential equations of the constructed models. To improve computational efficiency, the most time-consuming operations, which are independent of the observation data, are processed in advance of the observation stage; the remaining observation-dependent computations, which are fast, are implemented subsequently. Simulation results show that the algorithm possesses excellent detection performance and is well suited to real-time processing.

  19. Examining the individual and perceived neighborhood associations of leisure-time physical activity in persons with spinal cord injury.

    PubMed

    Arbour-Nicitopoulos, Kelly P; Martin Ginis, Kathleen A; Wilson, Philip M

    2010-05-01

    Theory of Planned Behavior (TPB) constructs have been shown to be useful for explaining leisure-time physical activity (LTPA) in persons with spinal cord injury (SCI). However, other factors not captured by the TPB may also be important predictors of LTPA for this population. The purpose of this study is to examine the role of neighborhood perceptions within the context of the TPB for understanding LTPA in persons living with SCI. This is a cross-sectional analysis (n = 574) using structural equation modeling involving measures of the TPB constructs, perceived neighborhood esthetics and sidewalks, and LTPA. TPB constructs explained 57% of the variance in intentions and 12% of the variance in behavior. Inclusion of the neighborhood variables to the model resulted in an additional 1% of the variance explained in intentions, with esthetics exhibiting significant positive relationships with the TPB variables. Integrating perceived neighborhood esthetics into the TPB framework provides additional understanding of LTPA intentions in persons living with SCI.

  20. A Study on Ganui-Dae's External Form and Its Modeling for Restoration

    NASA Astrophysics Data System (ADS)

    Lee, Min-Soo; Lee, Yong Sam; Jeon, Jun Hyeok; Kim, Sang Hyuk

    2013-12-01

    Ganui-Dae, built in the reign of King Sejong of the Joseon Dynasty, is a comprehensive observatory. It housed various instruments for observation and time signaling, such as the Ganui, the Gyupyo (Gnomon), and the water-hammering type Honui and Honsang. Studies of Ganui-Dae have so far focused on its location, history, and criteria, while its external form and construction method have received insufficient attention. This study suggests a model for the restoration of Ganui-Dae, based on analysis of the external form of Ganui-Dae in various antique maps and of the construction methods of the period.

  1. Construction costs, payback times, and the leaf economics of carnivorous plants.

    PubMed

    Karagatzides, Jim D; Ellison, Aaron M

    2009-09-01

    Understanding how different plant species and functional types "invest" carbon and nutrients is a major goal of plant ecologists. Two measures of such investments are "construction costs" (carbon needed to produce each gram of tissue) and associated "payback times" for photosynthesis to recover construction costs. These measurements integrate among traits used to assess leaf-trait scaling relationships. Carnivorous plants are model systems for examining mechanisms of leaf-trait coordination, but no studies have simultaneously measured the construction costs of carnivorous traps and their photosynthetic rates to determine the payback times of traps. We measured mass-based construction costs (CC(mass)) and photosynthesis (A(mass)) for traps, leaves, roots, and rhizomes of 15 carnivorous plant species grown under greenhouse conditions. There were highly significant differences among species in CC(mass) for each structure. Mean CC(mass) of carnivorous traps (1.14 ± 0.24 g glucose/g dry mass) was significantly lower than CC(mass) of leaves of 267 noncarnivorous plant species (1.47 ± 0.17), but all carnivorous plants examined had very low A(mass) and thus, long payback times (495-1551 h). Our results provide the first clear estimates of the marginal benefits of botanical carnivory and place carnivorous plants at the "slow and tough" end of the universal spectrum of leaf traits.

  2. A 3D Geometry Model Search Engine to Support Learning

    ERIC Educational Resources Information Center

    Tam, Gary K. L.; Lau, Rynson W. H.; Zhao, Jianmin

    2009-01-01

    Due to the popularity of 3D graphics in animation and games, usage of 3D geometry deformable models increases dramatically. Despite their growing importance, these models are difficult and time consuming to build. A distance learning system for the construction of these models could greatly facilitate students to learn and practice at different…

  3. 40 CFR 60.1570 - What is the “model rule” in this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Emission Guidelines and Compliance Times for Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Use of Model Rule § 60.1570 What is the “model rule” in this subpart? (a) The model rule is the portion of the...

  4. Program Helps Generate Boundary-Element Mathematical Models

    NASA Technical Reports Server (NTRS)

    Goldberg, R. K.

    1995-01-01

    Composite Model Generation-Boundary Element Method (COM-GEN-BEM) computer program significantly reduces time and effort needed to construct boundary-element mathematical models of continuous-fiber composite materials at micro-mechanical (constituent) scale. Generates boundary-element models compatible with BEST-CMS boundary-element code for analysis of micromechanics of composite material. Written in PATRAN Command Language (PCL).

  5. High-fidelity meshes from tissue samples for diffusion MRI simulations.

    PubMed

    Panagiotaki, Eleftheria; Hall, Matt G; Zhang, Hui; Siow, Bernard; Lythgoe, Mark F; Alexander, Daniel C

    2010-01-01

    This paper presents a method for constructing detailed geometric models of tissue microstructure for synthesizing realistic diffusion MRI data. We construct three-dimensional mesh models from confocal microscopy image stacks using the marching cubes algorithm. Random-walk simulations within the resulting meshes provide synthetic diffusion MRI measurements. Experiments optimise simulation parameters and complexity of the meshes to achieve accuracy and reproducibility while minimizing computation time. Finally we assess the quality of the synthesized data from the mesh models by comparison with scanner data as well as synthetic data from simple geometric models and simplified meshes that vary only in two dimensions. The results support the extra complexity of the three-dimensional mesh compared to simpler models although sensitivity to the mesh resolution is quite robust.
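    The random-walk simulation at the heart of this approach can be illustrated in one dimension: spins diffuse between reflecting barriers (a stand-in for the mesh surface), and the mean squared displacement is reduced relative to free diffusion. All parameters and units below are illustrative, not taken from the paper's meshes.

```python
import random

def simulate_walkers(n_walkers=2000, n_steps=200, step=0.1,
                     barrier=2.0, seed=0):
    """1-D random walk between reflecting barriers at +/-barrier.
    Returns the mean squared displacement over all walkers; free
    (unrestricted) diffusion would give exactly n_steps * step**2."""
    rng = random.Random(seed)
    sq_disp = 0.0
    for _ in range(n_walkers):
        x0 = rng.uniform(-barrier, barrier)   # uniform start, as at t=0
        x = x0
        for _ in range(n_steps):
            x += rng.choice((-step, step))
            if x > barrier:                   # reflect at the boundary
                x = 2.0 * barrier - x
            elif x < -barrier:
                x = -2.0 * barrier - x
        sq_disp += (x - x0) ** 2
    return sq_disp / n_walkers
```

    The restricted mean squared displacement falls below the free value of n_steps * step**2 = 2.0; in the paper the same principle operates in 3D, with the mesh triangles playing the role of the barriers.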

  6. A Model of Contextual Motivation in Physical Education: Using Constructs from Self-Determination and Achievement Goal Theories To Predict Physical Activity Intentions.

    ERIC Educational Resources Information Center

    Standage, Martyn; Duda, Joan L.; Ntoumanis, Nikos

    2003-01-01

    Examines a study of student motivation in physical education that incorporated constructs from achievement goal and self-determination theories. Self-determined motivation was found to positively predict, whereas amotivation was a negative predictor of leisure-time physical activity intentions. (Contains 86 references and 3 tables.) (GCP)

  7. Reciprocal Influences Between Maternal Parenting and Child Adjustment in a High-risk Population: A Five-Year Cross-Lagged Analysis of Bidirectional Effects

    PubMed Central

    Barbot, Baptiste; Crossman, Elizabeth; Hunter, Scott R.; Grigorenko, Elena L.; Luthar, Suniya S.

    2014-01-01

    This study examines longitudinally the bidirectional influences between maternal parenting (behaviors and parenting stress) and mothers' perceptions of their children's adjustment, in a multivariate approach. Data were gathered from 361 low-income mothers (many with psychiatric diagnoses) reporting on their parenting behavior, parenting stress, and their child's adjustment, in a two-wave longitudinal study over 5 years. Measurement models were developed to derive four broad parenting constructs (Involvement, Control, Rejection, and Stress) and three child adjustment constructs (Internalizing problems, Externalizing problems, and Social competence). After measurement invariance of these constructs was confirmed across relevant groups and over time, both measurement models were integrated into a single cross-lagged regression analysis of latent constructs. Multiple reciprocal influences were observed between parenting and perceived child adjustment over time: externalizing and internalizing problems in children were predicted by baseline maternal parenting behaviors, while child social competence was found to reduce parental stress and increase parental involvement and appropriate monitoring. These findings on the motherhood experience are discussed in light of recent research efforts to understand mother-child bidirectional influences, and their potential for practical applications. PMID:25089759

  8. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  9. Measurement invariance, the lack thereof, and modeling change.

    PubMed

    Edwards, Michael C; Houts, Carrie R; Wirth, R J

    2017-08-17

    Measurement invariance issues should be considered during test construction. In this paper, we provide a conceptual overview of measurement invariance and describe how the concept is implemented in several different statistical approaches. Typical applications look for invariance over things such as mode of administration (paper and pencil vs. computer based), language/translation, age, time, and gender, to cite just a few examples. To the extent that the relationships between items and constructs are stable/invariant, we can be more confident in score interpretations. A series of simulated examples are reported which highlight different kinds of non-invariance, the impact it can have, and the effect of appropriately modeling a lack of invariance. One example focuses on the longitudinal context, where measurement invariance is critical to understanding trends over time. Software syntax is provided to help researchers apply these models with their own data. The simulation studies demonstrate the negative impact an erroneous assumption of invariance may have on scores and substantive conclusions drawn from naively analyzing those scores. Measurement invariance implies that the links between the items and the construct of interest are invariant over some domain, grouping, or classification. Examining a new or existing test for measurement invariance should be part of any test construction/implementation plan. In addition to reviewing implications of the simulation study results, we also provide a discussion of the limitations of current approaches and areas in need of additional research.

  10. Development of a Decision Model for Selection of Appropriate Timely Delivery Techniques for Highway Projects

    DOT National Transportation Integrated Search

    2009-04-01

    "The primary umbrella method used by the Oregon Department of Transportation (ODOT) to ensure on-time performance in standard construction contracting is liquidated damages. The assessment value is usually a matter of some judgment. In practice...

  11. Memory-Scalable GPU Spatial Hierarchy Construction.

    PubMed

    Qiming Hou; Xin Sun; Kun Zhou; Lauterbach, C; Manocha, D

    2011-04-01

    Recent GPU algorithms for constructing spatial hierarchies have achieved promising performance for moderately complex models by using the breadth-first search (BFS) construction order. While being able to exploit the massive parallelism on the GPU, the BFS order also consumes excessive GPU memory, which becomes a serious issue for interactive applications involving very complex models with more than a few million triangles. In this paper, we propose to use the partial breadth-first search (PBFS) construction order to control memory consumption while maximizing performance. We apply the PBFS order to two hierarchy construction algorithms. The first algorithm is for kd-trees that automatically balances between the level of parallelism and intermediate memory usage. With PBFS, peak memory consumption during construction can be efficiently controlled without costly CPU-GPU data transfer. We also develop memory allocation strategies to effectively limit memory fragmentation. The resulting algorithm scales well with GPU memory and constructs kd-trees of models with millions of triangles at interactive rates on GPUs with 1 GB memory. Compared with existing algorithms, our algorithm is an order of magnitude more scalable for a given GPU memory bound. The second algorithm is for out-of-core bounding volume hierarchy (BVH) construction for very large scenes based on the PBFS construction order. At each iteration, all constructed nodes are dumped to the CPU memory, and the GPU memory is freed for the next iteration's use. In this way, the algorithm is able to build trees that are too large to be stored in the GPU memory. Experiments show that our algorithm can construct BVHs for scenes with up to 20 M triangles, several times larger than previous GPU algorithms.

  12. Modeling of thin-film GaAs growth

    NASA Technical Reports Server (NTRS)

    Heinbockel, J. H.

    1981-01-01

    A solid Monte Carlo model is constructed for the simulation of crystal growth. The model assumes thermally accommodated adatoms impinge upon the surface during a delta time interval. The surface adatoms are assigned a random energy from a Boltzmann distribution, and this energy determines whether the adatoms evaporate, migrate, or remain stationary during the delta time interval. For each addition or migration of an adatom, potential wells are adjusted to reflect the absorption, migration, or desorption potential changes.

  13. The dynamic financial distress prediction method of EBW-VSTW-SVM

    NASA Astrophysics Data System (ADS)

    Sun, Jie; Li, Hui; Chang, Pei-Chann; He, Kai-Yu

    2016-07-01

    Financial distress prediction (FDP) plays an important role in corporate financial risk management. Most earlier research in this field tried to construct effective static FDP (SFDP) models, which are difficult to embed into enterprise information systems because they are based on horizontal data-sets collected outside the modelled enterprise and define financial distress in absolute terms such as bankruptcy or insolvency. This paper proposes an approach for the dynamic evaluation and prediction of financial distress based on entropy-based weighting (EBW), the support vector machine (SVM), and an enterprise's vertical sliding time window (VSTW). The dynamic FDP (DFDP) method, named EBW-VSTW-SVM, keeps updating the FDP model dynamically as time goes on and needs only the historic financial data of the modelled enterprise itself, and is thus easier to embed into enterprise information systems. The EBW-VSTW-SVM method consists of four steps: evaluation of vertical relative financial distress (VRFD) based on EBW, construction of the training data-set for DFDP modelling according to the VSTW, training of the DFDP model based on SVM, and DFDP for the future time point. We carry out case studies for two listed pharmaceutical companies and experimental analysis for some other companies to simulate the sliding of the enterprise vertical time window. The results indicate that the proposed approach is feasible and efficient in helping managers improve corporate financial management.
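    The entropy-based weighting step can be sketched in a few lines. This is the generic entropy weight method (normalize each indicator, compute its entropy, weight by divergence); the indicator matrix below is illustrative, and the paper's exact formulation may differ:

    ```python
    import numpy as np

    def entropy_weights(X):
        """Entropy-based weighting (EBW): indicators whose values vary more
        across time points carry more information and get larger weights."""
        X = np.asarray(X, dtype=float)
        # Min-max normalize each indicator (column) to [0, 1].
        P = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
        # Convert each column to proportions; small epsilon avoids log(0).
        P = (P + 1e-12) / (P + 1e-12).sum(axis=0)
        n = X.shape[0]
        e = -(P * np.log(P)).sum(axis=0) / np.log(n)  # entropy per indicator
        d = 1.0 - e                                   # degree of divergence
        return d / d.sum()

    # Four financial ratios observed at five time points (toy numbers).
    X = [[0.12, 1.8, 0.35, 0.9],
         [0.10, 1.6, 0.30, 1.1],
         [0.08, 1.5, 0.28, 1.0],
         [0.05, 1.2, 0.20, 1.4],
         [0.03, 1.0, 0.15, 1.3]]
    w = entropy_weights(X)
    print(w)  # non-negative weights that sum to 1
    ```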

  14. Wilson-loop instantons

    NASA Technical Reports Server (NTRS)

    Lee, Kimyeong; Holman, Richard; Kolb, Edward W.

    1987-01-01

    Wilson-loop symmetry breaking is considered on a space-time of the form M4 x K, where M4 is a four-dimensional space-time and K is an internal space with nontrivial and finite fundamental group. It is shown in a simple model that the different vacua obtained by breaking a non-Abelian gauge group by Wilson loops are separated in the space of gauge potentials by a finite energy barrier. An interpolating gauge configuration is then constructed between these vacua and shown to have minimum energy. Finally some implications of this construction are discussed.

  15. Construction of ground-state preserving sparse lattice models for predictive materials simulations

    NASA Astrophysics Data System (ADS)

    Huang, Wenxuan; Urban, Alexander; Rong, Ziqin; Ding, Zhiwei; Luo, Chuan; Ceder, Gerbrand

    2017-08-01

    First-principles based cluster expansion models are the dominant approach in ab initio thermodynamics of crystalline mixtures enabling the prediction of phase diagrams and novel ground states. However, despite recent advances, the construction of accurate models still requires a careful and time-consuming manual parameter tuning process for ground-state preservation, since this property is not guaranteed by default. In this paper, we present a systematic and mathematically sound method to obtain cluster expansion models that are guaranteed to preserve the ground states of their reference data. The method builds on the recently introduced compressive sensing paradigm for cluster expansion and employs quadratic programming to impose constraints on the model parameters. The robustness of our methodology is illustrated for two lithium transition metal oxides with relevance for Li-ion battery cathodes, i.e., Li2xFe2(1-x)O2 and Li2xTi2(1-x)O2, for which the construction of cluster expansion models with compressive sensing alone has proven to be challenging. We demonstrate that our method not only guarantees ground-state preservation on the set of reference structures used for the model construction, but also show that out-of-sample ground-state preservation up to relatively large supercell size is achievable through a rapidly converging iterative refinement. This method provides a general tool for building robust, compressed and constrained physical models with predictive power.

  16. Predictive modelling of Lactobacillus casei KN291 survival in fermented soy beverage.

    PubMed

    Zielińska, Dorota; Kołożyn-Krajewska, Danuta; Goryl, Antoni; Motyl, Ilona

    2014-02-01

    The aim of the study was to construct and verify predictive growth and survival models of a potentially probiotic bacterium in fermented soy beverage. The research material included natural soy beverage (Polgrunt, Poland) and the lactic acid bacteria (LAB) strain Lactobacillus casei KN291. To construct predictive models for the growth and survival of L. casei KN291 in the fermented soy beverage, we designed an experiment which allowed the collection of CFU data. Fermented soy beverage samples were stored at various temperatures (5, 10, 15, and 20°C) for 28 days. On the basis of the data obtained on the survival of L. casei KN291 in soy beverage under different temperature and time conditions, two non-linear models (r² = 0.68-0.93) and two surface models (r² = 0.76-0.79) were constructed; these models described the behaviour of the bacteria in the product to a satisfactory extent. Verification of the surface models was carried out using validation data collected at 7°C over 28 days. The models were found to be well fitted and subject only to small systematic errors, as evidenced by the accuracy factor (Af), bias factor (Bf), and mean squared error (MSE). The constructed growth and survival models of L. casei KN291 in fermented soy beverage enable estimation of the product's shelf life, which in this case is defined by the requirement that the level of the bacteria remain above 10⁶ CFU/cm³. The models may be useful as a tool for manufacturers of probiotic foods to estimate product shelf life.
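    The validation indices mentioned (Af and Bf) have standard definitions in predictive microbiology (Ross, 1996): Bf is the geometric mean of the prediction/observation ratios and Af the geometric mean of their absolute log-ratios. The sketch below uses illustrative counts, not the study's data:

    ```python
    import math

    def bias_accuracy_factors(observed, predicted):
        """Bias factor (Bf) and accuracy factor (Af), the standard
        validation indices in predictive microbiology (Ross, 1996)."""
        logs = [math.log10(p / o) for o, p in zip(observed, predicted)]
        n = len(logs)
        bf = 10 ** (sum(logs) / n)                 # systematic over/under-prediction
        af = 10 ** (sum(abs(x) for x in logs) / n) # average spread around the fit
        return bf, af

    # Hypothetical log-scale counts at 7°C (illustrative only; the study's
    # validation data are not reproduced here).
    obs  = [8.1, 7.9, 7.6, 7.2, 6.9]
    pred = [8.0, 7.8, 7.7, 7.1, 7.0]
    bf, af = bias_accuracy_factors(obs, pred)
    print(round(bf, 3), round(af, 3))  # values near 1.0 indicate a good fit
    ```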

  17. An alternative continence tube for continent urinary reservoirs: evaluation of surgical technique, pressure and continence study in an ex-vivo model.

    PubMed

    Honeck, Patrick; Michel, Maurice Stephan; Trojan, Lutz; Alken, Peter

    2009-02-01

    Despite the large number of surgical techniques for continent cutaneous diversion described in the literature, the creation of a reliable, continent and easily catheterizable continence mechanism remains a complex surgical procedure. The aim of this study was the evaluation of a new method for a catheterizable continence mechanism using stapled pig intestine. Small and large pig intestines were used for construction. A 3 or 6 cm double-row stapling system was used. Three variations using small and large intestine segments were constructed. A 3 or 6 cm long stapler line was placed alongside a 12 Fr catheter positioned at the antimesenterial side, creating a partially two-luminal segment. Construction time for the tube was measured. The created tube was then embedded into the pouch. Pressure evaluation of the continence mechanism was performed for each variation. Intermittent external manual compression was used to simulate sudden pressure exposure. All variations were 100% continent under filling volumes of up to 700 ml and pressure levels of 58 ± 6 cm H₂O for large intestine, and 266 ml and 87 ± 18 cm H₂O for small intestine, respectively. With further filling above the stated capacity, suture insufficiency occurred but no tube insufficiency. Construction time for all variations was less than 12 min. The described technique is an easy and fast method to construct a continence mechanism using small or large intestine, and showed sufficient continence in an ex-vivo model. Further investigations in an in-vivo model are needed to confirm these results.

  18. Time Delay Embedding Increases Estimation Precision of Models of Intraindividual Variability

    ERIC Educational Resources Information Center

    von Oertzen, Timo; Boker, Steven M.

    2010-01-01

    This paper investigates the precision of parameters estimated from local samples of time dependent functions. We find that "time delay embedding," i.e., structuring data prior to analysis by constructing a data matrix of overlapping samples, increases the precision of parameter estimates and in turn statistical power compared to standard…
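    The data matrix of overlapping samples that time delay embedding constructs can be sketched in a few lines; the series and embedding dimension below are arbitrary illustrations:

    ```python
    import numpy as np

    def time_delay_embed(x, dim):
        """Build the overlapping-samples data matrix used in time delay
        embedding: row i holds x[i], x[i+1], ..., x[i+dim-1]."""
        x = np.asarray(x)
        n = len(x) - dim + 1
        return np.array([x[i:i + dim] for i in range(n)])

    x = np.sin(np.linspace(0, 4 * np.pi, 50))  # a toy time series
    M = time_delay_embed(x, dim=5)
    print(M.shape)  # (46, 5): 46 overlapping rows of 5 consecutive samples
    ```

    Each local model is then fit to the rows of M rather than to isolated observations, which is where the gain in estimation precision comes from.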

  19. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.

  20. A Software Architecture for the Construction and Management of Real-Time Virtual Worlds

    DTIC Science & Technology

    1993-06-01

    University of California, Berkeley [FUNK92]. The second improvement was the addition of a radiosity light model. The use of radiosity and its use of diffuse...the viewpoint is stationary, the coarse polygon model is replaced by progressively more complex radiosity-lit scenes. The area of molecular modeling

  1. A Conceptual Model of Leisure-Time Choice Behavior.

    ERIC Educational Resources Information Center

    Bergier, Michel J.

    1981-01-01

    Methods of studying the gap between predisposition and actual behavior of consumers of spectator sports is discussed. A model is drawn from the areas of behavioral sciences, consumer behavior, and leisure research. The model is constructed around the premise that choice is primarily a function of personal, product, and environmental factors. (JN)

  2. A bi-objective model for robust yard allocation scheduling for outbound containers

    NASA Astrophysics Data System (ADS)

    Liu, Changchun; Zhang, Canrong; Zheng, Li

    2017-01-01

    This article examines the yard allocation problem for outbound containers, with consideration of uncertainty factors, mainly including the arrival and operation time of calling vessels. Based on the time buffer inserting method, a bi-objective model is constructed to minimize the total operational cost and to maximize the robustness of fighting against the uncertainty. Due to the NP-hardness of the constructed model, a two-stage heuristic is developed to solve the problem. In the first stage, initial solutions are obtained by a greedy algorithm that looks n-steps ahead with the uncertainty factors set as their respective expected values; in the second stage, based on the solutions obtained in the first stage and with consideration of uncertainty factors, a neighbourhood search heuristic is employed to generate robust solutions that can fight better against the fluctuation of uncertainty factors. Finally, extensive numerical experiments are conducted to test the performance of the proposed method.

  3. Construction of In Vivo Fluorescent Imaging of Echinococcus granulosus in a Mouse Model.

    PubMed

    Wang, Sibo; Yang, Tao; Zhang, Xuyong; Xia, Jie; Guo, Jun; Wang, Xiaoyi; Hou, Jixue; Zhang, Hongwei; Chen, Xueling; Wu, Xiangwei

    2016-06-01

    Human hydatid disease (cystic echinococcosis, CE) is a chronic parasitic infection caused by the larval stage of the cestode Echinococcus granulosus. As the disease mainly affects the liver, approximately 70% of all identified CE cases are detected in this organ. Optical molecular imaging (OMI), a noninvasive imaging technique, has never been used in vivo with the specific molecular markers of CE. Thus, we aimed to construct an in vivo fluorescent imaging mouse model of CE to locate and quantify the presence of the parasites within the liver noninvasively. Drug-treated protoscolices were monitored after marking by JC-1 dye in in vitro and in vivo studies. This work describes for the first time the successful construction of an in vivo model of E. granulosus in a small living experimental animal to achieve dynamic monitoring and observation of multiple time points of the infection course. Using this model, we quantified and analyzed labeled protoscolices based on the intensities of their red and green fluorescence. Interestingly, the ratio of red to green fluorescence intensity not only revealed the location of protoscolices but also determined the viability of the parasites in in vitro and in vivo tests. The noninvasive imaging model proposed in this work will be further studied for long-term detection and observation and may potentially be widely utilized in susceptibility testing and therapeutic effect evaluation.

  4. [Construction of the addiction prevention core competency model for preventing addictive behavior in adolescents].

    PubMed

    Park, Hyun Sook; Jung, Sun Young

    2013-12-01

    This study was done to provide fundamental data for the development of competency reinforcement programs to prevent addictive behavior in adolescents through the construction and examination of an addiction prevention core competency model. In this study core competencies for preventing addictive behavior in adolescents were identified through competency modeling, and the addiction prevention core competency model was developed and validated methodologically. Competencies for preventing addictive behavior in adolescents as defined by the addiction prevention core competency model are as follows: positive self-worth, self-control skill, time management skill, reality perception skill, risk coping skill, and positive communication with parents and with peers or social group. After construction, concurrent cross validation of the addiction prevention core competency model showed that this model was appropriate. The study results indicate that the addiction prevention core competency model can be used as a foundation for an integral approach to enhance adolescent competencies and prevent addictive behavior. This approach can be a school-centered, cost-efficient strategy which not only reduces addictive behavior in adolescents, but also improves the quality of their resources.

  5. Determination of sustainable values for the parameters of the construction of residential buildings

    NASA Astrophysics Data System (ADS)

    Grigoreva, Larisa; Grigoryev, Vladimir

    2018-03-01

    For the formation of housing construction programs and the planning of capital investments, norms or calculated indicators of the duration of construction of high-rise residential buildings and multifunctional complexes are mandatory when construction companies develop strategic plans. Determining stable values of the parameters of high-rise residential construction makes it possible to establish a reasonable construction duration at the planning and design stages of residential complexes, taking into account the influence of market conditions. The concept of forming enlarged models for high-rise residential construction is based on a realistic mapping in time and space of the most significant work stages and their organizational and technological interconnection: the preparatory period, the underground part, the above-ground part, external engineering networks, and landscaping. The total duration of construction of a residential building, depending on the duration of each stage and the degree of their overlap, can be determined by one of four proposed options. A unified approach to determining the overall duration of construction was developed on the basis of the principles of flow-line construction organization, with the results tested on the example of high-rise residential buildings of the typical I-155B series, and the coefficients for overlapping the work and the main stages of the building were determined.

  6. A Leisure Activities Curricular Component for Severely Handicapped Youth: Why and How.

    ERIC Educational Resources Information Center

    Voeltz, Luanna M.; Apffel, James A.

    1981-01-01

    A rationale for including a leisure time activities curriculum component in educational programing for severely handicapped individuals is presented. The importance of play and the constructive use of leisure time is described through the use of a model demonstration project. (JN)

  7. Oil Formation Volume Factor Determination Through a Fused Intelligence

    NASA Astrophysics Data System (ADS)

    Gholami, Amin

    2016-12-01

    Volume change of oil between reservoir condition and standard surface condition is called oil formation volume factor (FVF), which is very time, cost and labor intensive to determine. This study proposes an accurate, rapid and cost-effective approach for determining FVF from reservoir temperature, dissolved gas oil ratio, and specific gravity of both oil and dissolved gas. Firstly, structural risk minimization (SRM) principle of support vector regression (SVR) was employed to construct a robust model for estimating FVF from the aforementioned inputs. Subsequently, an alternating conditional expectation (ACE) was used for approximating optimal transformations of input/output data to a higher correlated data and consequently developing a sophisticated model between transformed data. Eventually, a committee machine with SVR and ACE was constructed through the use of hybrid genetic algorithm-pattern search (GA-PS). Committee machine integrates ACE and SVR models in an optimal linear combination such that makes benefit of both methods. A group of 342 data points was used for model development and a group of 219 data points was used for blind testing the constructed model. Results indicated that the committee machine performed better than individual models.

  8. Re-construction of action awareness depends on an internal model of action-outcome timing.

    PubMed

    Stenner, Max-Philipp; Bauer, Markus; Machts, Judith; Heinze, Hans-Jochen; Haggard, Patrick; Dolan, Raymond J

    2014-04-01

    The subjective time of an instrumental action is shifted towards its outcome. This temporal binding effect is partially retrospective, i.e., occurs upon outcome perception. Retrospective binding is thought to reflect post-hoc inference on agency based on sensory evidence of the action - outcome association. However, many previous binding paradigms cannot exclude the possibility that retrospective binding results from bottom-up interference of sensory outcome processing with action awareness and is functionally unrelated to the processing of the action - outcome association. Here, we keep bottom-up interference constant and use a contextual manipulation instead. We demonstrate a shift of subjective action time by its outcome in a context of variable outcome timing. Crucially, this shift is absent when there is no such variability. Thus, retrospective action binding reflects a context-dependent, model-based phenomenon. Such top-down re-construction of action awareness seems to bias agency attribution when outcome predictability is low. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Re-construction of action awareness depends on an internal model of action-outcome timing

    PubMed Central

    Stenner, Max-Philipp; Bauer, Markus; Machts, Judith; Heinze, Hans-Jochen; Haggard, Patrick; Dolan, Raymond J.

    2014-01-01

    The subjective time of an instrumental action is shifted towards its outcome. This temporal binding effect is partially retrospective, i.e., occurs upon outcome perception. Retrospective binding is thought to reflect post-hoc inference on agency based on sensory evidence of the action – outcome association. However, many previous binding paradigms cannot exclude the possibility that retrospective binding results from bottom-up interference of sensory outcome processing with action awareness and is functionally unrelated to the processing of the action – outcome association. Here, we keep bottom-up interference constant and use a contextual manipulation instead. We demonstrate a shift of subjective action time by its outcome in a context of variable outcome timing. Crucially, this shift is absent when there is no such variability. Thus, retrospective action binding reflects a context-dependent, model-based phenomenon. Such top-down re-construction of action awareness seems to bias agency attribution when outcome predictability is low. PMID:24555983

  10. Developmental Times of Chrysomya megacephala (Fabricius) (Diptera: Calliphoridae) at Constant Temperatures and Applications in Forensic Entomology.

    PubMed

    Yang, Yong-Qiang; Li, Xue-Bo; Shao, Ru-Yue; Lyu, Zhou; Li, Hong-Wei; Li, Gen-Ping; Xu, Lyu-Zi; Wan, Li-Hua

    2016-09-01

    The characteristic life stages of infesting blowflies (Calliphoridae) such as Chrysomya megacephala (Fabricius) are powerful evidence for estimating the death time of a corpse, but an established reference of developmental times for local blowfly species is required. We determined the developmental rates of C. megacephala from southwest China at seven constant temperatures (16-34°C). Isomegalen and isomorphen diagrams were constructed based on the larval length and time for each developmental event (first ecdysis, second ecdysis, wandering, pupariation, and eclosion), at each temperature. A thermal summation model was constructed by estimating the developmental threshold temperature D0 and the thermal summation constant K. The thermal summation model indicated that, for complete development from egg hatching to eclosion, D0 = 9.07 ± 0.54°C and K = 3991.07 ± 187.26 h °C. This reference can increase the accuracy of estimations of postmortem intervals in China by predicting the growth of C. megacephala. © 2016 American Academy of Forensic Sciences.
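    Using the reported constants, the thermal summation model K = t(T - D0) can be inverted to predict development time at any constant temperature above the threshold, t = K / (T - D0). The sketch below plugs in the study's egg-to-eclosion values; the query temperature is an arbitrary illustration:

    ```python
    D0 = 9.07      # developmental threshold temperature (°C), from the study
    K  = 3991.07   # thermal summation constant (h·°C), from the study

    def development_time(temp_c):
        """Thermal summation model: K = t * (T - D0), so the complete
        egg-to-eclosion time at constant temperature T is t = K / (T - D0)."""
        if temp_c <= D0:
            raise ValueError("no development at or below the threshold temperature")
        return K / (temp_c - D0)

    t = development_time(25.0)
    print(round(t, 1), "h =", round(t / 24, 1), "days")  # ≈ 250.5 h ≈ 10.4 days
    ```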

  11. Evaluation of an S-system root-finding method for estimating parameters in a metabolic reaction model.

    PubMed

    Iwata, Michio; Miyawaki-Kuwakado, Atsuko; Yoshida, Erika; Komori, Soichiro; Shiraishi, Fumihide

    2018-02-02

    In a mathematical model, estimation of parameters from time-series data of metabolic concentrations in cells is a challenging task. However, it seems that a promising approach for such estimation has not yet been established. Biochemical Systems Theory (BST) is a powerful methodology to construct a power-law type model for a given metabolic reaction system and to then characterize it efficiently. In this paper, we discuss the use of an S-system root-finding method (S-system method) to estimate parameters from time-series data of metabolite concentrations. We demonstrate that the S-system method is superior to the Newton-Raphson method in terms of the convergence region and iteration number. We also investigate the usefulness of a translocation technique and a complex-step differentiation method toward the practical application of the S-system method. The results indicate that the S-system method is useful to construct mathematical models for a variety of metabolic reaction networks. Copyright © 2018 Elsevier Inc. All rights reserved.
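    For contrast with the S-system approach, the Newton-Raphson baseline it is compared against can be sketched in one dimension; the power-law rate balance below is a toy steady-state problem in the spirit of BST models, not the paper's metabolic system:

    ```python
    def newton_raphson(f, fprime, x0, tol=1e-10, max_iter=100):
        """Generic 1-D Newton-Raphson iteration: repeatedly move to the
        root of the local tangent line until |f(x)| falls below tol."""
        x = x0
        for i in range(max_iter):
            fx = f(x)
            if abs(fx) < tol:
                return x, i
            x -= fx / fprime(x)
        raise RuntimeError("did not converge")

    # Toy power-law rate balance: production 2*x**0.5 equals degradation
    # 0.5*x at steady state, i.e. f(x) = 2*x**0.5 - 0.5*x = 0 (root x = 16).
    f  = lambda x: 2 * x ** 0.5 - 0.5 * x
    fp = lambda x: x ** -0.5 - 0.5
    root, its = newton_raphson(f, fp, x0=10.0)
    print(round(root, 6), its)
    ```

    As the abstract notes, convergence depends strongly on the starting point x0; the S-system method is reported to have a wider convergence region.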

  12. Time prediction of failure a type of lamps by using general composite hazard rate model

    NASA Astrophysics Data System (ADS)

    Riaman; Lesmana, E.; Subartini, B.; Supian, S.

    2018-03-01

    This paper discusses basic survival model estimation to obtain the average predicted value of lamp failure time. The estimate is for a parametric model, the general composite hazard rate model. The underlying random-time model is the exponential distribution, used as the base model, which has a constant hazard function. In this case, we discuss an example of survival model estimation for a composite hazard function, using an exponential model as its base. The model is estimated by estimating its parameters through the construction of the survival function and the empirical cumulative distribution function. The resulting model is then used to predict the average failure time for the type of lamp. By grouping the data into several intervals and taking the average failure value at each interval, the average failure time of the model is calculated on each interval; the p-value obtained from the test result is 0.3296.
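    With an exponential base model, parameter estimation and the survival function take a particularly simple form: the maximum-likelihood rate is the reciprocal of the sample mean, and S(t) = exp(-λt). The failure times below are hypothetical, purely for illustration:

    ```python
    import math

    # Hypothetical lamp failure times in hours (illustrative only).
    failure_times = [420.0, 510.0, 380.0, 600.0, 450.0, 530.0, 490.0, 470.0]

    # For an exponential model the MLE of the rate λ is 1 / (sample mean),
    # the hazard is the constant λ, and the survival function is S(t) = exp(-λt).
    mean_time = sum(failure_times) / len(failure_times)
    lam = 1.0 / mean_time

    def survival(t):
        """Probability a lamp is still working at time t."""
        return math.exp(-lam * t)

    print(round(mean_time, 1))            # predicted average failure time
    print(round(survival(mean_time), 3))  # S at the mean is exp(-1) ≈ 0.368
    ```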

  13. The experience of traumatic events disrupts the measurement invariance of a posttraumatic stress scale.

    PubMed

    Lommen, Miriam J J; van de Schoot, Rens; Engelhard, Iris M

    2014-01-01

    Studies that include multiple assessments of a particular instrument within the same population are based on the presumption that this instrument measures the same construct over time. But what if the meaning of the construct changes over time due to one's experiences? For example, the experience of a traumatic event can influence one's view of the world, others, and self, and may disrupt the stability of a questionnaire measuring posttraumatic stress symptoms (i.e., it may affect the interpretation of items). Nevertheless, assessments before and after such a traumatic event are crucial to study longitudinal development of posttraumatic stress symptoms. In this study, we examined measurement invariance of posttraumatic stress symptoms in a sample of Dutch soldiers before and after they went on deployment to Afghanistan (N = 249). Results showed that the underlying measurement model before deployment was different from the measurement model after deployment due to non-invariant item thresholds. These results were replicated in a sample of soldiers deployed to Iraq (N = 305). Since the lack of measurement invariance was due to instability of the majority of the items, it seems reasonable to conclude that the underlying construct of PSS is unstable over time if war-zone related traumatic events occur in between measurements. From a statistical point of view, the scores over time cannot be compared when there is a lack of measurement invariance. The main message of this paper is that researchers working with posttraumatic stress questionnaires in longitudinal studies should not take measurement invariance for granted, but should use pre- and post-symptom scores as different constructs for each time point in the analysis.

  14. Nonparametric model validations for hidden Markov models with applications in financial econometrics

    PubMed Central

    Zhao, Zhibiao

    2011-01-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601

  15. The construction of a two-dimensional reproducing kernel function and its application in a biomedical model.

    PubMed

    Guo, Qi; Shen, Shu-Ting

    2016-04-29

    There are two major classes of cardiac tissue models: the ionic model and the FitzHugh-Nagumo model. During computer simulation, each model entails solving a system of complex ordinary differential equations and a partial differential equation with non-flux boundary conditions. The reproducing kernel method has significant applications in solving partial differential equations. The derivative of the reproducing kernel function is a wavelet function, which has local properties and sensitivity to singularities; study of the application of reproducing kernels is therefore advantageous. The objective is to apply new mathematical theory to the numerical solution of the ventricular muscle model so as to improve its precision in comparison with other current methods. A two-dimensional reproducing kernel function in space is constructed and applied in computing the solution of the two-dimensional cardiac tissue model, using the difference method in time and the reproducing kernel method in space. Compared with other methods, this method holds several advantages such as high accuracy in computing solutions, insensitivity to different time steps and a slow propagation speed of error. It is suitable for disorderly scattered node systems without meshing, and can arbitrarily change the location and density of the solution on different time layers. The reproducing kernel method has higher solution accuracy and stability in the solutions of the two-dimensional cardiac tissue model.

  16. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  17. The Impact of Time Delay on the Content of Discussions at a Computer-Mediated Conference

    NASA Astrophysics Data System (ADS)

    Huntley, Byron C.; Thatcher, Andrew

    2008-11-01

    This study investigates the relationship between the content of computer-mediated discussions and the time delay between online postings. The study aims to broaden understanding of the dynamics of computer-mediated discussion, specifically the relationship between time delay and the actual content of the discussions (knowledge construction, social aspects, word count and number of postings), which has barely been researched. The computer-mediated discussions of the CybErg 2005 virtual conference served as the sample for this study. The Interaction Analysis Model [1] was utilized to analyze the level of knowledge construction in the content of the computer-mediated discussions. Correlations were computed for all combinations of the variables. The results demonstrate that knowledge construction, social aspects and the number of words generated within postings were independent of, and not affected by, the time delay between the postings and the posting from which the reply was formulated. When greater numbers of words were utilized within postings, this was typically associated with a greater level of knowledge construction. Social aspects in the discussion were found to neither advantage nor disadvantage the overall effectiveness of the computer-mediated discussion.

  18. An Integrated Framework for Process-Driven Model Construction in Disease Ecology and Animal Health

    PubMed Central

    Mancy, Rebecca; Brock, Patrick M.; Kao, Rowland R.

    2017-01-01

    Process models that focus on explicitly representing biological mechanisms are increasingly important in disease ecology and animal health research. However, the large number of process modelling approaches makes it difficult to decide which is most appropriate for a given disease system and research question. Here, we discuss different motivations for using process models and present an integrated conceptual analysis that can be used to guide the construction of infectious disease process models and comparisons between them. Our presentation complements existing work by clarifying the major differences between modelling approaches and their relationship with the biological characteristics of the epidemiological system. We first discuss distinct motivations for using process models in epidemiological research, identifying the key steps in model design and use associated with each. We then present a conceptual framework for guiding model construction and comparison, organised according to key aspects of epidemiological systems. Specifically, we discuss the number and type of disease states, whether to focus on individual hosts (e.g., cows) or groups of hosts (e.g., herds or farms), how space or host connectivity affects disease transmission, whether demographic and epidemiological processes are periodic or can occur at any time, and the extent to which stochasticity is important. We use foot-and-mouth disease and bovine tuberculosis in cattle to illustrate our discussion and support explanations of cases in which different models are used to address similar problems. The framework should help those constructing models to structure their approach to modelling decisions and facilitate comparisons between models in the literature. PMID:29021983
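    As a concrete instance of the modelling choices discussed above (discrete disease states, events occurring at any time, stochasticity), here is a minimal stochastic SIR simulation via Gillespie's algorithm; the rates and population are hypothetical, and the framework itself prescribes no particular code.

```python
import random

# Minimal Gillespie simulation of a stochastic SIR process model: three
# disease states, events at arbitrary (exponentially distributed) times,
# and demographic stochasticity.  Rates and population are hypothetical.
def gillespie_sir(S, I, R, beta=0.3, gamma=0.1, t_end=200.0, seed=1):
    rng = random.Random(seed)
    t, N = 0.0, S + I + R
    while t < t_end and I > 0:
        rate_inf = beta * S * I / N          # S -> I (transmission)
        rate_rec = gamma * I                 # I -> R (recovery)
        total = rate_inf + rate_rec
        t += rng.expovariate(total)          # waiting time to next event
        if rng.random() < rate_inf / total:
            S, I = S - 1, I + 1
        else:
            I, R = I - 1, R + 1
    return S, I, R

final_S, final_I, final_R = gillespie_sir(990, 10, 0)
print(f"final sizes: S={final_S} I={final_I} R={final_R}")
```

Swapping the two event rates for, say, herd-level or spatially structured rates changes the model class without changing the simulation loop, which is the kind of comparison the framework organises.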

  19. An Integrated Framework for Process-Driven Model Construction in Disease Ecology and Animal Health.

    PubMed

    Mancy, Rebecca; Brock, Patrick M; Kao, Rowland R

    2017-01-01

    Process models that focus on explicitly representing biological mechanisms are increasingly important in disease ecology and animal health research. However, the large number of process modelling approaches makes it difficult to decide which is most appropriate for a given disease system and research question. Here, we discuss different motivations for using process models and present an integrated conceptual analysis that can be used to guide the construction of infectious disease process models and comparisons between them. Our presentation complements existing work by clarifying the major differences between modelling approaches and their relationship with the biological characteristics of the epidemiological system. We first discuss distinct motivations for using process models in epidemiological research, identifying the key steps in model design and use associated with each. We then present a conceptual framework for guiding model construction and comparison, organised according to key aspects of epidemiological systems. Specifically, we discuss the number and type of disease states, whether to focus on individual hosts (e.g., cows) or groups of hosts (e.g., herds or farms), how space or host connectivity affects disease transmission, whether demographic and epidemiological processes are periodic or can occur at any time, and the extent to which stochasticity is important. We use foot-and-mouth disease and bovine tuberculosis in cattle to illustrate our discussion and support explanations of cases in which different models are used to address similar problems. The framework should help those constructing models to structure their approach to modelling decisions and facilitate comparisons between models in the literature.

  20. Timing of Gestures: Gestures Anticipating or Simultaneous with Speech as Indexes of Text Comprehension in Children and Adults

    ERIC Educational Resources Information Center

    Ianì, Francesco; Cutica, Ilaria; Bucciarelli, Monica

    2017-01-01

    The deep comprehension of a text is tantamount to the construction of an articulated mental model of that text. The number of correct recollections is an index of a learner's mental model of a text. We assume that another index of comprehension is the timing of the gestures produced during text recall; gestures are simultaneous with speech when…

  1. Nonstationary time series prediction combined with slow feature analysis

    NASA Astrophysics Data System (ADS)

    Wang, G.; Chen, X.

    2015-07-01

    Almost all climate time series have some degree of nonstationarity due to external driving forces perturbing the observed system. Therefore, these external driving forces should be taken into account when constructing the climate dynamics. This paper presents a new technique for obtaining the driving forces of a time series via the slow feature analysis (SFA) approach, and then introduces them into a predictive model to predict nonstationary time series. The basic idea of the technique is to treat the driving forces as state variables and to incorporate them into the predictive model. Experiments using a modified logistic time series and winter ozone data from Arosa, Switzerland, were conducted to test the model. The results showed improved prediction skill.
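    A minimal sketch of the SFA idea, under the simplifying assumption of a linear mixture (the paper's full procedure is not reproduced): whiten the observed signals, then extract the direction whose time derivative has least variance, which recovers a slowly varying driving force. The synthetic signals are illustrative.

```python
import numpy as np

# Minimal linear slow feature analysis (SFA): whiten the signals, then find
# the unit direction whose temporal derivative has the least variance.
t = np.linspace(0, 2 * np.pi, 4000)
slow = np.sin(t)                       # hidden slow driving force
fast = np.sin(29 * t + 1.0)            # fast nuisance signal
X = np.column_stack([slow + 0.6 * fast, 1.3 * slow - 0.4 * fast])

Xc = X - X.mean(axis=0)
# Whitening: rotate and rescale so the sample covariance is the identity.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = U * np.sqrt(len(t))                # whitened signals

dZ = np.diff(Z, axis=0)                # discrete time derivative
w, V = np.linalg.eigh(dZ.T @ dZ)       # smallest eigenvalue = slowest direction
slow_feature = Z @ V[:, 0]

# The extracted feature should be (anti)correlated with the true slow force.
corr = abs(np.corrcoef(slow_feature, slow)[0, 1])
print(f"|correlation with true slow driving force|: {corr:.3f}")
```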

  2. Development of a decision model for selection of appropriate timely delivery techniques for highway projects : final report, April 2009.

    DOT National Transportation Integrated Search

    2009-04-01

    The primary umbrella method used by the Oregon Department of Transportation (ODOT) to ensure on-time performance in standard construction contracting is liquidated damages. The assessment value is usually a matter of some judgment. In practice,...

  3. Hands On Earth Science.

    ERIC Educational Resources Information Center

    Weisgarber, Sherry L.; Van Doren, Lisa; Hackathorn, Merrianne; Hannibal, Joseph T.; Hansgen, Richard

    This publication is a collection of 13 hands-on activities that focus on earth science-related activities and involve students in learning about growing crystals, tectonics, fossils, rock and minerals, modeling Ohio geology, geologic time, determining true north, and constructing scale-models of the Earth-moon system. Each activity contains…

  4. Fast Mix Table Construction for Material Discretization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Seth R

    2013-01-01

    An effective hybrid Monte Carlo-deterministic implementation typically requires the approximation of a continuous geometry description with a discretized piecewise-constant material field. The inherent geometry discretization error can be reduced somewhat by using material mixing, where multiple materials inside a discrete mesh voxel are homogenized. Material mixing requires the construction of a "mix table," which stores the volume fractions in every mixture so that multiple voxels with similar compositions can reference the same mixture. Mix table construction is a potentially expensive serial operation for large problems with many materials and voxels. We formulate an efficient algorithm to construct a sparse mix table in $O(\text{number of voxels} \times \log(\text{number of mixtures}))$ time. The new algorithm is implemented in ADVANTG and used to discretize continuous geometries onto a structured Cartesian grid. When applied to an end-of-life MCNP model of the High Flux Isotope Reactor with 270 distinct materials, the new method improves the material mixing time by a factor of 100 compared to a naive mix table implementation.
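    The mix-table idea can be sketched as follows. This toy version keys a hash map on rounded, sorted volume-fraction tuples so that voxels with identical compositions share one mixture entry; a sorted-array/bisection variant would give the O(number of voxels × log number of mixtures) bound quoted in the abstract. Material names and the rounding tolerance are hypothetical.

```python
# Sketch of a mix table: voxels whose (rounded) material volume fractions
# match reference a single shared mixture entry.
def build_mix_table(voxel_fracs, ndigits=6):
    mixtures = []                 # unique mixtures, in order of appearance
    index = {}                    # mixture key -> mixture id
    voxel_ids = []                # per-voxel reference into `mixtures`
    for fracs in voxel_fracs:
        key = tuple(sorted((m, round(v, ndigits)) for m, v in fracs.items()))
        if key not in index:
            index[key] = len(mixtures)
            mixtures.append(dict(fracs))
        voxel_ids.append(index[key])
    return mixtures, voxel_ids

voxels = [{"fuel": 0.75, "water": 0.25},
          {"water": 0.25, "fuel": 0.75},     # same mixture, different order
          {"steel": 1.0}]
mixtures, ids = build_mix_table(voxels)
print(mixtures, ids)
```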

  5. Development of a semi-automated model identification and calibration tool for conceptual modelling of sewer systems.

    PubMed

    Wolfs, Vincent; Villazon, Mauricio Florencio; Willems, Patrick

    2013-01-01

    Applications such as real-time control, uncertainty analysis and optimization require an extensive number of model iterations. Full hydrodynamic sewer models are not sufficient for these applications due to the excessive computation time. Simplifications are consequently required. A lumped conceptual modelling approach results in a much faster calculation. The process of identifying and calibrating the conceptual model structure can, however, be time-consuming. Moreover, many conceptual models lack accuracy, or do not account for backwater effects. To overcome these problems, a modelling methodology was developed which is suited for semi-automatic calibration. The methodology is tested for the sewer system of the city of Geel in the Grote Nete river basin in Belgium, using both synthetic design storm events and long time series of rainfall input. A MATLAB/Simulink(®) tool was developed to guide the modeller through the step-wise model construction, significantly reducing the time required for the conceptual modelling process.
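    A lumped conceptual element of the kind referred to above can be as simple as a single linear reservoir, S' = inflow − S/k, stepped explicitly; real conceptual sewer models chain several such elements and add backwater handling. The parameters and the synthetic storm below are hypothetical.

```python
# One lumped conceptual element: a linear reservoir S' = inflow - S/k,
# integrated with an explicit Euler step.  Parameters are hypothetical.
def linear_reservoir(inflow, k=30.0, dt=1.0, s0=0.0):
    storage, outflow = s0, []
    for q_in in inflow:
        q_out = storage / k            # outflow proportional to storage
        storage += dt * (q_in - q_out)
        outflow.append(q_out)
    return outflow

rain_burst = [5.0] * 20 + [0.0] * 100      # simple synthetic storm
q = linear_reservoir(rain_burst)
print(f"peak outflow: {max(q):.2f}, final outflow: {q[-1]:.4f}")
```

The attenuation and delayed recession of `q` relative to the input burst is exactly the behaviour such lumped elements are calibrated to reproduce against the full hydrodynamic model.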

  6. Mobility timing for agent communities, a cue for advanced connectionist systems.

    PubMed

    Apolloni, Bruno; Bassis, Simone; Pagani, Elena; Rossi, Gian Paolo; Valerio, Lorenzo

    2011-12-01

    We introduce a wait-and-chase scheme that models the contact times between moving agents within a connectionist construct. The idea that elementary processors move within a network to get a proper position is borne out both by biological neurons in brain morphogenesis and by agents within social networks. From the former, we take inspiration to devise a medium-term project for new artificial neural network training procedures where mobile neurons exchange data only when they are close to one another in a proper space (are in contact). From the latter, we draw on accumulated experience with mobility tracks. We focus on the preliminary step of characterizing the elapsed time between neuron contacts, which results from a spatial process fitting in the family of random processes with memory, where chasing neurons are stochastically driven by the goal of hitting target neurons. Thus, we add an unprecedented mobility model to the literature in the field, introducing a distribution law of the intercontact times that merges features of both the negative exponential and the Pareto distribution laws. We give a constructive description and implementation of our model, as well as a short analytical form whose parameters are suitably estimated in terms of confidence intervals from experimental data. Numerical experiments show the model and related inference tools to be sufficiently robust to cope with two main requisites for its exploitation in a neural network: the nonindependence of the observed intercontact times and the feasibility of the model inversion problem to infer suitable mobility parameters.
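    One generic way to obtain a law merging negative-exponential and Pareto features, sketched here as an illustration rather than the authors' exact model: take the minimum of an independent Lomax (Pareto II) draw and an exponential draw, whose survival function is the product (1 + t/b)^(−a) · exp(−t/τ), i.e. a power-law body with an exponential cutoff. All parameters are hypothetical.

```python
import random

# Intercontact times as the minimum of an independent Lomax (Pareto II)
# and an exponential draw; the survival function is then
#   S(t) = (1 + t/b)**(-a) * exp(-t/tau),
# a power-law body with an exponential cutoff.  Parameters are hypothetical.
rng = random.Random(7)

def intercontact(a=1.5, b=1.0, tau=50.0):
    u = 1.0 - rng.random()                   # u in (0, 1]
    lomax = b * (u ** (-1.0 / a) - 1.0)      # inverse CDF of the Lomax law
    expo = rng.expovariate(1.0 / tau)
    return min(lomax, expo)

samples = [intercontact() for _ in range(20000)]
mean_t = sum(samples) / len(samples)
tail = sum(s > 3 * mean_t for s in samples) / len(samples)
print(f"mean intercontact time {mean_t:.2f}, P(T > 3*mean) = {tail:.3f}")
```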

  7. Patterns of Response Times and Response Choices to Science Questions: The Influence of Relative Processing Time

    ERIC Educational Resources Information Center

    Heckler, Andrew F.; Scaife, Thomas M.

    2015-01-01

    We report on five experiments investigating response choices and response times to simple science questions that evoke student "misconceptions," and we construct a simple model to explain the patterns of response choices. Physics students were asked to compare a physical quantity represented by the slope, such as speed, on simple physics…

  8. Modeling elastic wave propagation in kidney stones with application to shock wave lithotripsy.

    PubMed

    Cleveland, Robin O; Sapozhnikov, Oleg A

    2005-10-01

    A time-domain finite-difference solution to the equations of linear elasticity was used to model the propagation of lithotripsy waves in kidney stones. The model was used to determine the loading on the stone (principal stresses and strains and maximum shear stresses and strains) due to the impact of lithotripsy shock waves. The simulations show that the peak loading induced in kidney stones is generated by constructive interference of shear waves launched from the outer edge of the stone with other waves in the stone. Notably, the shear-wave-induced loads were significantly larger than the loads generated by the classic Hopkinson or spall effect. For simulations where the diameter of the focal spot of the lithotripter was smaller than that of the stone, the loading decreased by more than 50%. The constructive interference was also sensitive to shock rise time: the peak tensile stress fell by 30% as the rise time increased from 25 to 150 ns. These results demonstrate that shear waves likely play a critical role in stone comminution and that lithotripters with large focal widths and short rise times should be effective at generating high stresses inside kidney stones.
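    The solver described above is three-dimensional and fully elastic; its core scheme can be illustrated in one dimension with a staggered-grid velocity-stress update. This is a hypothetical scaled-down setup with made-up units and source, not the authors' configuration.

```python
import numpy as np

# 1-D staggered-grid velocity-stress finite-difference scheme, a scaled-down
# analogue of a time-domain elastic solver; units and source are made up.
nx, nt = 400, 600
dx, dt = 1.0, 0.4
rho, c = 1.0, 2.0                    # Courant number c*dt/dx = 0.8 < 1
mu = rho * c * c                     # elastic modulus

v = np.zeros(nx)                     # particle velocity at integer nodes
s = np.zeros(nx - 1)                 # stress at half-integer nodes
rec = []                             # receiver trace recorded at node 350
for n in range(nt):
    v[1:-1] += dt / (rho * dx) * (s[1:] - s[:-1])   # velocity update
    v[50] += np.exp(-(((n * dt) - 20.0) / 5.0) ** 2)  # injected source pulse
    s += dt * mu / dx * (v[1:] - v[:-1])            # stress update
    rec.append(v[350])

peak = max(abs(r) for r in rec)
print(f"peak |velocity| at the receiver: {peak:.3f}")
```

The leapfrog interleaving of velocity and stress updates is what keeps the scheme stable at Courant numbers up to one; the 3-D elastic case adds shear components but follows the same pattern.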

  9. Learning Models and Real-Time Speech Recognition.

    ERIC Educational Resources Information Center

    Danforth, Douglas G.; And Others

    This report describes the construction and testing of two "psychological" learning models for the purpose of computer recognition of human speech over the telephone. One of the two models was found to be superior in all tests. A regression analysis yielded a 92.3% recognition rate for 14 subjects ranging in age from 6 to 13 years. Tests…

  10. On the Origin and Evolution of Stellar Chromospheres, Coronae and Winds

    NASA Technical Reports Server (NTRS)

    Musielak, Z. E.

    2000-01-01

    This grant was awarded by NASA to The University of Alabama in Huntsville (UAH) to construct state-of-the-art, theoretical, two-component chromospheric models for single stars of different spectral types and different evolutionary status. In our proposal, we suggested using these models to predict the level of the "basal flux", the observed range of variation of chromospheric activity for a given spectral type, and the decrease of this activity with stellar age. In addition, for red giants and supergiants, we also proposed to construct self-consistent, purely theoretical wind models, and to use these models to investigate the origin of "dividing lines" in the H-R diagram. In the following, we describe our completed work. We have accomplished the first main goal of our proposal by constructing the first purely theoretical, time-dependent and two-component models of stellar chromospheres. The models require specifying only three basic stellar parameters, namely the effective temperature, gravity and rotation rate, and they take into account non-magnetic and magnetic regions in stellar chromospheres. The non-magnetic regions are heated by acoustic waves generated by the turbulent convection in the stellar subphotospheric layers. The magnetic regions are identified with magnetic flux tubes uniformly distributed over the entire stellar surface and are heated by longitudinal tube waves generated by turbulent motions in the subphotospheric and photospheric layers. The coverage of the stellar surface by magnetic regions (the so-called filling factor) is estimated for a given rotation rate from an observational relationship. The constructed models are time-dependent and are based on the energy balance between the amount of mechanical energy supplied by waves and radiative losses in the strong Ca II and Mg II emission lines. To calculate the amount of wave energy in the non-magnetic regions, we have used the Lighthill-Stein theory of sound generation.

  11. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open-source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience, representing the structure of statistical interrelationships in large data sets of time series, and, subsequently, the investigation of this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
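    One of the constructions mentioned above, the visibility graph, is simple enough to sketch directly: two samples of a scalar time series are linked when the straight line between them clears every intermediate sample. This is a pure-Python O(n²) illustration, not pyunicorn's own API or implementation.

```python
# Natural visibility graph: link samples i and j when the straight line
# between (i, x[i]) and (j, x[j]) passes above every intermediate sample.
def visibility_edges(x):
    edges = []
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.append((i, j))
    return edges

series = [3.0, 1.0, 2.2, 0.5, 4.0, 1.5]
edges = visibility_edges(series)
print(edges)
```

The resulting graph inherits properties of the series (e.g. peaks become hubs), which is what makes network measures applicable to time series analysis.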

  12. Multi-scale modeling of tsunami flows and tsunami-induced forces

    NASA Astrophysics Data System (ADS)

    Qin, X.; Motley, M. R.; LeVeque, R. J.; Gonzalez, F. I.

    2016-12-01

    The modeling of tsunami flows and tsunami-induced forces in coastal communities that incorporates the constructed environment is challenging for many numerical modelers because of the scale and complexity of the physical problem. A two-dimensional (2D) depth-averaged model can be efficient for modeling waves offshore but may not be accurate enough to predict the complex flow, with its transient variation in the vertical direction, around constructed environments on land. On the other hand, using a more complex three-dimensional model is much more computationally expensive and can become impractical due to the size of the problem and the meshing requirements near the built environment. In this study, a 2D depth-integrated model and a 3D Reynolds-averaged Navier-Stokes (RANS) model are built to model a 1:50 model-scale, idealized community, representative of Seaside, OR, USA, for which existing experimental data are available for comparison. Numerical results from the two models are compared with each other as well as with experimental measurements. Both models predict the flow parameters (water level, velocity, and momentum flux in the vicinity of the buildings) accurately in general, except for the time period near the initial impact, where the depth-averaged model can fail to capture the complexities in the flow. Forces predicted using direct integration of the predicted pressure on structural surfaces from the 3D model and using the momentum flux from the 2D model with the constructed environment are compared, which indicates that force prediction from the 2D model is not always reliable in such a complicated case. Force predictions from integration of the pressure are also compared with forces predicted from bare-earth momentum flux calculations to reveal the importance of incorporating the constructed environment in force prediction models.

  13. Selection of optimal complexity for ENSO-EMR model by minimum description length principle

    NASA Astrophysics Data System (ADS)

    Loskutov, E. M.; Mukhin, D.; Mukhina, A.; Gavrilov, A.; Kondrashov, D. A.; Feigin, A. M.

    2012-12-01

    One of the main problems arising in modeling data taken from a natural system is finding a phase space suitable for construction of the evolution operator model. Since we usually deal with strongly high-dimensional behavior, we are forced to construct a model working in some projection of the system phase space corresponding to the time scales of interest. Selection of the optimal projection is a non-trivial problem, since there are many ways to reconstruct phase variables from a given time series, especially in the case of a spatio-temporal data field. Actually, finding the optimal projection is a significant part of model selection because, on the one hand, the transformation of data to some phase variable vector can be considered a required component of the model. On the other hand, such an optimization of the phase space makes sense only in relation to the parametrization of the model we use, i.e. the representation of the evolution operator, so we should find an optimal structure of the model together with the phase variable vector. In this paper we propose to use the principle of minimum description length (Molkov et al., 2009) for selecting models of optimal complexity. The proposed method is applied to optimization of the Empirical Model Reduction (EMR) of the ENSO phenomenon (Kravtsov et al., 2005; Kondrashov et al., 2005). This model operates within a subset of leading EOFs constructed from the spatio-temporal field of SST in the Equatorial Pacific, and has the form of multi-level stochastic differential equations (SDEs) with polynomial parameterization of the right-hand side. Optimal values for the number of EOFs, the order of the polynomial and the number of levels are estimated from the Equatorial Pacific SST dataset. References: Ya. Molkov, D. Mukhin, E. Loskutov, G. Fidelin and A. Feigin, Using the minimum description length principle for global reconstruction of dynamic systems from noisy time series, Phys. Rev. E, Vol. 80, P. 046207, 2009. Kravtsov S, Kondrashov D, Ghil M, 2005: Multilevel regression modeling of nonlinear processes: Derivation and applications to climatic variability. J. Climate, 18 (21): 4404-4424. D. Kondrashov, S. Kravtsov, A. W. Robertson and M. Ghil, 2005: A hierarchy of data-based ENSO models. J. Climate, 18, 4425-4444.
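    The selection principle can be illustrated with a toy two-part code: among candidate polynomial orders, choose the one minimising DL(k) = (n/2)·log(RSS_k/n) + (k/2)·log n, so that extra parameters must pay for themselves in improved fit, in the spirit of the MDL criterion above. The data and candidate orders are synthetic, not the ENSO-EMR setting.

```python
import numpy as np

# Toy MDL-style order selection: two-part description length
#   DL(k) = (n/2) * log(RSS_k / n) + (k/2) * log(n)
# for polynomial fits of increasing order to synthetic noisy cubic data.
rng = np.random.default_rng(3)
x = np.linspace(-1.0, 1.0, 200)
y = 1.0 - 2.0 * x + 0.5 * x**3 + 0.05 * rng.standard_normal(x.size)

def description_length(order):
    coef = np.polyfit(x, y, order)
    rss = ((np.polyval(coef, x) - y) ** 2).sum()
    k = order + 1                             # number of fitted parameters
    return 0.5 * x.size * np.log(rss / x.size) + 0.5 * k * np.log(x.size)

best = min(range(1, 9), key=description_length)
print(f"order selected by minimum description length: {best}")
```

In the EMR setting the same trade-off runs over the number of EOFs, the polynomial order and the number of levels simultaneously.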

  14. Robust sensor fault detection and isolation of gas turbine engines subjected to time-varying parameter uncertainties

    NASA Astrophysics Data System (ADS)

    Pourbabaee, Bahareh; Meskin, Nader; Khorasani, Khashayar

    2016-08-01

    In this paper, a novel robust sensor fault detection and isolation (FDI) strategy using the multiple model-based (MM) approach is proposed that remains robust with respect to both time-varying parameter uncertainties and process and measurement noise in all channels. The scheme is composed of robust Kalman filters (RKF) designed for multiple piecewise linear (PWL) models constructed at various operating points of an uncertain nonlinear system. The parameter uncertainty is modeled by a time-varying norm-bounded admissible structure that affects all the PWL state-space matrices. The robust Kalman filter gain matrices are designed by solving two algebraic Riccati equations (AREs) that are expressed as two linear matrix inequality (LMI) feasibility conditions. The proposed multiple-RKF-based FDI scheme is simulated for a single-spool gas turbine engine to diagnose various sensor faults despite the presence of parameter uncertainties and process and measurement noise. Our comparative studies confirm the superiority of the proposed FDI method over methods available in the literature.

  15. Application of 3D Laser Scanning Technology in Complex Rock Foundation Design

    NASA Astrophysics Data System (ADS)

    Junjie, Ma; Dan, Lu; Zhilong, Liu

    2017-12-01

    Taking the complex landform of the Tanxi Mountain Landscape Bridge as an example, the application of 3D laser scanning technology in the mapping of complex rock foundations is studied in this paper. A set of 3D laser scanning techniques is developed and several key engineering problems are solved. The first is 3D laser scanning of complex landforms: 3D laser scanning is used to obtain a complete 3D point cloud data model of the complex landform, and the detailed and accurate surveying results reduce the measuring time and the number of supplementary surveys. The second is 3D collaborative modeling of the complex landform: a 3D model of the complex landform is established from the 3D point cloud data model, and the superstructure foundation model is introduced for 3D collaborative design, so that the optimal design plan is selected and construction progress is accelerated. The last is finite-element analysis of the complex landform foundation: the 3D model of the complex landform is imported into ANSYS to build a finite element model for calculating the anti-sliding stability of the rock, providing a basis for the foundation design and construction.

  16. Space manufacturing in the construction of solar power satellites

    NASA Astrophysics Data System (ADS)

    Ruth, J.; Westphal, W.

    This paper deals with ongoing research work concerning the energy budget and cost of the solar Satellite Power System (SPS). The fundamental model of such a total system, including ground and space facilities, transportation vehicles, power satellites and rectennas, is presented. The main purpose of this model is to examine the applicability of different construction scenarios to allow comparison under nearly identical constraints. Using this model, in a first attempt the blankets, the main part of the space segment by weight, energy investment and cost, are chosen as representative for the energy and cost comparison of two construction alternatives of the same SPS concept. These construction alternatives are defined by ground-based versus space-based manufacturing of the solar blankets, while all other subsystems, operations and transportation profiles are kept the same. It can be shown that the energy "payback" time depends not only on the SPS concept selected but also very much on the construction and implementation scenario. The cost comparison of these alternative approaches shows no very significant differences, but favors the space manufacturing option, with potentially larger differences for a less conservative approach that exploits the benefits of space manufacturing, for example considerable mass savings in space. Some preliminary results are discussed and an outlook is given on the next steps to be investigated, comprising the extension of the fundamental model to include the use of lunar raw materials.

  17. Effects of a Brief Psychoeducational Intervention for Family Conflict: Constructive Conflict, Emotional Insecurity and Child Adjustment.

    PubMed

    Miller-Graff, Laura E; Cummings, E Mark; Bergman, Kathleen N

    2016-10-01

    The role of emotional security in promoting positive adjustment following exposure to marital conflict has been identified in a large number of empirical investigations, yet to date, no interventions have explicitly addressed the processes that predict child adjustment after marital conflict. The current study evaluated a randomized controlled trial of a family intervention program aimed at promoting constructive marital conflict behaviors thereby increasing adolescent emotional security and adjustment. Families (n = 225) were randomized into 1 of 4 conditions: Parent-Adolescent (n = 75), Parent-Only (n = 75), Self-Study (n = 38) and No Treatment (n = 37). Multi-informant and multi-method assessments were conducted at baseline, post-treatment and 6-month follow-up. Effects of treatment on destructive and constructive conflict behaviors were evaluated using multilevel models where observations were nested within individuals over time. Process models assessing the impact of constructive and destructive conflict behaviors on emotional insecurity and adolescent adjustment were evaluated using path modeling. Results indicated that the treatment was effective in increasing constructive conflict behaviors (d = 0.89) and decreasing destructive conflict behaviors (d = -0.30). For the Parent-Only Group, post-test constructive conflict behaviors directly predicted lower levels of adolescent externalizing behaviors at 6-month follow-up. Post-test constructive conflict skills also indirectly affected adolescent internalizing behaviors through adolescent emotional security. These findings support the use of a brief psychoeducational intervention in improving post-treatment conflict and emotional security about interparental relationships.

  18. Reduced nonlinear prognostic model construction from high-dimensional data

    NASA Astrophysics Data System (ADS)

    Gavrilov, Andrey; Mukhin, Dmitry; Loskutov, Evgeny; Feigin, Alexander

    2017-04-01

    Construction of a data-driven model of evolution operator using universal approximating functions can only be statistically justified when the dimension of its phase space is small enough, especially in the case of short time series. At the same time in many applications real-measured data is high-dimensional, e.g. it is space-distributed and multivariate in climate science. Therefore it is necessary to use efficient dimensionality reduction methods which are also able to capture key dynamical properties of the system from observed data. To address this problem we present a Bayesian approach to an evolution operator construction which incorporates two key reduction steps. First, the data is decomposed into a set of certain empirical modes, such as standard empirical orthogonal functions or recently suggested nonlinear dynamical modes (NDMs) [1], and the reduced space of corresponding principal components (PCs) is obtained. Then, the model of evolution operator for PCs is constructed which maps a number of states in the past to the current state. The second step is to reduce this time-extended space in the past using appropriate decomposition methods. Such a reduction allows us to capture only the most significant spatio-temporal couplings. The functional form of the evolution operator includes separately linear, nonlinear (based on artificial neural networks) and stochastic terms. Explicit separation of the linear term from the nonlinear one allows us to more easily interpret degree of nonlinearity as well as to deal better with smooth PCs which can naturally occur in the decompositions like NDM, as they provide a time scale separation. Results of application of the proposed method to climate data are demonstrated and discussed. The study is supported by Government of Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS). 1. Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). 
Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510
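The two-stage reduction described above (spatial empirical modes first, then a time-extended operator on their principal components) can be sketched with purely linear pieces. The toy below uses SVD-based empirical orthogonal functions and a least-squares linear operator on lagged PCs; the NDM decomposition, the neural-network nonlinear term, and the stochastic term of the paper are all omitted, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "space-distributed" data: T time samples of a D-dimensional field.
T, D, k, lags = 500, 40, 3, 2
X = rng.standard_normal((T, D)) @ rng.standard_normal((D, D)) * 0.1
X += np.outer(np.sin(np.arange(T) * 0.1), rng.standard_normal(D))

# Step 1: reduce space with empirical orthogonal functions (SVD of anomalies).
Xa = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xa, full_matrices=False)
pcs = Xa @ Vt[:k].T                      # leading principal components, (T, k)

# Step 2: linear evolution operator mapping `lags` past states to the next one.
past = np.hstack([pcs[i:T - lags + i] for i in range(lags)])   # (T-lags, k*lags)
future = pcs[lags:]                                            # (T-lags, k)
A, *_ = np.linalg.lstsq(past, future, rcond=None)

pred = past @ A
print("one-step RMSE:", np.sqrt(np.mean((pred - future) ** 2)))
```

A second SVD applied to `past` would implement the paper's reduction of the time-extended space; here it is left out for brevity.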

  19. Psychosocial safety climate, emotional demands, burnout, and depression: a longitudinal multilevel study in the Malaysian private sector.

    PubMed

    Idris, Mohd Awang; Dollard, Maureen F; Yulita

    2014-07-01

This multilevel longitudinal study investigates a newly identified climate construct, psychosocial safety climate (PSC), as a precursor to job characteristics (e.g., emotional demands) and psychological outcomes (i.e., emotional exhaustion and depression). We argued that PSC, as an organizational climate construct, has cross-level effects on individually perceived job design and psychological outcomes. We hypothesized a mediation process between PSC and emotional exhaustion, particularly through emotional demands. In sequence, we predicted that emotional exhaustion would predict depression. At Time 1, data were collected from employees in 36 Malaysian private sector organizations (80% response rate), n = 253 (56%), and at Time 2 from 27 organizations (60%), n = 117 (46%). Using hierarchical linear modeling (HLM), we found that there were cross-level effects of PSC Time 1 on emotional demands Time 2 and emotional exhaustion Time 2, but not on depression Time 2, across a 3-month time lag. We found evidence for a lagged mediated effect; emotional demands mediated the relationship between PSC and emotional exhaustion. Emotional exhaustion did not predict depression. Finally, our results suggest that PSC is an important organizational climate construct, and acts to reduce employee psychological problems in the workplace via working conditions.
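The lagged mediation idea (PSC lowers emotional demands, which in turn drive exhaustion) can be caricatured with single-level OLS regressions; the actual study used hierarchical linear modeling across organizations, which this toy does not reproduce, and all data and effect sizes below are invented.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 400

# Synthetic stand-ins: higher PSC -> lower emotional demands -> lower exhaustion.
psc = rng.standard_normal(n)                       # PSC at Time 1
demands = -0.5 * psc + rng.standard_normal(n)      # emotional demands, Time 2
exhaust = 0.6 * demands + rng.standard_normal(n)   # emotional exhaustion, Time 2

def ols_slope(x, y):
    """Slope of the simple linear regression of y on x."""
    return np.polyfit(x, y, 1)[0]

a = ols_slope(psc, demands)       # path a: PSC -> demands
b = ols_slope(demands, exhaust)   # path b: demands -> exhaustion
print(f"a = {a:.2f}, b = {b:.2f}, indirect effect a*b = {a * b:.2f}")
```

A negative product `a*b` is the simplest numerical signature of the mediated "PSC protects via working conditions" pathway.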

  20. Modeling multivariate time series on manifolds with skew radial basis functions.

    PubMed

    Jamshidi, Arta A; Kirby, Michael J

    2011-01-01

We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied to training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters, in particular the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodologies using several examples, including modeling data on manifolds and the prediction of chaotic time series.
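A stripped-down version of the iterative, residual-driven construction can be sketched as follows: plain Gaussian RBFs stand in for the paper's skew RBFs, each new center is placed where the residual peaks, and a fixed scale replaces the autocorrelation-based scale selection and hypothesis test. Data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

def gaussian_rbf(x, c, w):
    """Gaussian radial basis function centered at c with width w."""
    return np.exp(-((x - c) ** 2) / (2.0 * w ** 2))

# Target: a noisy 1-D series sampled on a grid.
x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + 0.05 * rng.standard_normal(x.size)

centers, widths, coeffs = [], [], []
resid = y.copy()
for _ in range(8):                         # add one basis function per pass
    c = x[np.argmax(np.abs(resid))]        # place it where the residual peaks
    w = 1.0                                # fixed scale for this sketch
    phi = gaussian_rbf(x, c, w)
    a = phi @ resid / (phi @ phi)          # least-squares coefficient
    centers.append(c); widths.append(w); coeffs.append(a)
    resid = resid - a * phi

model = sum(a * gaussian_rbf(x, c, w)
            for a, c, w in zip(coeffs, centers, widths))
print("final RMSE:", np.sqrt(np.mean((y - model) ** 2)))
```

Each greedy step can only shrink the residual norm, which is why the loop refines the model monotonically.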

  1. Shape prior modeling using sparse representation and online dictionary learning.

    PubMed

    Zhang, Shaoting; Zhan, Yiqiang; Zhou, Yan; Uzunbas, Mustafa; Metaxas, Dimitris N

    2012-01-01

The recently proposed sparse shape composition (SSC) opens a new avenue for shape prior modeling. Instead of assuming any parametric model of shape statistics, SSC incorporates shape priors on-the-fly by approximating a shape instance (usually derived from appearance cues) by a sparse combination of shapes in a training repository. Theoretically, one can increase the modeling capability of SSC by including as many training shapes as possible in the repository. However, this strategy confronts two limitations in practice. First, since SSC involves an iterative sparse optimization at run-time, the more shape instances contained in the repository, the less run-time efficiency SSC has. Therefore, a compact and informative shape dictionary is preferred to a large shape repository. Second, in medical imaging applications, training shapes seldom come in one batch. It is very time consuming and sometimes infeasible to reconstruct the shape dictionary every time new training shapes appear. In this paper, we propose an online learning method to address these two limitations. Our method starts from constructing an initial shape dictionary using the K-SVD algorithm. When new training shapes come, instead of reconstructing the dictionary from the ground up, we update the existing one using a block-coordinate descent approach. Using the dynamically updated dictionary, sparse shape composition can be gracefully scaled up to model shape priors from a large number of training shapes without sacrificing run-time efficiency. Our method is validated on lung localization in X-ray and cardiac segmentation in MRI time series. Compared to the original SSC, it shows comparable performance while being significantly more efficient.
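The run-time sparse approximation step at the heart of SSC can be illustrated with a small orthogonal matching pursuit over a random "shape" dictionary. The K-SVD initialization and the online block-coordinate dictionary update are omitted, and the dictionary, landmark dimensions, and sparsity level below are all made up.

```python
import numpy as np

rng = np.random.default_rng(2)

def omp(D, y, n_nonzero):
    """Orthogonal matching pursuit: y ≈ D @ x with at most n_nonzero entries."""
    resid, support = y.copy(), []
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(D.T @ resid)))   # best-correlated atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        resid = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

# Dictionary of 30 training "shapes" (each a flattened landmark vector).
D = rng.standard_normal((50, 30))
D /= np.linalg.norm(D, axis=0)                    # unit-norm atoms
true = np.zeros(30); true[[3, 17]] = [1.5, -0.8]
y = D @ true + 0.01 * rng.standard_normal(50)

x = omp(D, y, n_nonzero=2)
print("recovered support:", np.flatnonzero(x))
```

In the online setting, the outer dictionary `D` would be updated incrementally as new shapes arrive rather than refit from scratch.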

  2. Monitoring and Modeling Performance of Communications in Computational Grids

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael A.; Le, Thuy T.

    2003-01-01

Computational grids may include many machines located in a number of sites. For efficient use of the grid we need the ability to estimate the time it takes to communicate data between the machines. For dynamic distributed grids it is unrealistic to know the exact parameters of the communication hardware and the current communication traffic, so we should rely on a model of network performance to estimate the message delivery time. Our approach to constructing such a model is based on observing message delivery times across various message sizes and time scales. We record these observations in a database and use them to build a model of the message delivery time. Our experiments show the presence of multiple bands in the logarithm of the message delivery times. These bands represent the multiple paths messages travel between the grid machines and are incorporated in our multiband model.
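The core of such a delivery-time model (an intercept for latency, a slope for inverse bandwidth) can be fitted to logged observations by least squares; the multiband structure the abstract describes would require one such fit per band (e.g., after clustering the log-times). All numbers below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated observation database: (message size in bytes, delivery time in s).
sizes = rng.integers(1_000, 1_000_000, size=300).astype(float)
latency, bandwidth = 2e-3, 50e6          # 2 ms, 50 MB/s ("unknown" to the model)
times = latency + sizes / bandwidth
times *= np.exp(0.05 * rng.standard_normal(sizes.size))   # measurement noise

# Fit t = a + s * b by least squares: a ≈ latency, b ≈ 1/bandwidth.
A = np.column_stack([np.ones_like(sizes), sizes])
(a, b), *_ = np.linalg.lstsq(A, times, rcond=None)
print(f"estimated latency {a * 1e3:.2f} ms, bandwidth {1 / b / 1e6:.1f} MB/s")
```

Predicting a new transfer is then `a + size * b`, with one `(a, b)` pair per observed band.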

  3. Model construction by students within an integrated medical curriculum.

    PubMed

    Barling, Peter M; Ramasamy, Perumal

    2011-03-01

    This paper presents our experience of running a special study module (SSM) in the second semester of the first year of our 5-year medical programme, worth 10 per cent of that semester's assessment, in which each student constructs an individually selected model illustrating a specific aspect of the teaching course. Each student conceptualises and develops his or her model, to clarify a specific aspect of medical teaching. The use of non-traditional materials in construction is strongly encouraged. Six weeks later, each student presents their model for assessment by four first-year academic teaching staff. The student is quizzed about the concepts that he or she presents, the mode of construction and the materials used. The students' projects broadly cover the disciplines of physiology, biochemistry and anatomy, but are somewhat biased towards anatomy. Students spend on average about 14 hours planning and building their models, at a time when they are busy with other teaching activities. The marks awarded for the projects closely follow a normal distribution. A survey suggests that most students enjoy the exercise and feel that it has enhanced their learning and understanding. It is clear from the wide variety of different topics, models and materials that students are highly resourceful in their modelling. Creative activity does not generally play a substantial part in medical education, but is of considerable importance. The development of their models stimulates, informs and educates the constructors, and provides a teaching resource for later use in didactic teaching. © Blackwell Publishing Ltd 2011.

  4. Constructing an Efficient Self-Tuning Aircraft Engine Model for Control and Health Management Applications

    NASA Technical Reports Server (NTRS)

    Armstrong, Jeffrey B.; Simon, Donald L.

    2012-01-01

Self-tuning aircraft engine models can be applied for control and health management applications. The self-tuning feature of these models minimizes the mismatch between any given engine and the underlying engineering model describing an engine family. This paper provides details of the construction of a self-tuning engine model centered on a piecewise linear Kalman filter design. Starting from a nonlinear transient aerothermal model, a piecewise linear representation is first extracted. The linearization procedure creates a database of trim vectors and state-space matrices that are subsequently scheduled for interpolation based on engine operating point. A series of steady-state Kalman gains can next be constructed from a reduced-order form of the piecewise linear model. Reduction of the piecewise linear model to an observable dimension with respect to available sensed engine measurements can be achieved using either a subset or an optimal linear combination of "health" parameters, which describe engine performance. The resulting piecewise linear Kalman filter is then implemented for faster-than-real-time processing of sensed engine measurements, generating outputs appropriate for trending engine performance, estimating both measured and unmeasured parameters for control purposes, and performing on-board gas-path fault diagnostics. Computational efficiency is achieved by designing multidimensional interpolation algorithms that exploit the shared scheduling of multiple trim vectors and system matrices. An example application illustrates the accuracy of a self-tuning piecewise linear Kalman filter model when applied to a nonlinear turbofan engine simulation. Additional discussions focus on the issue of transient response accuracy and the advantages of a piecewise linear Kalman filter in the context of validation and verification. The techniques described provide a framework for constructing efficient self-tuning aircraft engine models from complex nonlinear simulations.
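The gain-scheduling idea in this record can be sketched as: steady-state Kalman gains precomputed at a few operating points and interpolated at run time. This is a scalar toy with invented numbers; a real design schedules matrix gains over a database of trim vectors and state-space matrices.

```python
import numpy as np

# Precomputed steady-state gains at a few engine operating points
# (made-up scalar gains, scheduled on a normalized operating parameter).
op_points = np.array([0.2, 0.5, 0.8, 1.0])
gains = np.array([0.15, 0.25, 0.40, 0.55])     # one gain per trim point

def scheduled_gain(op):
    """Interpolate the stored gain schedule at the current operating point."""
    return np.interp(op, op_points, gains)

# Filter a noisy measurement sequence with the operating-point-scheduled gain.
rng = np.random.default_rng(4)
truth = np.sin(np.linspace(0, 3, 100))
meas = truth + 0.1 * rng.standard_normal(100)
op = np.linspace(0.2, 1.0, 100)                # engine sweeping through points

est = np.zeros_like(meas)
x = meas[0]
for t in range(100):
    K = scheduled_gain(op[t])
    x = x + K * (meas[t] - x)                  # measurement update with gain K
    est[t] = x
print("filtered estimates computed:", est.shape)
```

Because the gains are precomputed offline, the online loop is just an interpolation and an update, which is what makes faster-than-real-time processing feasible.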

  5. Non-parametric directionality analysis - Extension for removal of a single common predictor and application to time series.

    PubMed

    Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob

    2016-08-01

The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low-order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimating directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time-domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions, and to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel non-parametric approach to estimating directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data. Copyright © 2016 Elsevier B.V. All rights reserved.
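The idea of conditioning on a single common predictor can be illustrated with ordinary time-domain partial correlation: regress the predictor out of both signals and correlate the residuals. The frequency-domain partial coherence and its directional decomposition are beyond this sketch, and the data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Common predictor z drives both x and y; x also drives y directly.
n = 5000
z = rng.standard_normal(n)
x = 0.8 * z + rng.standard_normal(n)
y = 0.8 * z + 0.5 * x + rng.standard_normal(n)

def partial_corr(a, b, c):
    """Correlation of a and b after removing the single common predictor c."""
    ra = a - np.polyval(np.polyfit(c, a, 1), c)   # residual of a regressed on c
    rb = b - np.polyval(np.polyfit(c, b, 1), c)
    return np.corrcoef(ra, rb)[0, 1]

print("plain corr(x, y):    ", np.corrcoef(x, y)[0, 1])
print("partial corr(x, y|z):", partial_corr(x, y, z))
```

The partial correlation is noticeably smaller than the plain correlation because the shared drive from `z` has been removed, leaving only the direct x-to-y coupling.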

  6. Atomic quantum simulation of dynamical gauge fields coupled to fermionic matter: from string breaking to evolution after a quench.

    PubMed

    Banerjee, D; Dalmonte, M; Müller, M; Rico, E; Stebler, P; Wiese, U-J; Zoller, P

    2012-10-26

    Using a Fermi-Bose mixture of ultracold atoms in an optical lattice, we construct a quantum simulator for a U(1) gauge theory coupled to fermionic matter. The construction is based on quantum links which realize continuous gauge symmetry with discrete quantum variables. At low energies, quantum link models with staggered fermions emerge from a Hubbard-type model which can be quantum simulated. This allows us to investigate string breaking as well as the real-time evolution after a quench in gauge theories, which are inaccessible to classical simulation methods.

  7. Semiclassical description of resonance-assisted tunneling in one-dimensional integrable models

    NASA Astrophysics Data System (ADS)

    Le Deunff, Jérémy; Mouchet, Amaury; Schlagheck, Peter

    2013-10-01

    Resonance-assisted tunneling is investigated within the framework of one-dimensional integrable systems. We present a systematic recipe, based on Hamiltonian normal forms, to construct one-dimensional integrable models that exhibit resonance island chain structures with accurately controlled sizes and positions of the islands. Using complex classical trajectories that evolve along suitably defined paths in the complex time domain, we construct a semiclassical theory of the resonance-assisted tunneling process. This semiclassical approach yields a compact analytical expression for tunnelling-induced level splittings which is found to be in very good agreement with the exact splittings obtained through numerical diagonalization.

  8. I-Wire Heart-on-a-Chip II: Biomechanical analysis of contractile, three-dimensional cardiomyocyte tissue constructs.

    PubMed

    Schroer, Alison K; Shotwell, Matthew S; Sidorov, Veniamin Y; Wikswo, John P; Merryman, W David

    2017-01-15

    This companion study presents the biomechanical analysis of the "I-Wire" platform using a modified Hill model of muscle mechanics that allows for further characterization of construct function and response to perturbation. The I-Wire engineered cardiac tissue construct (ECTC) is a novel experimental platform to investigate cardiac cell mechanics during auxotonic contraction. Whereas passive biomaterials often exhibit nonlinear and dissipative behavior, active tissue equivalents, such as ECTCs, also expend metabolic energy to perform mechanical work that presents additional challenges in quantifying their properties. The I-Wire model uses the passive mechanical response to increasing applied tension to measure the inherent stress and resistance to stretch of the construct before, during, and after treatments. Both blebbistatin and isoproterenol reduced prestress and construct stiffness; however, blebbistatin treatment abolished subsequent force-generating potential while isoproterenol enhanced this property. We demonstrate that the described model can replicate the response of these constructs to intrinsic changes in force-generating potential in response to both increasing frequency of stimulation and decreasing starting length. This analysis provides a useful mathematical model of the I-Wire platform, increases the number of parameters that can be derived from the device, and serves as a demonstration of quantitative characterization of nonlinear, active biomaterials. We anticipate that this quantitative analysis of I-Wire constructs will prove useful for qualifying patient-specific cardiomyocytes and fibroblasts prior to their utilization for cardiac regenerative medicine. Passive biomaterials may have non-linear elasticity and losses, but engineered muscle tissue also exhibits time- and force-dependent contractions. Historically, mathematical muscle models include series-elastic, parallel-elastic, contractile, and viscous elements. 
While hearts-on-a-chip can demonstrate in vitro the contractile properties of engineered cardiac constructs and their response to drugs, most of these use cellular monolayers that cannot be readily probed with controlled forces. The I-Wire platform described in the preceding paper by Sidorov et al. addresses these limitations with three-dimensional tissue constructs to which controlled forces can be applied. In this companion paper, we show how to characterize I-Wire constructs using a non-linear, active Hill model, which should be useful for qualifying cells prior to their use in cardiac regenerative medicine. Copyright © 2016 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  9. SWARM : a scientific workflow for supporting Bayesian approaches to improve metabolic models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, X.; Stevens, R.; Mathematics and Computer Science

    2008-01-01

With the exponential growth of complete genome sequences, the analysis of these sequences is becoming a powerful approach to building genome-scale metabolic models. These models can be used to study individual molecular components and their relationships, and eventually to study cells as systems. However, constructing genome-scale metabolic models manually is time-consuming and labor-intensive, which is why far fewer genome-scale metabolic models are available than the hundreds of available genome sequences. To tackle this problem, we designed SWARM, a scientific workflow that can be used to improve genome-scale metabolic models in a high-throughput fashion. SWARM deals with a range of issues including the integration of data across distributed resources, data format conversions, data updates, and data provenance. Put together, SWARM streamlines the whole modeling process: extracting data from various resources, deriving training datasets to train a set of predictors, applying Bayesian techniques to assemble the predictors, inferring on the ensemble of predictors to insert missing data, and eventually improving draft metabolic networks automatically. By enhancing metabolic model construction, SWARM enables scientists to generate many genome-scale metabolic models within a short period of time and with less effort.
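The "assemble the predictors" step can be caricatured as accuracy-weighted log-odds pooling: each predictor emits a probability that a gene carries a given function, and the ensemble combines them before deciding whether to insert the annotation into the draft model. The predictors, accuracies, and probabilities below are all hypothetical, and this pooling rule is a crude stand-in for SWARM's actual Bayesian assembly.

```python
import numpy as np

# Three predictors vote on whether each of three genes encodes a function.
val_accuracy = np.array([0.9, 0.7, 0.6])   # validation accuracy per predictor
probs = np.array([                          # rows: predictors, cols: genes
    [0.95, 0.10, 0.80],
    [0.70, 0.40, 0.60],
    [0.40, 0.30, 0.90],
])

# Log-odds pooling, weighted by normalized validation accuracy.
logit = np.log(probs / (1 - probs))
w = val_accuracy / val_accuracy.sum()
combined = 1 / (1 + np.exp(-(w @ logit)))
calls = combined > 0.5                      # insert annotation if True
print("ensemble probabilities:", np.round(combined, 3))
print("annotate:", calls)
```

Weighting by held-out accuracy lets a strong predictor outvote two weak ones, which is the intuition behind ensemble-based gap filling.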

  10. Feedback in a clinical setting: A way forward to enhance student's learning through constructive feedback.

    PubMed

    Sultan, Amber Shamim; Mateen Khan, Muhammad Arif

    2017-07-01

Feedback is a dynamic process in which information about observed performance is used to promote desirable behaviour and correct negative behaviour. The importance of feedback is widely acknowledged, but there still seems to be inconsistency in the amount, type, and timing of feedback received from the clinical faculty. No significant effort has been made by educators to empower learners with the skills of receiving and using feedback effectively. Some institutions conduct faculty development workshops and courses to help clinicians deliver constructive feedback to learners. Despite all these efforts, learners are not fully satisfied with the quality of feedback received from their busy clinicians. The aim of this paper is to highlight what feedback actually is, the types and structure of feedback, the essential components of constructive feedback, the benefits of providing feedback, the barriers affecting the provision of timely feedback, and different models used for providing feedback. The ultimate purpose of this paper is to provide sufficient information to clinical directors that there is a need to establish a robust system for giving feedback to learners, and to equip clinical educators with the skills required to provide constructive feedback to their learners. For the literature review, we used keywords such as: feedback, constructive feedback, barriers to feedback, principles of constructive feedback, models of feedback, reflection, self-assessment, and clinical practice. The databases searched include: Cardiff University library catalogue, PubMed, Google Scholar, Web of Knowledge, and ScienceDirect.

  11. Research of Manufacture Time Management System Based on PLM

    NASA Astrophysics Data System (ADS)

    Jing, Ni; Juan, Zhu; Liangwei, Zhong

This system targets the machine shops of manufacturing enterprises. Based on an analysis of their business needs, it builds a plant management information system for manufacture-time information covering the manufacturing process. Combining WEB technology with an EXCEL VBA development method, it constructs a hybrid PLM-based framework for a workshop manufacture-time management information system, and discusses the functionality of the system architecture and the database structure.

  12. Precision Timed Infrastructure: Design Challenges

    DTIC Science & Technology

    2013-09-19

Fig. 1 (caption fragment): Conceptual overview of translation steps between timing constructs, clock synchronization and communication, PRET machines, and other platforms. [Only figure-caption and partial reference fragments of this record were recovered.]

  13. High-Accuracy Tidal Flat Digital Elevation Model Construction Using TanDEM-X Science Phase Data

    NASA Technical Reports Server (NTRS)

    Lee, Seung-Kuk; Ryu, Joo-Hyung

    2017-01-01

This study explored the feasibility of using TanDEM-X (TDX) interferometric observations of tidal flats for digital elevation model (DEM) construction. Our goal was to generate high-precision DEMs in tidal flat areas, because accurate intertidal zone data are essential for monitoring coastal environments and erosion processes. To monitor dynamic coastal changes caused by waves, currents, and tides, very accurate DEMs with high spatial resolution are required. The bi- and monostatic modes of the TDX interferometer employed during the TDX science phase provided a great opportunity for highly accurate intertidal DEM construction using radar interferometry with no time lag (bistatic mode) or an approximately 10-s temporal baseline (monostatic mode) between the master and slave synthetic aperture radar image acquisitions. In this study, DEM construction in tidal flat areas was first optimized based on the TDX system parameters used in various TDX modes. We successfully generated intertidal zone DEMs with 57-m spatial resolutions and interferometric height accuracies better than 0.15 m for three representative tidal flats on the west coast of the Korean Peninsula. Finally, we validated these TDX DEMs against real-time kinematic-GPS measurements acquired in two tidal flat areas; the correlation coefficient was 0.97 with a root mean square error of 0.20 m.
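The validation step (correlation coefficient, RMSE, and bias of DEM heights against RTK-GPS check points) is simple to sketch. The heights below are synthetic stand-ins, not the paper's measurements.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical check points: RTK-GPS heights vs DEM heights at the same spots.
gps_h = rng.uniform(-1.0, 1.0, 200)            # tidal-flat heights in metres
dem_h = gps_h + rng.normal(0.0, 0.15, 200)     # DEM with ~15 cm height noise

r = np.corrcoef(gps_h, dem_h)[0, 1]
rmse = np.sqrt(np.mean((dem_h - gps_h) ** 2))
bias = np.mean(dem_h - gps_h)
print(f"r = {r:.2f}, RMSE = {rmse:.2f} m, bias = {bias:+.3f} m")
```

Reporting bias alongside RMSE distinguishes a systematic height offset (e.g., a tide-stage mismatch) from random interferometric noise.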

  14. Probabilistic seismic hazard in the San Francisco Bay area based on a simplified viscoelastic cycle model of fault interactions

    USGS Publications Warehouse

    Pollitz, F.F.; Schwartz, D.P.

    2008-01-01

We construct a viscoelastic cycle model of plate boundary deformation that includes the effect of time-dependent interseismic strain accumulation, coseismic strain release, and viscoelastic relaxation of the substrate beneath the seismogenic crust. For a given fault system, time-averaged stress changes at any point (not on a fault) are constrained to zero; that is, kinematic consistency is enforced for the fault system. The dates of last rupture, mean recurrence times, and the slip distributions of the (assumed) repeating ruptures are key inputs into the viscoelastic cycle model. This simple formulation allows construction of stress evolution at all points in the plate boundary zone for purposes of probabilistic seismic hazard analysis (PSHA). Stress evolution is combined with a Coulomb failure stress threshold at representative points on the fault segments to estimate the times of their respective future ruptures. In our PSHA we consider uncertainties in a four-dimensional parameter space: the rupture periodicities, slip distributions, time of last earthquake (for prehistoric ruptures), and Coulomb failure stress thresholds. We apply this methodology to the San Francisco Bay region using a recently determined fault chronology of area faults. Assuming single-segment rupture scenarios, we find that future rupture probabilities of area faults in the coming decades are the highest for the southern Hayward, Rodgers Creek, and northern Calaveras faults. This conclusion is qualitatively similar to that of the Working Group on California Earthquake Probabilities, but the probabilities derived here are significantly higher. Given that fault rupture probabilities are highly model-dependent, no single model should be used to assess time-dependent rupture probabilities. We suggest that several models, including the present one, be used in a comprehensive PSHA methodology, as was done by the Working Group on California Earthquake Probabilities.
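A much-simplified stand-in for a time-dependent rupture probability calculation: a lognormal renewal model conditioned on the elapsed time since the last event, evaluated by Monte Carlo. The paper's viscoelastic Coulomb-stress machinery is not reproduced, and the recurrence parameters below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(8)

# Lognormal recurrence intervals (mean ~210 yr, coefficient of variation 0.4);
# 140 yr have elapsed since the last rupture, forecast window is 30 yr.
mean_recur, cov = 210.0, 0.4
elapsed, horizon = 140.0, 30.0

sigma = np.sqrt(np.log(1 + cov ** 2))
mu = np.log(mean_recur) - 0.5 * sigma ** 2
samples = rng.lognormal(mu, sigma, 200_000)

# Conditional probability of rupture within `horizon`, given survival to now.
survived = samples > elapsed
p = np.mean(samples[survived] <= elapsed + horizon)
print(f"P(rupture in next {horizon:.0f} yr) = {p:.2f}")
```

Sampling the recurrence parameters themselves (periodicity, date of last event, stress threshold) inside the Monte Carlo loop would mimic the four-dimensional uncertainty treatment described in the abstract.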

  15. Semiclassical matrix model for quantum chaotic transport with time-reversal symmetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novaes, Marcel, E-mail: marcel.novaes@gmail.com

    2015-10-15

    We show that the semiclassical approach to chaotic quantum transport in the presence of time-reversal symmetry can be described by a matrix model. In other words, we construct a matrix integral whose perturbative expansion satisfies the semiclassical diagrammatic rules for the calculation of transport statistics. One of the virtues of this approach is that it leads very naturally to the semiclassical derivation of universal predictions from random matrix theory.

  16. In situ patterned micro 3D liver constructs for parallel toxicology testing in a fluidic device

    PubMed Central

    Skardal, Aleksander; Devarasetty, Mahesh; Soker, Shay; Hall, Adam R

    2017-01-01

    3D tissue models are increasingly being implemented for drug and toxicology testing. However, the creation of tissue-engineered constructs for this purpose often relies on complex biofabrication techniques that are time consuming, expensive, and difficult to scale up. Here, we describe a strategy for realizing multiple tissue constructs in a parallel microfluidic platform using an approach that is simple and can be easily scaled for high-throughput formats. Liver cells mixed with a UV-crosslinkable hydrogel solution are introduced into parallel channels of a sealed microfluidic device and photopatterned to produce stable tissue constructs in situ. The remaining uncrosslinked material is washed away, leaving the structures in place. By using a hydrogel that specifically mimics the properties of the natural extracellular matrix, we closely emulate native tissue, resulting in constructs that remain stable and functional in the device during a 7-day culture time course under recirculating media flow. As proof of principle for toxicology analysis, we expose the constructs to ethyl alcohol (0–500 mM) and show that the cell viability and the secretion of urea and albumin decrease with increasing alcohol exposure, while markers for cell damage increase. PMID:26355538

  17. In situ patterned micro 3D liver constructs for parallel toxicology testing in a fluidic device.

    PubMed

    Skardal, Aleksander; Devarasetty, Mahesh; Soker, Shay; Hall, Adam R

    2015-09-10

    3D tissue models are increasingly being implemented for drug and toxicology testing. However, the creation of tissue-engineered constructs for this purpose often relies on complex biofabrication techniques that are time consuming, expensive, and difficult to scale up. Here, we describe a strategy for realizing multiple tissue constructs in a parallel microfluidic platform using an approach that is simple and can be easily scaled for high-throughput formats. Liver cells mixed with a UV-crosslinkable hydrogel solution are introduced into parallel channels of a sealed microfluidic device and photopatterned to produce stable tissue constructs in situ. The remaining uncrosslinked material is washed away, leaving the structures in place. By using a hydrogel that specifically mimics the properties of the natural extracellular matrix, we closely emulate native tissue, resulting in constructs that remain stable and functional in the device during a 7-day culture time course under recirculating media flow. As proof of principle for toxicology analysis, we expose the constructs to ethyl alcohol (0-500 mM) and show that the cell viability and the secretion of urea and albumin decrease with increasing alcohol exposure, while markers for cell damage increase.

  18. Comparison of photogrammetric point clouds with BIM building elements for construction progress monitoring

    NASA Astrophysics Data System (ADS)

    Tuttas, S.; Braun, A.; Borrmann, A.; Stilla, U.

    2014-08-01

For construction progress monitoring, a planned state of the construction at a certain time (as-planned) has to be compared to the actual state (as-built). The as-planned state is derived from a building information model (BIM), which contains the geometry of the building and the construction schedule. In this paper we introduce an approach for the generation of an as-built point cloud by photogrammetry. Since images on a construction site cannot be taken from every position that would be necessary, we use a combination of a structure-from-motion process and control points to create a scaled point cloud in a consistent coordinate system. Subsequently this point cloud is used for an as-built vs. as-planned comparison. For this, voxels of an octree are marked as occupied, free, or unknown by raycasting based on the triangulated points and the camera positions. This makes it possible to identify building parts that do not yet exist. To verify the existence of building parts, a second test based on the points in front of and behind the as-planned model planes is performed. The proposed procedure is tested on an inner-city construction site under real conditions.
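The occupied/free/unknown voxel labelling can be sketched on a uniform grid with naive sampling along each camera-to-point ray (the paper uses an octree and proper raycasting); the camera position, points, and grid below are invented.

```python
import numpy as np

# Minimal as-built occupancy sketch: voxels crossed by a camera-to-point ray
# become free, the voxel holding the triangulated point becomes occupied,
# everything else stays unknown. Missing planned parts would show up as
# free/unknown voxels where the BIM expects occupied ones.
UNKNOWN, FREE, OCCUPIED = 0, 1, 2
grid = np.zeros((20, 20, 20), dtype=np.uint8)
voxel = 0.5                                     # voxel edge length in metres

camera = np.array([0.25, 0.25, 0.25])
points = np.array([[8.25, 4.25, 2.25], [6.25, 6.25, 6.25]])

for p in points:
    for t in np.linspace(0.0, 1.0, 200):        # dense sampling along the ray
        idx = tuple(((camera + t * (p - camera)) // voxel).astype(int))
        if grid[idx] == UNKNOWN:
            grid[idx] = FREE
    grid[tuple((p // voxel).astype(int))] = OCCUPIED

print("free voxels:", int(np.sum(grid == FREE)),
      "occupied voxels:", int(np.sum(grid == OCCUPIED)))
```

An octree replaces the dense grid in practice so that large empty regions cost a single node instead of thousands of voxels.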

  19. Quantifying and modeling soil erosion and sediment export from construction sites in southern California

    NASA Astrophysics Data System (ADS)

    Wernet, A. K.; Beighley, R. E.

    2006-12-01

    Soil erosion is a powerful process that continuously alters the Earth's landscape. Human activities, such as construction and agricultural practices, and natural events, such as forest fires and landslides, disturb the landscape and intensify erosion processes, leading to sudden increases in runoff sediment concentrations and degraded stream water quality. Understanding soil erosion and sediment transport processes is of great importance to researchers and practicing engineers, who routinely use models to predict soil erosion and sediment movement for varied land use and climate change scenarios. However, existing erosion models are limited in their applicability to construction sites, which have highly variable soil conditions (density, moisture, surface roughness, and best management practices) that change often in both space and time. The goal of this research is to improve the understanding, predictive capabilities and integration of treatment methodologies for controlling soil erosion and sediment export from construction sites. This research combines modeling with field monitoring and laboratory experiments to quantify: (a) spatial and temporal distribution of soil conditions on construction sites, (b) soil erosion due to event rainfall, and (c) potential offsite discharge of sediment with and without treatment practices. Field sites in southern California were selected to monitor the effects of common construction activities (e.g., cut/fill, grading, foundations, roads) on soil conditions and sediment discharge. Laboratory experiments were performed in the Soil Erosion Research Laboratory (SERL), part of the Civil and Environmental Engineering department at San Diego State University, to quantify the impact of individual factors leading to sediment export. SERL experiments utilize a 3-m by 10-m tilting soil bed with soil depths up to 1 m, slopes ranging from 0 to 50 percent, and rainfall rates up to 150 mm/hr (6 in/hr).
Preliminary modeling, field and laboratory results are presented.

  20. Construction of large signaling pathways using an adaptive perturbation approach with phosphoproteomic data.

    PubMed

    Melas, Ioannis N; Mitsos, Alexander; Messinis, Dimitris E; Weiss, Thomas S; Saez-Rodriguez, Julio; Alexopoulos, Leonidas G

    2012-04-01

    Construction of large and cell-specific signaling pathways is essential to understand information processing under normal and pathological conditions. On this front, gene-based approaches offer the advantage of large pathway exploration whereas phosphoproteomic approaches offer a more reliable view of pathway activities but are applicable to small pathway sizes. In this paper, we demonstrate an experimentally adaptive approach to construct large signaling pathways from phosphoproteomic data within a 3-day time frame. Our approach--taking advantage of the fast turnaround time of the xMAP technology--is carried out in four steps: (i) screen optimal pathway inducers, (ii) select the responsive ones, (iii) combine them in a combinatorial fashion to construct a phosphoproteomic dataset, and (iv) optimize a reduced generic pathway via an Integer Linear Programming formulation. As a case study, we uncover novel players and their corresponding pathways in primary human hepatocytes by interrogating the signal transduction downstream of 81 receptors of interest and constructing a detailed model for the responsive part of the network comprising 177 species (of which 14 are measured) and 365 interactions.
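The final optimization step selects a minimal subset of generic-pathway interactions that reproduces the measured responses. The authors solve this as an Integer Linear Program; the toy sketch below shows the same selection principle by exhaustive search over a Boolean reachability model (the network, the activation rule, and all names are our illustrative assumptions, not the paper's formulation):

```python
from itertools import combinations

def reachable(edges, source):
    """Nodes reachable from the stimulus through the selected directed edges."""
    seen = {source}
    frontier = [source]
    while frontier:
        u = frontier.pop()
        for a, b in edges:
            if a == u and b not in seen:
                seen.add(b)
                frontier.append(b)
    return seen

def fit_pathway(candidate_edges, stimulus, measured):
    """Smallest edge subset whose reachable set matches the measured
    active/inactive node states (an exhaustive stand-in for the ILP)."""
    for k in range(len(candidate_edges) + 1):
        for subset in combinations(candidate_edges, k):
            active = reachable(subset, stimulus)
            if all((n in active) == state for n, state in measured.items()):
                return list(subset)
    return None  # no subset explains the data
```

An ILP solver does the same minimization over binary edge-inclusion variables, which is what makes networks with hundreds of interactions tractable.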

  1. Evapotranspiration versus oxygen intrusion: which is the main force in alleviating bioclogging of vertical-flow constructed wetlands during a resting operation?

    PubMed

    Hua, Guofen; Chen, Qiuwen; Kong, Jun; Li, Man

    2017-08-01

    Clogging is the most significant challenge limiting the application of constructed wetlands. Application of a forced resting period is a practical way to relieve clogging, particularly bioclogging. To reveal the alleviation mechanisms behind such a resting operation, evapotranspiration and oxygen flux were studied during a resting period in a laboratory vertical-flow constructed wetland model through physical simulation and numerical model analysis. In addition, the optimum theoretical resting duration was determined based on the time required for oxygen to completely fill the pores, i.e., formation of a sufficiently thick and completely dry layer. The results indicated that (1) evapotranspiration was not the key factor, but was a driving force in the alleviation of bioclogging; (2) the rate of oxygen diffusion into the pores was sufficient to oxidize and disperse the flocculant biofilm, which was essential to alleviate bioclogging. This study provides important insights into understanding how clogging/bioclogging can be alleviated in vertical-flow constructed wetlands. Graphical abstract Evapotranspiration versus oxygen intrusion in alleviating bioclogging in vertical flow constructed wetlands.
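The optimum resting duration above is tied to the time oxygen needs to diffuse through and dry out the clogged layer. A back-of-the-envelope version of that idea uses the one-dimensional diffusion time scale t ~ L^2 / (2 D_eff); the function and the parameter values below are illustrative assumptions, not the authors' numerical model:

```python
def resting_time_days(layer_depth_m, d_eff_m2_per_s):
    """Characteristic time for oxygen to diffuse through a substrate
    layer of depth L with effective diffusivity D_eff:
    t ~ L^2 / (2 * D_eff). Returns days. Illustrative only."""
    t_seconds = layer_depth_m ** 2 / (2.0 * d_eff_m2_per_s)
    return t_seconds / 86400.0
```

For example, a 0.1 m layer with D_eff = 1e-6 m^2/s gives a characteristic time of 5000 s, i.e. well under a day; a much smaller diffusivity in a wet, clogged bed stretches this to the multi-day resting periods used in practice.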

  2. Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjorn; Steinsland, Ingelin

    2014-05-01

    This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criteria for the evaluation of multivariate forecasts. A post-processing approach is used, and a Gaussian model is applied to transformed variables. The post-processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions of forecasted inflow for each catchment and lead time, whereas the dependency model is used to estimate the full multivariate distribution of forecasts, i.e. covariances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles which inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated on the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding-window climatology forecasts. It also accommodates variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in original space using cross-validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding-window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study, such as heteroscedasticity and dependency structures that vary with lead time both between lead times and between catchments, are captured.
Two criteria were used for evaluating the added value of the dependency model. The first was the energy score (ES), a multidimensional generalization of the continuous ranked probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was to use CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas CRPS for accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
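Both evaluation criteria have compact sample-based definitions. For an m-member ensemble x_1, ..., x_m and observation y, the energy score is ES = (1/m) sum_i ||x_i - y|| - (1/(2 m^2)) sum_{i,j} ||x_i - x_j||, and CRPS is its one-dimensional special case. A minimal sketch (sample estimator only, not the study's full evaluation pipeline):

```python
import math

def _norm(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def energy_score(ensemble, obs):
    """Sample energy score of a multivariate ensemble forecast:
    ES = mean ||x_i - y|| - 0.5 * mean ||x_i - x_j||."""
    m = len(ensemble)
    term1 = sum(_norm(x, obs) for x in ensemble) / m
    term2 = sum(_norm(xi, xj) for xi in ensemble for xj in ensemble) / (2 * m * m)
    return term1 - term2

def crps(ensemble_1d, obs):
    """CRPS is the univariate special case of the energy score."""
    return energy_score([(x,) for x in ensemble_1d], (obs,))
```

Evaluating CRPS on inflows summed over catchments and lead times, as the study does, amounts to applying `crps` to a derived univariate quantity whose distribution depends on the modelled covariances.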

  3. Topology and incompleteness for 2+1-dimensional cosmological spacetimes

    NASA Astrophysics Data System (ADS)

    Fajman, David

    2017-06-01

    We study the long-time behavior of the Einstein flow coupled to matter on 2-dimensional surfaces. We consider massless matter models such as collisionless matter composed of massless particles, massless scalar fields and radiation fluids and show that the maximal globally hyperbolic development of homogeneous and isotropic initial data on the 2-sphere is geodesically incomplete in both time directions, i.e. the spacetime recollapses. This behavior also holds for open sets of initial data. In particular, we construct classes of recollapsing 2+1-dimensional spacetimes with spherical spatial topology which provide evidence for a closed universe recollapse conjecture for massless matter models in 2+1 dimensions. Furthermore, we construct solutions with toroidal and higher genus topology for the massless matter fields, which in both cases are future complete. The spacetimes with toroidal topology are 2+1-dimensional analogues of the Einstein-de Sitter model. In addition, we point out a general relation between the energy-momentum tensor and the Kretschmann scalar in 2+1 dimensions and use it to infer strong cosmic censorship for all these models. In view of this relation, we also recall corresponding models containing massive particles, constructed in a previous work and determine the nature of their initial singularities. We conclude that the global structure of non-vacuum cosmological spacetimes in 2+1 dimensions is determined by the mass of particles and—in the homogeneous and isotropic setting studied here—verifies strong cosmic censorship.
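The relation between the energy-momentum tensor and the Kretschmann scalar alluded to above rests on a standard fact: in 2+1 dimensions the Weyl tensor vanishes identically, so the full Riemann tensor is determined algebraically by the Ricci tensor. A hedged sketch (the normalization of the coupling constant is conventional):

```latex
% In 2+1 dimensions the Weyl tensor vanishes, so the Riemann tensor is
% an algebraic expression in the Ricci tensor and the metric:
R_{abcd} = g_{ac}R_{bd} + g_{bd}R_{ac} - g_{ad}R_{bc} - g_{bc}R_{ad}
         - \tfrac{R}{2}\left(g_{ac}g_{bd} - g_{ad}g_{bc}\right).
% Einstein's equations then make the Ricci tensor algebraic in the matter
% (trace-reversing in three dimensions):
R_{ab} - \tfrac{R}{2}\, g_{ab} = 8\pi T_{ab}
\quad\Longrightarrow\quad
R_{ab} = 8\pi\left(T_{ab} - g_{ab}\, T\right), \qquad T = g^{cd}T_{cd}.
% Hence the Kretschmann scalar is an algebraic function of the matter variables:
K = R_{abcd}R^{abcd} = K\left(T_{ab}, g_{ab}\right),
```

so curvature blow-up is controlled entirely by the behavior of the energy-momentum tensor, which is what allows strong cosmic censorship to be inferred from the matter model.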

  4. Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment

    PubMed Central

    Lee, Woojin; Kim, Juil; Kang, JangMook

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric—the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large amount of node software at a time in a ubiquitous sensor network environment. PMID:22163678
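The attribute-driven generation step can be pictured as template substitution: predefined attribute values set at design time are rendered into node source code. The template, attribute names, and generated calls below are purely illustrative assumptions; they are not the Nano-Qplus API:

```python
from string import Template

# Hypothetical node-software template; $-placeholders are the
# predefined attributes a developer sets at design time.
NODE_TEMPLATE = Template(
    "void setup() {\n"
    "  sensor_init($sensor);\n"
    "  radio_init($channel);\n"
    "}\n"
    "void loop() { sample_and_send($period_ms); }\n"
)

def generate_node_source(attributes):
    """Render node-software source from predefined attribute values."""
    return NODE_TEMPLATE.substitute(attributes)
```

Generating many node variants then reduces to calling the generator once per attribute set, which is what makes producing a large amount of node software at a time feasible.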

  5. Automated construction of node software using attributes in a ubiquitous sensor network environment.

    PubMed

    Lee, Woojin; Kim, Juil; Kang, JangMook

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric-the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large amount of node software at a time in a ubiquitous sensor network environment.

  6. Spectral Collocation Time-Domain Modeling of Diffractive Optical Elements

    NASA Astrophysics Data System (ADS)

    Hesthaven, J. S.; Dinesen, P. G.; Lynov, J. P.

    1999-11-01

    A spectral collocation multi-domain scheme is developed for the accurate and efficient time-domain solution of Maxwell's equations within multi-layered diffractive optical elements. Special attention is being paid to the modeling of out-of-plane waveguide couplers. Emphasis is given to the proper construction of high-order schemes with the ability to handle very general problems of considerable geometric and material complexity. Central questions regarding efficient absorbing boundary conditions and time-stepping issues are also addressed. The efficacy of the overall scheme for the time-domain modeling of electrically large, and computationally challenging, problems is illustrated by solving a number of plane as well as non-plane waveguide problems.

  7. Potential of Progressive Construction Systems in Slovakia

    NASA Astrophysics Data System (ADS)

    Kozlovska, Maria; Spisakova, Marcela; Mackova, Daniela

    2017-10-01

    The construction industry is a sector of rapid development. Progressive construction technologies and new construction materials, together called modern methods of construction (MMC), are being developed constantly. MMC represent the adoption of construction industrialisation and the use of prefabricated components in building construction. One of these modern methods is the Varianthaus system, which is based on the insulated-concrete-forms principle and provides a complete production plant for wall, ceiling and roof elements for highly thermally insulated house construction. Another progressive construction system is EcoB, an insulated precast concrete panel combining two layers, insulation and concrete, produced in a factory as a whole. Neither of these modern methods of construction is yet well known or widespread in the Slovak construction market. The aim of this paper is to demonstrate the potential of using MMC in Slovakia. This potential is assessed by comparing selected parameters of the construction process: construction costs and construction time. The subject of the study is a family house modelled in three material variants: masonry construction (as a representative of traditional methods of construction), Varianthaus and EcoB (as representatives of modern methods of construction). The results of this study provide useful information for the decision-making process of potential construction investors.
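The decision logic of such a comparison can be sketched as a weighted ranking over the two monitored parameters. The cost and time figures below are hypothetical placeholders, not the study's results:

```python
def rank_variants(variants, cost_weight=0.5):
    """Rank construction variants by a normalized weighted score of
    cost and time (lower score ranks first)."""
    max_cost = max(v["cost"] for v in variants.values())
    max_time = max(v["time"] for v in variants.values())

    def score(v):
        # Normalize each parameter to [0, 1] and blend with the weight.
        return (cost_weight * v["cost"] / max_cost
                + (1 - cost_weight) * v["time"] / max_time)

    return sorted(variants, key=lambda name: score(variants[name]))

# Hypothetical figures for illustration only (relative cost units, months):
example = {
    "masonry":     {"cost": 100, "time": 12},
    "Varianthaus": {"cost": 110, "time": 6},
    "EcoB":        {"cost": 105, "time": 7},
}
```

With equal weights, a variant that is slightly more expensive but much faster to build ranks ahead of the traditional one, which mirrors the trade-off the study investigates.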

  8. Applications of the Theory of Distributed and Real Time Systems to the Development of Large-Scale Timing Based Systems.

    DTIC Science & Technology

    1996-04-01

    time systems. The focus is on the study of 'building-blocks' for the construction of reliable and efficient systems. Our work falls into three...Members of MIT's Theory of Distributed Systems group have continued their work on modelling, designing, verifying and analyzing distributed and real

  9. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
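The simulation module's core loop (execute scheduled events until the queue empties) is the standard discrete-event pattern; a minimal sketch, with all class and function names ours rather than the tool's:

```python
import heapq

class Simulator:
    """Minimal discrete-event engine: events are (time, seq, action)
    tuples popped in time order until the event queue is empty."""

    def __init__(self):
        self.queue = []
        self.now = 0.0
        self._seq = 0       # tie-breaker so actions are never compared
        self.log = []

    def schedule(self, delay, action):
        """Schedule action(sim) to fire `delay` time units from now."""
        heapq.heappush(self.queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        """Advance simulated time event by event until the queue empties."""
        while self.queue:
            self.now, _, action = heapq.heappop(self.queue)
            action(self)

def tick(sim):
    """Example periodic process: log the time, reschedule until t = 3."""
    sim.log.append(sim.now)
    if sim.now < 3:
        sim.schedule(1.0, tick)
```

Continuous behavior is approximated, as in the disclosed tool, by discretizing it into such invocation/effect events separated by time delays.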

  10. 40 CFR 60.1590 - When must I complete each increment of progress?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Emission Guidelines and Compliance Times for Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule...

  11. Research on Chinese Intelligent Guanglianda in High-rise Construction

    NASA Astrophysics Data System (ADS)

    Romanovich, Marina; Shi, Peiyu; Wang, Ziyue; Zhao, Liang; Gerasimova, Vera

    2018-03-01

    Since the reform and opening up, China's construction industry has developed rapidly; after 2011 in particular, BIM technology in China has seen rapid development and in-depth application. This article mainly discusses the development of construction industry management in China, the development of BIM technology in China, and the use of software in China: Revit is mainly used for 3D modeling (design stage), while Guanglianda is used throughout construction process management. The article examines Guanglianda and describes its applications, for example in integrated project management information, labor management information, construction video surveillance, material management, and material site acceptance. It also elaborates on the cooperation of Revit and Guanglianda at different stages. Finally, the paper presents a construction case in China, describes the problems of cranes, facts and time, and then explores the reasons behind the rapid pace of construction in China and the feasibility of introducing China's construction management approach to Russia.

  12. Earthworks logistics in the high density urban development conditions - case study

    NASA Astrophysics Data System (ADS)

    Sobotka, A.; Blajer, M.

    2017-10-01

    Realisation of construction projects in highly urbanised areas entails many difficulties and logistic problems. Earthworks conducted in such conditions are a good example of how important it is to properly plan the works and use the technical means of the logistics infrastructure. The construction processes on the observed construction site, in combination with their external logistics service, form a complex system that is difficult to model mathematically and for which appropriate planning data are hard to obtain. The paper describes and analyses earthworks during construction of the Centre of Power Engineering of AGH in Krakow for two stages of a construction project: the planning stage in the preparatory phase (before realization) and the implementation phase of construction works (foundation). In the first case, an example of the use of queuing theory for predicting excavation time under random working conditions of the excavator and the associated trucks is provided. In the second case, there is a change of foundation-works technology resulting from changes in earthworks logistics. Observation of the construction confirmed that, with the use of appropriate construction management methods, in this case agile management, the time and cost of the project were not exceeded. The success of a project depends on the ability of the contractor to react quickly when changes occur in the design, technology, environment, etc.
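The queuing-theory prediction mentioned for the first stage can be illustrated with the simplest single-server model: trucks arriving at rate lambda are loaded by one excavator at rate mu. The M/M/1 formulas below are standard; treating the excavator-truck system as M/M/1 is our simplifying assumption, not necessarily the model used in the paper:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics for an excavator (server) loading
    trucks (customers): utilisation rho, mean number of trucks at the
    excavator L = rho/(1-rho), mean time per truck W = 1/(mu-lambda)."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate >= service rate")
    rho = arrival_rate / service_rate
    return {
        "utilisation": rho,
        "mean_trucks": rho / (1.0 - rho),
        "mean_time": 1.0 / (service_rate - arrival_rate),
    }
```

For example, 4 trucks/hour against a loading capacity of 5 trucks/hour gives 80% excavator utilisation but an average of 4 trucks tied up at the pit, showing why random arrivals inflate planned excavation times.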

  13. Dynamic compaction of granular materials

    PubMed Central

    Favrie, N.; Gavrilyuk, S.

    2013-01-01

    An Eulerian hyperbolic multiphase flow model for dynamic and irreversible compaction of granular materials is constructed. The reversible model is first constructed on the basis of the classical Hertz theory. The irreversible model is then derived in accordance with the following two basic principles. First, the entropy inequality is satisfied by the model. Second, the corresponding ‘intergranular stress’ coming from elastic energy owing to contact between grains decreases in time (the granular media behave as Maxwell-type materials). The irreversible model admits an equilibrium state corresponding to a von Mises-type yield limit. The yield limit depends on the volume fraction of the solid. The sound velocity at the yield surface is smaller than that in the reversible model, which in turn is smaller than the sound velocity in the irreversible model. Such an embedded model structure assures a thermodynamically correct formulation of the model of granular materials. The model is validated on quasi-static experiments on loading–unloading cycles. The experimentally observed hysteresis phenomena were numerically confirmed with good accuracy by the proposed model. PMID:24353466
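The Maxwell-type decay of the intergranular stress can be sketched with the simplest relaxation law, ds/dt = -s/tau, in the absence of further loading; the explicit-Euler integration and the parameter values below are illustrative, not the paper's hyperbolic system:

```python
def relax_intergranular_stress(s0, tau, dt, steps):
    """Integrate ds/dt = -s/tau (Maxwell-type stress relaxation) with
    explicit Euler; returns the stress history including s0."""
    s = s0
    history = [s]
    for _ in range(steps):
        s += dt * (-s / tau)  # contact stress decays toward zero
        history.append(s)
    return history
```

Each Euler step multiplies the stress by (1 - dt/tau), so the discrete history decays geometrically, mirroring the exponential relaxation that distinguishes the irreversible model from the reversible Hertz-based one.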

  14. Computational model for the analysis of cartilage and cartilage tissue constructs

    PubMed Central

    Smith, David W.; Gardiner, Bruce S.; Davidson, John B.; Grodzinsky, Alan J.

    2013-01-01

    We propose a new non-linear poroelastic model that is suited to the analysis of soft tissues. In this paper the model is tailored to the analysis of cartilage and the engineering design of cartilage constructs. The proposed continuum formulation of the governing equations enables the strain of the individual material components within the extracellular matrix (ECM) to be followed over time, as the individual material components are synthesized, assembled and incorporated within the ECM or lost through passive transport or degradation. The material component analysis developed here naturally captures the effect of time-dependent changes of ECM composition on the deformation and internal stress states of the ECM. For example, it is shown that increased synthesis of aggrecan by chondrocytes embedded within a decellularized cartilage matrix initially devoid of aggrecan results in osmotic expansion of the newly synthesized proteoglycan matrix and tension within the structural collagen network. Specifically, we predict that the collagen network experiences a tensile strain, with a maximum of ~2% at the fixed base of the cartilage. The analysis of an example problem demonstrates the temporal and spatial evolution of the stresses and strains in each component of a self-equilibrating composite tissue construct, and the role played by the flux of water through the tissue. PMID:23784936

  15. Emergence of Landauer transport from quantum dynamics: A model Hamiltonian approach

    NASA Astrophysics Data System (ADS)

    Pal, Partha Pratim; Ramakrishna, S.; Seideman, Tamar

    2018-04-01

    The Landauer expression for computing current-voltage characteristics in nanoscale devices is efficient but not suited to transient phenomena and a time-dependent current because it is applicable only when the charge carriers transition into a steady flux after an external perturbation. In this article, we construct a very general expression for time-dependent current in an electrode-molecule-electrode arrangement. Utilizing a model Hamiltonian (consisting of the subsystem energy levels and their electronic coupling terms), we propagate the Schrödinger wave function equation to numerically compute the time-dependent population in the individual subsystems. The current in each electrode (defined in terms of the rate of change of the corresponding population) has two components, one due to the charges originating from the same electrode and the other due to the charges initially residing at the other electrode. We derive an analytical expression for the first component and illustrate that it agrees reasonably with its numerical counterpart at early times. Exploiting the unitary evolution of a wavefunction, we construct a more general Landauer style formula and illustrate the emergence of Landauer transport from our simulations without the assumption of time-independent charge flow. Our generalized Landauer formula is valid at all times for models beyond the wide-band limit, non-uniform electrode density of states and for time and energy-dependent electronic coupling between the subsystems. Subsequently, we investigate the ingredients in our model that regulate the onset time scale of this steady state. We compare the performance of our general current expression with the Landauer current for time-dependent electronic coupling. Finally, we comment on the applicability of the Landauer formula to compute hot-electron current arising upon plasmon decoherence.
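For contrast with the time-dependent treatment, the steady-state Landauer expression I = (2e/h) * integral of T(E) [f_L(E) - f_R(E)] dE can be evaluated directly. The sketch below assumes a single-level Lorentzian transmission function, which is our illustrative choice rather than the authors' model Hamiltonian:

```python
import math

E_CHARGE = 1.602176634e-19  # elementary charge, C
H_PLANCK = 6.62607015e-34   # Planck constant, J s

def fermi(E, mu, kT):
    """Fermi-Dirac occupation; E, mu, kT in eV."""
    return 1.0 / (1.0 + math.exp((E - mu) / kT))

def landauer_current(mu_L, mu_R, kT, gamma=0.05, e0=0.0, n=4001):
    """Steady-state Landauer current (amperes) through a single level
    at e0 with Lorentzian broadening gamma, trapezoid-integrated over
    a window of +/- 2 eV around the level. Energies in eV."""
    lo, hi = e0 - 2.0, e0 + 2.0
    dE = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        E = lo + i * dE
        T = gamma ** 2 / ((E - e0) ** 2 + gamma ** 2)  # transmission
        w = 0.5 if i in (0, n - 1) else 1.0            # trapezoid weights
        total += w * T * (fermi(E, mu_L, kT) - fermi(E, mu_R, kT)) * dE
    # total is in eV; multiply by e to convert the integral to joules.
    return 2 * E_CHARGE / H_PLANCK * total * E_CHARGE
```

The current vanishes at zero bias and grows with the bias window, which is the steady flux the time-dependent simulations in the paper are shown to converge to.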

  16. Emergence of Landauer transport from quantum dynamics: A model Hamiltonian approach.

    PubMed

    Pal, Partha Pratim; Ramakrishna, S; Seideman, Tamar

    2018-04-14

    The Landauer expression for computing current-voltage characteristics in nanoscale devices is efficient but not suited to transient phenomena and a time-dependent current because it is applicable only when the charge carriers transition into a steady flux after an external perturbation. In this article, we construct a very general expression for time-dependent current in an electrode-molecule-electrode arrangement. Utilizing a model Hamiltonian (consisting of the subsystem energy levels and their electronic coupling terms), we propagate the Schrödinger wave function equation to numerically compute the time-dependent population in the individual subsystems. The current in each electrode (defined in terms of the rate of change of the corresponding population) has two components, one due to the charges originating from the same electrode and the other due to the charges initially residing at the other electrode. We derive an analytical expression for the first component and illustrate that it agrees reasonably with its numerical counterpart at early times. Exploiting the unitary evolution of a wavefunction, we construct a more general Landauer style formula and illustrate the emergence of Landauer transport from our simulations without the assumption of time-independent charge flow. Our generalized Landauer formula is valid at all times for models beyond the wide-band limit, non-uniform electrode density of states and for time and energy-dependent electronic coupling between the subsystems. Subsequently, we investigate the ingredients in our model that regulate the onset time scale of this steady state. We compare the performance of our general current expression with the Landauer current for time-dependent electronic coupling. Finally, we comment on the applicability of the Landauer formula to compute hot-electron current arising upon plasmon decoherence.

  17. Time Is Brain: The Stroke Theory of Relativity.

    PubMed

    Gomez, Camilo R

    2018-04-25

    Since the introduction of the philosophical tenet "Time is Brain!," multiple lines of research have demonstrated that other factors contribute to the degree of ischemic injury at any one point in time, and it is now clear that the therapeutic window of acute ischemic stroke is more protracted than first suspected. To define a more realistic relationship between time and the ischemic process, we used computational modeling to assess how these 2 variables are affected by collateral circulatory competence. Starting from the premise that the expression "Time=Brain" is mathematically false, we reviewed the existing literature on the attributes of cerebral ischemia over time, with particular attention to relevant clinical parameters and the effect of different variables, particularly collateral circulation, on the time-ischemia relationship. We used this information to construct a theoretical computational model and applied it to categorically different yet abnormal cerebral perfusion scenarios, allowing comparison of their behavior both overall (i.e., final infarct volume) and in real time (i.e., instantaneous infarct growth rate). Optimal collateral circulatory competence was predictably associated with slower infarct growth rates and prolongation of the therapeutic window. Modeling of identifiable specific types of perfusion maps allows forecasting of the fate of the ischemic process over time. Distinct cerebral perfusion map patterns can be readily identified in patients with acute ischemic stroke. These patterns have inherently different behaviors relative to the time-ischemia construct, allowing the possibility of improving parsing and treatment allocation. It is clearly evident that the effect of time on the ischemic process is relative.
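The qualitative behavior described (better collaterals, slower infarct growth, longer window) can be captured by a deliberately simple saturating-growth model, dV/dt = r0 (1 - c)(Vmax - V). The functional form and every parameter value below are hypothetical illustrations, not the authors' model:

```python
import math

def infarct_volume(hours, collateral_score, v_max=100.0, r0=0.5):
    """Hypothetical infarct growth under dV/dt = r0 (1 - c)(Vmax - V),
    where c in [0, 1] grades collateral competence. Closed form:
    V(t) = Vmax * (1 - exp(-r0 * (1 - c) * t)). Illustrative only."""
    k = r0 * (1.0 - collateral_score)
    return v_max * (1.0 - math.exp(-k * hours))
```

In this toy model, raising the collateral score slows the instantaneous growth rate at every time point, stretching the interval during which the infarct stays below any treatment threshold, which is the "relativity" of time the abstract argues for.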

  18. Finite difference time domain modeling of spiral antennas

    NASA Technical Reports Server (NTRS)

    Penney, Christopher W.; Beggs, John H.; Luebbers, Raymond J.

    1992-01-01

    The objectives outlined in the original proposal for this project were to create a well-documented computer analysis model based on the finite-difference, time-domain (FDTD) method that would be capable of computing antenna impedance, far-zone radiation patterns, and radar cross-section (RCS). The ability to model a variety of penetrable materials in addition to conductors is also desired. The spiral antennas under study by this project meet these requirements since they are constructed of slots cut into conducting surfaces which are backed by dielectric materials.
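In one dimension and normalized units, the FDTD method the project is built on reduces to the classic leapfrog update of interleaved E and H fields; a minimal free-space sketch with a hard source (grid size, source frequency, and step count are arbitrary illustrative choices):

```python
import math

def fdtd_1d(steps, n=200):
    """1-D FDTD (Yee) update for Ez/Hy in free space, normalized so
    that the 'magic time step' c*dt = dx holds; a hard sinusoidal
    source drives the grid centre."""
    ez = [0.0] * n
    hy = [0.0] * n
    for t in range(steps):
        for k in range(n - 1):            # H update (half step)
            hy[k] += ez[k + 1] - ez[k]
        for k in range(1, n):             # E update (half step)
            ez[k] += hy[k] - hy[k - 1]
        ez[n // 2] = math.sin(0.1 * t)    # hard source at the centre
    return ez
```

Real antenna models add material constants, absorbing boundaries, and a third dimension, but the leapfrog structure of the update is exactly this.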

  19. MIMICKING COUNTERFACTUAL OUTCOMES TO ESTIMATE CAUSAL EFFECTS.

    PubMed

    Lok, Judith J

    2017-04-01

    In observational studies, treatment may be adapted to covariates at several times without a fixed protocol, in continuous time. Treatment influences covariates, which influence treatment, which influences covariates, and so on. Then even time-dependent Cox-models cannot be used to estimate the net treatment effect. Structural nested models have been applied in this setting. Structural nested models are based on counterfactuals: the outcome a person would have had had treatment been withheld after a certain time. Previous work on continuous-time structural nested models assumes that counterfactuals depend deterministically on observed data, while conjecturing that this assumption can be relaxed. This article proves that one can mimic counterfactuals by constructing random variables, solutions to a differential equation, that have the same distribution as the counterfactuals, even given past observed data. These "mimicking" variables can be used to estimate the parameters of structural nested models without assuming the treatment effect to be deterministic.

  20. Dam Construction in Lancang-Mekong River Basin Could Mitigate Future Flood Risk From Warming-Induced Intensified Rainfall: Dam Mitigate Flood Risk in Mekong

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Wei; Lu, Hui; Ruby Leung, L.

    Water resources management, in particular flood control, in the Mekong River Basin (MRB) faces two key challenges in the 21st century: climate change and dam construction. A large-scale distributed Geomorphology-Based Hydrological Model coupled with a simple reservoir regulation model (GBHM-MK-SOP) is used to investigate the relative effects of climate change and dam construction on the flood characteristics in the MRB. Results suggest an increase in both flood magnitude and frequency under climate change, which is more severe in the upstream basin and increases over time. However, dam construction and stream regulation reduce flood risk consistently throughout this century, with more obvious effects in the upstream basin where larger reservoirs will be located. The flood mitigation effect of dam regulation dominates over the flood intensification effect of climate change before 2060, but the latter emerges more prominently after 2060 and dominates the flood risk, especially in the lower basin.

  1. Development and evaluation of a semi-empirical two-zone dust exposure model for a dusty construction trade.

    PubMed

    Jones, Rachael M; Simmons, Catherine; Boelter, Fred

    2011-06-01

    Drywall finishing is a dusty construction activity. We describe a mathematical model that predicts the time-weighted average concentration of respirable and total dusts in the personal breathing zone of the sander and in the area surrounding joint compound sanding activities. The model represents spatial variation in dust concentrations using two zones, and temporal variation using an exponential function. Interzone flux and the relationships between respirable and total dusts are described using empirical factors. For model evaluation, we measured dust concentrations in two field studies, including three workers from a commercial contracting crew and one unskilled worker. Data from the field studies confirm that the model assumptions and parameterization are reasonable and thus validate the modeling approach. Predicted time-weighted average dust concentrations (C_TWA) were in concordance with measured values for the contracting crew, but underestimated measured values for the unskilled worker. Further characterization of skill-related exposure factors is indicated.
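The two-zone structure the abstract describes can be sketched generically. The following is a minimal near-field/far-field simulation with an exponentially decaying dust source; all parameter values (source strength G0, decay constant k, interzone airflow beta, ventilation Q, zone volumes) are illustrative assumptions, not the paper's fitted values.

```python
# Generic near-field/far-field two-zone sketch (illustrative parameters, not
# the authors' parameterization): the source G(t) decays exponentially while
# both zone concentrations are integrated with forward Euler.
import math

def two_zone_twa(G0=5.0, k=0.01, beta=5.0, Q=20.0,
                 V_nf=1.0, V_ff=100.0, T=60.0, dt=0.01):
    """Return TWA concentrations (mg/m^3) in the near and far fields.

    G0   -- initial dust generation rate, mg/min (hypothetical)
    k    -- exponential source decay constant, 1/min (hypothetical)
    beta -- interzone airflow, m^3/min
    Q    -- room ventilation, m^3/min
    """
    c_nf = c_ff = 0.0          # instantaneous zone concentrations
    sum_nf = sum_ff = 0.0      # running sums for the time-weighted average
    n = int(T / dt)
    for i in range(n):
        g = G0 * math.exp(-k * i * dt)
        dc_nf = (g + beta * (c_ff - c_nf)) / V_nf
        dc_ff = (beta * (c_nf - c_ff) - Q * c_ff) / V_ff
        c_nf += dc_nf * dt
        c_ff += dc_ff * dt
        sum_nf += c_nf
        sum_ff += c_ff
    return sum_nf / n, sum_ff / n

twa_nf, twa_ff = two_zone_twa()
```

The near-field TWA (the sander's breathing zone) always exceeds the far-field TWA (the surrounding area), which is the qualitative behavior the two-zone model is built to capture.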

  2. Postponed bifurcations of a ring-laser model with a swept parameter and additive colored noise

    NASA Astrophysics Data System (ADS)

    Mannella, R.; Moss, Frank; McClintock, P. V. E.

    1987-03-01

    The paper presents measurements of the time evolution of the statistical densities of both amplitude and field intensity obtained from a colored-noise-driven electronic circuit model of a ring laser, as the bifurcation parameter is swept through its critical values. The time-dependent second moments (intensities) were obtained from the densities. In addition, the individual stochastic trajectories were available from which the distribution of bifurcation times was constructed. For short-correlation-time (quasiwhite) noise the present results are in quantitative agreement with the recent calculations of Bogi, Colombo, Lugiato, and Mandel (1986). New results for long noise correlation times are obtained.

  3. Constructing a Teleseismic Tomographic Image of Taiwan using BATS Recordings

    NASA Astrophysics Data System (ADS)

    Krajewski, J.; Roecker, S.

    2005-12-01

    Taiwan is an evolving arc-continent collision located at a complicated part of the plate boundary between the Eurasian and Philippine Sea plates. To better understand the role of the upper mantle in the dynamics of this collision, we reviewed 4 years of data from the Broadband Array in Taiwan for Seismology (BATS) to construct a teleseismic dataset for tomographic imaging of the subsurface of the island. From an initial selection of approximately 300 events, we used waveform correlation to generate a dataset of 4500 relative arrival times. To calculate accurate travel times in three-dimensional wavespeed models over the large lateral distances in our model (~800 km), we solve the eikonal equation directly in a spherical coordinate system. To reduce the influence of smearing of crustal heterogeneity into the deeper mantle, we fix the upper 30 km to a previously determined P wavespeed model for the region. Initial resolution tests suggest a spatial limit on the order of 40 km.

  4. Analysis of longitudinal marginal structural models.

    PubMed

    Bryan, Jenny; Yu, Zhuo; Van Der Laan, Mark J

    2004-07-01

    In this article we construct and study estimators of the causal effect of a time-dependent treatment on survival in longitudinal studies. We employ a particular marginal structural model (MSM), proposed by Robins (2000), and follow a general methodology for constructing estimating functions in censored data models. The inverse probability of treatment weighted (IPTW) estimator of Robins et al. (2000) is used as an initial estimator and forms the basis for an improved, one-step estimator that is consistent and asymptotically linear when the treatment mechanism is consistently estimated. We extend these methods to handle informative censoring. The proposed methodology is employed to estimate the causal effect of exercise on mortality in a longitudinal study of seniors in Sonoma County. A simulation study demonstrates the bias of naive estimators in the presence of time-dependent confounders and also shows the efficiency gain of the IPTW estimator, even in the absence of such confounding. The efficiency gain of the improved, one-step estimator is demonstrated through simulation.
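The IPTW idea behind the initial estimator can be shown in miniature. This sketch assumes known propensity scores and uses hypothetical data; it omits the one-step improvement and the censoring machinery the article develops.

```python
# Minimal IPTW sketch: each record is (treated, outcome, propensity), where the
# propensity P(treated | covariates) is assumed known here for illustration.
# Weighting each subject by the inverse probability of the treatment actually
# received recovers marginal mean outcomes under treatment / no treatment.
def iptw_means(records):
    """records: iterable of (treated, outcome, propensity) tuples."""
    w1_sum = w1_out = w0_sum = w0_out = 0.0
    for treated, y, p in records:
        if treated:
            w = 1.0 / p              # inverse probability of being treated
            w1_sum += w
            w1_out += w * y
        else:
            w = 1.0 / (1.0 - p)      # inverse probability of being untreated
            w0_sum += w
            w0_out += w * y
    # Hajek-style normalized weighted means
    return w1_out / w1_sum, w0_out / w0_sum

data = [(1, 10.0, 0.8), (1, 12.0, 0.4), (0, 5.0, 0.8), (0, 7.0, 0.4)]
mean_treated, mean_control = iptw_means(data)
```

With these four hypothetical records the weighted control mean works out to 5.5, and the treated-control contrast is the IPTW point estimate of the treatment effect.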

  5. A hybrid feature selection and health indicator construction scheme for delay-time-based degradation modelling of rolling element bearings

    NASA Astrophysics Data System (ADS)

    Zhang, Bin; Deng, Congying; Zhang, Yi

    2018-03-01

    Rolling element bearings are mechanical components used frequently in most rotating machinery and they are also vulnerable links representing the main source of failures in such systems. Thus, health condition monitoring and fault diagnosis of rolling element bearings have long been studied to improve operational reliability and maintenance efficiency of rotatory machines. Over the past decade, prognosis that enables forewarning of failure and estimation of residual life attracted increasing attention. To accurately and efficiently predict failure of the rolling element bearing, the degradation requires to be well represented and modelled. For this purpose, degradation of the rolling element bearing is analysed with the delay-time-based model in this paper. Also, a hybrid feature selection and health indicator construction scheme is proposed for extraction of the bearing health relevant information from condition monitoring sensor data. Effectiveness of the presented approach is validated through case studies on rolling element bearing run-to-failure experiments.

  6. Parametric estimation for reinforced concrete relief shelter for Aceh cases

    NASA Astrophysics Data System (ADS)

    Atthaillah; Saputra, Eri; Iqbal, Muhammad

    2018-05-01

    This paper is a work in progress (WIP) toward a rapid parametric framework for estimating the materials of post-disaster permanent shelters. The intended shelters are of reinforced concrete construction with brick walls. In post-disaster cases, design variations are inevitably needed to suit the victims' conditions, and satisfying each beneficiary with a satisfactory design is nearly impossible using the conventional method. This study offers a parametric framework to overcome the slow construction-materials estimation against design variations. The work integrates the parametric tool Grasshopper to establish algorithms that simultaneously model, visualize, calculate, and write the calculated data to a spreadsheet in real time. Some customized Grasshopper components were created using GHPython scripting for a more optimized algorithm. The result of this study is a partial framework that successfully performs modeling, visualization, calculation, and writing of the calculated data simultaneously, meaning that design alterations do not escalate the time needed for modeling, visualization, and material estimation. In future development, the parametric framework will be made open source.
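The parameters-to-spreadsheet loop can be illustrated outside Grasshopper. This stand-in sketch uses plain Python and the csv module; every dimension, formula, and waste factor below is invented for illustration and is not taken from the paper.

```python
# Stand-in sketch of the parametric estimation idea: shelter dimensions are
# parameters, and any change recomputes material quantities and rewrites a
# spreadsheet (CSV) in one pass. All formulas/factors here are illustrative.
import csv

def estimate(length, width, height, csv_path="takeoff.csv"):
    wall_area = 2 * (length + width) * height           # m^2, openings ignored
    bricks = wall_area / (0.2 * 0.065) * 1.1            # per-m^2 count + 10% waste
    concrete = (length * width * 0.1                    # 100 mm floor slab
                + 2 * (length + width) * 0.2 * 0.2)     # 200x200 mm ring beam
    rows = [("wall_area_m2", round(wall_area, 2)),
            ("bricks_pcs", int(bricks)),
            ("concrete_m3", round(concrete, 3))]
    with open(csv_path, "w", newline="") as f:
        csv.writer(f).writerows([("item", "quantity"), *rows])
    return dict(rows)

q = estimate(6.0, 4.0, 3.0)   # one design variation; re-run for each change
```

Changing any parameter and re-running `estimate` regenerates both the quantities and the spreadsheet, which is the "design variation without extra estimation time" behavior the framework targets.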

  7. Starobinsky-like inflation and neutrino masses in a no-scale SO(10) model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellis, John; Theoretical Physics Department, CERN,CH-1211 Geneva 23; Garcia, Marcos A.G.

    2016-11-08

    Using a no-scale supergravity framework, we construct an SO(10) model that makes predictions for cosmic microwave background observables similar to those of the Starobinsky model of inflation, and incorporates a double-seesaw model for neutrino masses consistent with oscillation experiments and late-time cosmology. We pay particular attention to the behaviour of the scalar fields during inflation and the subsequent reheating.

  8. Predicting sugar-sweetened behaviours with theory of planned behaviour constructs: Outcome and process results from the SIPsmartER behavioural intervention.

    PubMed

    Zoellner, Jamie M; Porter, Kathleen J; Chen, Yvonnes; Hedrick, Valisa E; You, Wen; Hickman, Maja; Estabrooks, Paul A

    2017-05-01

    Guided by the theory of planned behaviour (TPB) and health literacy concepts, SIPsmartER is a six-month multicomponent intervention effective at improving sugar-sweetened beverage (SSB) behaviours. Using SIPsmartER data, this study explores prediction of SSB behavioural intention (BI) and behaviour from TPB constructs using: (1) cross-sectional and prospective models and (2) 11 single-item assessments from interactive voice response (IVR) technology. Quasi-experimental design, including pre- and post-outcome data and repeated-measures process data of 155 intervention participants. Validated multi-item TPB measures, single-item TPB measures, and self-reported SSB behaviours. Hypothesised relationships were investigated using correlation and multiple regression models. TPB constructs explained 32% of the variance in BI cross-sectionally and 20% prospectively, and explained 13-20% of the variance in behaviour cross-sectionally and 6% prospectively. Single-item scale models were significant, yet explained less variance. All IVR models predicting BI (average 21%, range 6-38%) and behaviour (average 30%, range 6-55%) were significant. Findings are interpreted in the context of other cross-sectional, prospective and experimental TPB health and dietary studies. Findings advance experimental application of the TPB, including understanding constructs at outcome and process time points and applying theory in all intervention development, implementation and evaluation phases.

  9. Correlating Trainee Attributes to Performance in 3D CAD Training

    ERIC Educational Resources Information Center

    Hamade, Ramsey F.; Artail, Hassan A.; Sikstrom, Sverker

    2007-01-01

    Purpose: The purpose of this exploratory study is to identify trainee attributes relevant for development of skills in 3D computer-aided design (CAD). Design/methodology/approach: Participants were trained to perform cognitive tasks of comparable complexity over time. Performance data were collected on the time needed to construct test models, and…

  10. Erosion over time on severely disturbed granitic soils: a model

    Treesearch

    W. F. Megahan

    1974-01-01

    A negative exponential equation containing three parameters was derived to describe time trends in surface erosion on severely disturbed soils. Data from four different studies of surface erosion on roads constructed from the granitic materials found in the Idaho Batholith were used to develop equation parameters. The evidence suggests that surface "armoring...
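The abstract names a three-parameter negative exponential form without stating it. One plausible such form (assumed here for illustration, not necessarily the paper's exact equation) lets the erosion rate decay from an initial post-disturbance value toward a long-term baseline as the surface armors.

```python
# Assumed three-parameter negative exponential erosion model (illustrative):
# rate(t) = e_inf + (e0 - e_inf) * exp(-k*t), so the rate starts at e0 just
# after disturbance and decays toward the baseline e_inf as armoring develops.
import math

def erosion_rate(t, e0=100.0, e_inf=5.0, k=0.8):
    """Surface erosion rate at time t (years); units are illustrative."""
    return e_inf + (e0 - e_inf) * math.exp(-k * t)

def cumulative_erosion(t, e0=100.0, e_inf=5.0, k=0.8):
    """Closed-form integral of the rate from 0 to t."""
    return e_inf * t + (e0 - e_inf) / k * (1.0 - math.exp(-k * t))

rates = [erosion_rate(t) for t in range(6)]   # declining year-over-year rates
```

The monotonically declining rate reproduces the "armoring" time trend the abstract describes: most erosion occurs in the first years after road construction.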

  11. a Real-Time Computer Music Synthesis System

    NASA Astrophysics Data System (ADS)

    Lent, Keith Henry

    A real-time sound synthesis system has been developed at the Computer Music Center of The University of Texas at Austin. This system consists of several stand-alone processors that were constructed jointly with White Instruments in Austin. These processors can be programmed as general-purpose computers, but are provided with a number of specialized interfaces including: MIDI, 8-bit parallel, high-speed serial, 2 channels of analog input (18-bit A/Ds, 48 kHz sample rate), and 4 channels of analog output (18-bit D/As). In addition, a basic music synthesis language (Music56000) has been written in assembly code. On top of this, a symbolic compiler (PatchWork) has been developed to enable algorithms which run in these processors to be created graphically. Finally, a number of efficient time-domain numerical models have been developed to enable the construction, simulation, control, and synthesis of many musical acoustics systems in real time on these processors. Specifically, assembly language models for cylindrical and conical horn sections, dissipative losses, tone holes, bells, and a number of linear and nonlinear boundary conditions have been developed.

  12. It's all in the timing: modeling isovolumic contraction through development and disease with a dynamic dual electromechanical bioreactor system.

    PubMed

    Morgan, Kathy Ye; Black, Lauren Deems

    2014-01-01

    This commentary discusses the rationale behind our recently reported work entitled "Mimicking isovolumic contraction with combined electromechanical stimulation improves the development of engineered cardiac constructs," introduces new data supporting our hypothesis, and discusses future applications of our bioreactor system. The ability to stimulate engineered cardiac tissue in a bioreactor system that combines both electrical and mechanical stimulation offers a unique opportunity to simulate the appropriate dynamics between stretch and contraction and model isovolumic contraction in vitro. Our previous study demonstrated that combined electromechanical stimulation that simulated the timing of isovolumic contraction in healthy tissue improved force generation via increased contractile and calcium handling protein expression and improved hypertrophic pathway activation. In new data presented here, we further demonstrate that modification of the timing between electrical and mechanical stimulation to mimic a non-physiological process negatively impacts the functionality of the engineered constructs. We close by exploring the various disease states that have altered timing between the electrical and mechanical stimulation signals as potential future directions for the use of this system.

  13. 40 CFR 60.2994 - Are air curtain incinerators regulated under this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before... percent wood waste. (2) 100 percent clean lumber. (3) 100 percent yard waste. (4) 100 percent mixture of only wood waste, clean lumber, and/or yard waste. Model Rule—Use of Model Rule ...

  14. In Search of the Nordic Model in Education

    ERIC Educational Resources Information Center

    Antikainen, Ari

    2006-01-01

    The Nordic model of education is defined in this article as an attempt to construct a national education system on the foundation of specific local values and practices, but at the same time subject to international influences. According to the author, equity, participation, and welfare are the major goals and the publicly funded comprehensive…

  15. Impact of a Flexible Evaluation System on Effort and Timing of Study

    ERIC Educational Resources Information Center

    Pacharn, Parunchana; Bay, Darlene; Felton, Sandra

    2012-01-01

    This paper examines results of a flexible grading system that allows each student to influence the weight allocated to each performance measure. We construct a stylized model to determine students' optimal responses. Our analytical model predicts different optimal strategies for students with varying academic abilities: a frontloading strategy for…

  16. Construction of regulatory networks using expression time-series data of a genotyped population.

    PubMed

    Yeung, Ka Yee; Dombek, Kenneth M; Lo, Kenneth; Mittler, John E; Zhu, Jun; Schadt, Eric E; Bumgarner, Roger E; Raftery, Adrian E

    2011-11-29

    The inference of regulatory and biochemical networks from large-scale genomics data is a basic problem in molecular biology. The goal is to generate testable hypotheses of gene-to-gene influences and subsequently to design bench experiments to confirm these network predictions. Coexpression of genes in large-scale gene-expression data implies coregulation and potential gene-gene interactions, but provides little information about the direction of influences. Here, we use both time-series data and genetics data to infer directionality of edges in regulatory networks: time-series data contain information about the chronological order of regulatory events and genetics data allow us to map DNA variations to variations at the RNA level. We generate microarray data measuring time-dependent gene-expression levels in 95 genotyped yeast segregants subjected to a drug perturbation. We develop a Bayesian model averaging regression algorithm that incorporates external information from diverse data types to infer regulatory networks from the time-series and genetics data. Our algorithm is capable of generating feedback loops. We show that our inferred network recovers existing and novel regulatory relationships. Following network construction, we generate independent microarray data on selected deletion mutants to prospectively test network predictions. We demonstrate the potential of our network to discover de novo transcription-factor binding sites. Applying our construction method to previously published data demonstrates that our method is competitive with leading network construction algorithms in the literature.
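The core flavor of Bayesian model averaging for regression can be sketched generically: score candidate predictor subsets by BIC and convert the scores to posterior model weights. This is a stand-in under simplifying assumptions (BIC approximation, synthetic data, hand-picked candidate subsets), not the authors' network-inference algorithm.

```python
# Generic BIC-weighted Bayesian model averaging sketch (not the paper's
# algorithm): each candidate subset of predictors is fit by OLS, scored by
# BIC, and the scores are turned into normalized posterior model weights.
import numpy as np

def bma_weights(X, y, subsets):
    n = len(y)
    bics = []
    for cols in subsets:
        A = np.column_stack([np.ones(n), X[:, cols]])   # intercept + predictors
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = float(np.sum((y - A @ beta) ** 2))
        kpar = A.shape[1]
        bics.append(n * np.log(rss / n + 1e-12) + kpar * np.log(n))
    b = np.array(bics)
    w = np.exp(-0.5 * (b - b.min()))                    # relative model evidence
    return w / w.sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)          # only predictor 0 matters
subsets = [[0], [1], [2], [0, 1]]
w = bma_weights(X, y, subsets)
```

The weight mass concentrates on subsets containing the true regulator (predictor 0); in the full method such weights are what rank candidate regulators of each gene.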

  17. Theoretical framework of the causes of construction time and cost overruns

    NASA Astrophysics Data System (ADS)

    Ullah, K.; Abdullah, A. H.; Nagapan, S.; Suhoo, S.; Khan, M. S.

    2017-11-01

    Any construction practitioner fundamental goal is to complete the projects within estimated duration and budgets, and expected quality targets. However, time and cost overruns are regular and universal phenomenon in construction projects and the construction projects in Malaysia has no exemption from the problems of time overrun and cost overrun. In order to accomplish the successful completion of construction projects on specified time and within planned cost, there are various factors that should be given serious attention so that issues such as time and cost overrun can be addressed. This paper aims to construct a framework for the causes of time overrun and cost overrun in construction projects of Malaysia. Based on the relevant literature review, causative factors of time overrun and cost overrun in Malaysian construction projects are summarized and the theoretical frameworks of the causes of construction time overrun and cost overrun is constructed. The developed frameworks for construction time and cost overruns based on the existing literature will assist the construction practitioners to plan the efficient approaches for achieving successful completion of the projects.

  18. Soft sensor modelling by time difference, recursive partial least squares and adaptive model updating

    NASA Astrophysics Data System (ADS)

    Fu, Y.; Yang, W.; Xu, O.; Zhou, L.; Wang, J.

    2017-04-01

    To investigate time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least squares (PLS) and adaptive model updating is proposed. In this method, time difference values of input and output variables are used as training samples to construct the model, which can reduce the effects of the nonlinear characteristics on modelling accuracy and retain the advantages of the recursive PLS algorithm. To avoid an excessively high updating frequency of the model, a confidence value is introduced, which can be updated adaptively according to the results of the model performance assessment. Once the confidence value is updated, the model can be updated. The proposed method has been used to predict the 4-carboxy-benz-aldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can reduce computation effectively, improve prediction accuracy by making use of process information and reflect the process characteristics accurately.
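The time-difference step (the drift-handling part of the method) can be isolated in a small sketch. For simplicity this substitutes ordinary least squares on differenced samples for recursive PLS; the data are synthetic and the drift is an assumed additive ramp.

```python
# Time-difference soft-sensor sketch (OLS stands in for recursive PLS): train
# on (x_t - x_{t-1}, y_t - y_{t-1}) and predict y_hat_t = y_{t-1} + f(dx_t).
# Differencing cancels slow additive drift that would bias a static model.
import numpy as np

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n).cumsum()             # process input (random walk)
drift = np.linspace(0.0, 5.0, n)            # slow additive sensor drift
y = 3.0 * x + drift                         # measured quality variable

dx = np.diff(x)
dy = np.diff(y)
A = np.column_stack([np.ones(n - 1), dx])   # intercept absorbs the drift rate
coef, *_ = np.linalg.lstsq(A, dy, rcond=None)

# One-step-ahead prediction: previous measurement plus the modeled increment.
y_hat = y[:-1] + (coef[0] + coef[1] * dx)
rmse = float(np.sqrt(np.mean((y_hat - y[1:]) ** 2)))
```

Because the drift enters both differenced series as a near-constant increment, the fitted slope recovers the true gain (3.0) and the one-step predictions track the drifting measurement almost exactly.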

  19. USA National Phenology Network’s volunteer-contributed observations yield predictive models of phenological transitions

    USGS Publications Warehouse

    Crimmins, Theresa M.; Crimmins, Michael A.; Gerst, Katherine L.; Rosemartin, Alyssa H.; Weltzin, Jake F.

    2017-01-01

    In support of science and society, the USA National Phenology Network (USA-NPN) maintains a rapidly growing, continental-scale, species-rich dataset of plant and animal phenology observations that with over 10 million records is the largest such database in the United States. Contributed voluntarily by professional and citizen scientists, these opportunistically collected observations are characterized by spatial clustering, inconsistent spatial and temporal sampling, and short temporal depth. We explore the potential for developing models of phenophase transitions suitable for use at the continental scale, which could be applied to a wide range of resource management contexts. We constructed predictive models of the onset of breaking leaf buds, leaves, open flowers, and ripe fruits – phenophases that are the most abundant in the database and also relevant to management applications – for all species with available data, regardless of plant growth habit, location, geographic extent, or temporal depth of the observations. We implemented a very basic model formulation - thermal time models with a fixed start date. Sufficient data were available to construct 107 individual species × phenophase models. Of these, fifteen models (14%) met our criteria for model fit and error and were suitable for use across the majority of the species’ geographic ranges. These findings indicate that the USA-NPN dataset holds promise for further and more refined modeling efforts. Further, the candidate models that emerged could be used to produce real-time and short-term forecast maps of the timing of such transitions to directly support natural resource management.
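The "thermal time model with a fixed start date" formulation the abstract describes is simple enough to sketch directly: accumulate growing degree days (GDD) from the start date and predict onset when a threshold is crossed. The base temperature and threshold below are illustrative values, not ones fitted in the study.

```python
# Minimal thermal-time (growing degree day) model with a fixed start date:
# accumulate max(T - T_base, 0) per day and report the day the running total
# first reaches the phenophase threshold. Parameter values are illustrative.

def predict_onset_day(daily_mean_temps, t_base=5.0, threshold=100.0):
    """Return the 1-based day of accumulation when GDD first reaches threshold."""
    gdd = 0.0
    for day, temp in enumerate(daily_mean_temps, start=1):
        gdd += max(temp - t_base, 0.0)
        if gdd >= threshold:
            return day
    return None  # threshold never reached in the window

# 30 days warming linearly from 2.0 to 16.5 degC after the fixed start date
temps = [2.0 + 0.5 * d for d in range(30)]
onset = predict_onset_day(temps)  # -> 27
```

One fitted (t_base, threshold) pair per species × phenophase is all such a model carries, which is why it can be driven by gridded temperature data to produce the real-time forecast maps the abstract mentions.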

  20. On the classification of the spectrally stable standing waves of the Hartree problem

    NASA Astrophysics Data System (ADS)

    Georgiev, Vladimir; Stefanov, Atanas

    2018-05-01

    We consider the fractional Hartree model, with general power non-linearity and arbitrary spatial dimension. We construct variationally the "normalized" solutions for the corresponding Choquard-Pekar model-in particular a number of key properties, like smoothness and bell-shapedness are established. As a consequence of the construction, we show that these solitons are spectrally stable as solutions to the time-dependent Hartree model. In addition, we analyze the spectral stability of the Moroz-Van Schaftingen solitons of the classical Hartree problem, in any dimensions and power non-linearity. A full classification is obtained, the main conclusion of which is that only and exactly the "normalized" solutions (which exist only in a portion of the range) are spectrally stable.

  1. Cyclic completion of the anamorphic universe

    NASA Astrophysics Data System (ADS)

    Ijjas, Anna

    2018-04-01

    Cyclic models of the universe have the advantage of avoiding initial conditions problems related to postulating any sort of beginning in time. To date, the best known viable examples of cyclic models have been ekpyrotic. In this paper, we show that the recently proposed anamorphic scenario can also be made cyclic. The key to the cyclic completion is a classically stable, non-singular bounce. Remarkably, even though the bounce construction was originally developed to connect a period of contraction with a period of expansion both described by Einstein gravity, we show here that it can naturally be modified to connect an ordinary contracting phase described by Einstein gravity with a phase of anamorphic smoothing. The paper will present the basic principles and steps in constructing cyclic anamorphic models.

  2. Constructing and Modifying Sequence Statistics for relevent Using informR in 𝖱

    PubMed Central

    Marcum, Christopher Steven; Butts, Carter T.

    2015-01-01

    The informR package greatly simplifies the analysis of complex event histories in 𝖱 by providing user friendly tools to build sufficient statistics for the relevent package. Historically, building sufficient statistics to model event sequences (of the form a→b) using the egocentric generalization of Butts’ (2008) relational event framework for modeling social action has been cumbersome. The informR package simplifies the construction of the complex list of arrays needed by the rem() model fitting for a variety of cases involving egocentric event data, multiple event types, and/or support constraints. This paper introduces these tools using examples from real data extracted from the American Time Use Survey. PMID:26185488

  3. a Study about Terrestrial Laser Scanning for Reconstruction of Precast Concrete to Support Qlassic Assessment

    NASA Astrophysics Data System (ADS)

    Aziz, M. A.; Idris, K. M.; Majid, Z.; Ariff, M. F. M.; Yusoff, A. R.; Luh, L. C.; Abbas, M. A.; Chong, A. K.

    2016-09-01

    Nowadays, terrestrial laser scanning shows the potential to improve construction productivity by measuring object changes using real-time applications. This paper presents the implementation of an efficient framework for precast concrete using terrestrial laser scanning that enables contractors to acquire accurate data and support the Quality Assessment System in Construction (QLASSIC). A Leica ScanStation C10, black/white targets, and Autodesk Revit and Cyclone software were used in this study. The dimensions of the precast concrete base model provided by the company were used as a reference and compared against the Autodesk Revit model derived from the terrestrial laser scanning data and against the conventional method (measuring tape). To support QLASSIC, the dimensional tolerance of cast in-situ and precast elements is +10 mm / -5 mm. The results showed that the root mean square error for the Revit model was 2.972 mm, while that for the measuring tape was 13.687 mm. This accuracy shows that terrestrial laser scanning has an advantage in construction jobs to support QLASSIC.

  4. Application of the predicted heat strain model in development of localized, threshold-based heat stress management guidelines for the construction industry.

    PubMed

    Rowlinson, Steve; Jia, Yunyan Andrea

    2014-04-01

    Existing heat stress risk management guidelines recommended by international standards are not practical for the construction industry, which needs site supervision staff to make instant managerial decisions to mitigate heat risks. The ability of the predicted heat strain (PHS) model [ISO 7933 (2004). Ergonomics of the thermal environment: analytical determination and interpretation of heat stress using calculation of the predicted heat strain. Geneva: International Standard Organisation] to predict maximum allowable exposure time (D_lim) has now enabled development of localized, action-triggering and threshold-based guidelines for implementation by lay frontline staff on construction sites. This article presents a protocol for development of two heat stress management tools by applying the PHS model to its full potential. One of the tools is developed to facilitate managerial decisions on an optimized work-rest regimen for paced work. The other tool is developed to enable workers' self-regulation during self-paced work.

  5. A filter-mediated communication model for design collaboration in building construction.

    PubMed

    Lee, Jaewook; Jeong, Yongwook; Oh, Minho; Hong, Seung Wan

    2014-01-01

    Multidisciplinary collaboration is an important aspect of modern engineering activities, arising from the growing complexity of artifacts whose design and construction require knowledge and skills that exceed the capacities of any one professional. However, current collaboration in the architecture, engineering, and construction industries often fails due to lack of shared understanding between different participants and limitations of their supporting tools. To achieve a high level of shared understanding, this study proposes a filter-mediated communication model. In the proposed model, participants retain their own data in the form most appropriate for their needs with domain-specific filters that transform the neutral representations into semantically rich ones, as needed by the participants. Conversely, the filters can translate semantically rich, domain-specific data into a neutral representation that can be accessed by other domain-specific filters. To validate the feasibility of the proposed model, we computationally implement the filter mechanism and apply it to a hypothetical test case. The result acknowledges that the filter mechanism can let the participants know ahead of time what will be the implications of their proposed actions, as seen from other participants' points of view.
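The filter mechanism can be illustrated with a toy neutral model and two hypothetical domain filters. All element names and attributes below are invented for illustration; the point is only the round trip: neutral representation → domain view → domain edit → neutral representation → another domain's view.

```python
# Toy sketch of filter-mediated communication: one shared neutral model, plus
# per-discipline filters that project it into domain-specific views and push
# domain edits back. Element/attribute names are hypothetical.

NEUTRAL = {"wall-01": {"length_m": 6.0, "height_m": 3.0, "material": "concrete"}}

class ArchitectFilter:
    def to_domain(self, neutral):
        # Architects care about surface area (e.g., for finishes).
        return {eid: {"area_m2": e["length_m"] * e["height_m"]}
                for eid, e in neutral.items()}

class StructuralFilter:
    def to_domain(self, neutral):
        # Structural engineers care about material and span.
        return {eid: {"material": e["material"], "span_m": e["length_m"]}
                for eid, e in neutral.items()}

    def from_domain(self, neutral, domain):
        # Translate a domain-side change back into the neutral representation.
        for eid, e in domain.items():
            neutral[eid]["length_m"] = e["span_m"]
        return neutral

struct_view = StructuralFilter().to_domain(NEUTRAL)
struct_view["wall-01"]["span_m"] = 7.5                   # structural edit
updated = StructuralFilter().from_domain(NEUTRAL, struct_view)
arch_view = ArchitectFilter().to_domain(updated)         # architect sees the impact
```

Running the other party's filter over the updated neutral model is what lets a participant see "ahead of time" the implications of a proposed change from another discipline's point of view.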

  6. Automated Reconstruction of Historic Roof Structures from Point Clouds - Development and Examples

    NASA Astrophysics Data System (ADS)

    Pöchtrager, M.; Styhler-Aydın, G.; Döring-Williams, M.; Pfeifer, N.

    2017-08-01

    The analysis of historic roof constructions is an important task for planning the adaptive reuse of buildings or for maintenance and restoration issues. Current approaches to modeling roof constructions consist of several consecutive operations that need to be done manually or using semi-automatic routines. To increase efficiency and allow the focus to be on analysis rather than on data processing, a set of methods was developed for the fully automated analysis of the roof constructions, including integration of architectural and structural modeling. Terrestrial laser scanning permits high-detail surveying of large-scale structures within a short time. Whereas 3-D laser scan data consist of millions of single points on the object surface, we need a geometric description of structural elements in order to obtain a structural model consisting of beam axis and connections. Preliminary results showed that the developed methods work well for beams in flawless condition with a quadratic cross section and no bending. Deformations or damages such as cracks and cuts on the wooden beams can lead to incomplete representations in the model. Overall, a high degree of automation was achieved.

  7. Dynamically enriched topological orders in driven two-dimensional systems

    NASA Astrophysics Data System (ADS)

    Potter, Andrew C.; Morimoto, Takahiro

    2017-04-01

    Time-periodic driving of a quantum system can enable new dynamical topological phases of matter that could not exist in thermal equilibrium. We investigate two related classes of dynamical topological phenomena in 2D systems: Floquet symmetry-protected topological phases (FSPTs) and Floquet enriched topological orders (FETs). By constructing solvable lattice models for a complete set of 2D bosonic FSPT phases, we show that bosonic FSPTs can be understood as topological pumps which deposit loops of 1D SPT chains onto the boundary during each driving cycle, which protects a nontrivial edge state by dynamically tuning the edge to a self-dual point poised between the 1D SPT and trivial phases of the edge. By coupling these FSPT models to dynamical gauge fields, we construct solvable models of FET orders in which anyon excitations are dynamically transmuted into topologically distinct anyon types during each driving period. These bosonic FSPT and gauged FSPT models are classified by group cohomology methods. In addition, we also construct examples of "beyond cohomology" FET orders, which can be viewed as topological pumps of 1D topological chains formed of emergent anyonic quasiparticles.

  8. Validation study of the SCREENIVF: an instrument to screen women or men on risk for emotional maladjustment before the start of a fertility treatment.

    PubMed

    Ockhuijsen, Henrietta D L; van Smeden, Maarten; van den Hoogen, Agnes; Boivin, Jacky

    2017-06-01

    To examine construct and criterion validity of the Dutch SCREENIVF among women and men undergoing a fertility treatment. A prospective longitudinal study nested in a randomized controlled trial. University hospital. Couples, 468 women and 383 men, undergoing an IVF/intracytoplasmic sperm injection (ICSI) treatment in a fertility clinic, completed the SCREENIVF. Construct and criterion validity of the SCREENIVF. The comparative fit index and root mean square error of approximation for women and men showed a good fit of the factor model. Across time, the sensitivity for the Hospital Anxiety and Depression Scale subscales in women ranged from 61%-98%, specificity 53%-65%, predictive value of a positive test (PVP) 13%-56%, and predictive value of a negative test (PVN) 70%-99%. The sensitivity scores for men ranged from 38%-100%, specificity 71%-75%, PVP 9%-27%, PVN 92%-100%. A prediction model revealed that for women 68.7% of the variance in the Hospital Anxiety and Depression Scale at time 1, 42.5% at time 2, and 38.9% at time 3 was explained by the predictors, the sum score scales of the SCREENIVF. For men, 58.1% of the variance in the Hospital Anxiety and Depression Scale at time 1, 46.5% at time 2, and 37.3% at time 3 was explained by the predictors, the sum score scales of the SCREENIVF. The SCREENIVF has good construct validity, but the concurrent validity is better than the predictive validity. SCREENIVF will be most effectively used in fertility clinics at the start of treatment and should not be used as a predictive tool. Copyright © 2017 American Society for Reproductive Medicine. All rights reserved.

  9. Temporal data representation, normalization, extraction, and reasoning: A review from clinical domain

    PubMed Central

    Madkour, Mohcine; Benhaddou, Driss; Tao, Cui

    2016-01-01

    Background and Objective We live our lives by the calendar and the clock, but time is also an abstraction, even an illusion. The sense of time can be both domain-specific and complex, and is often left implicit, requiring significant domain knowledge to accurately recognize and harness. In the clinical domain, the momentum gained from recent advances in infrastructure and governance practices has enabled the collection of a tremendous amount of data at each moment in time. Electronic Health Records (EHRs) have paved the way to making these data available to practitioners and researchers. However, temporal data representation, normalization, extraction, and reasoning are very important in order to mine such massive data and thus to construct the clinical timeline. The objective of this work is to provide an overview of the problem of constructing a timeline at the clinical point of care and to summarize the state-of-the-art in processing temporal information in clinical narratives. Methods This review surveys the methods used in three important areas: the modeling and representation of time, medical NLP methods for extracting time, and methods of time reasoning and processing. The review emphasizes the current gap between existing methods and semantic web technologies and examines possible combinations of the two. Results The main finding of this review is the importance of time processing not only in constructing timelines and clinical decision support systems but also as a vital component of EHR data models and operations. Conclusions Extracting temporal information from clinical narratives is a challenging task. The inclusion of ontologies and the semantic web will lead to better assessment of the annotation task and, together with medical NLP techniques, will help resolve granularity and co-reference resolution problems. PMID:27040831

  10. On the Origin and Evolution of Stellar Chromospheres, Coronae and Winds

    NASA Technical Reports Server (NTRS)

    Musielak, Z. E.

    1997-01-01

    The final report discusses work completed on proposals to construct state-of-the-art, theoretical, two-component, chromospheric models for single stars of different spectral types and different evolutionary status. We proposed using these models to predict the level of the "basal flux", the observed range of variation of chromospheric activity for a given spectral type, and the decrease of this activity with stellar age. In addition, for red giants and supergiants, we also proposed to construct self-consistent, purely theoretical, chromosphere-wind models, and investigate the origin of "dividing lines" in the H-R diagram. In the report, we list the following six specific goals for the first and second year of the proposed research and then describe the completed work: (1) To calculate the acoustic and magnetic wave energy fluxes for stars located in different regions of the H-R diagram; (2) To investigate the transfer of this non-radiative energy through stellar photospheres and to estimate the amount of energy that reaches the chromosphere; (3) To identify major sources of radiative losses in stellar chromospheres and calculate the amount of emitted energy; (4) To use (1) through (3) to construct purely theoretical, two-component, chromospheric models based on the local energy balance. The models will be constructed for stars of different spectral types and different evolutionary status; (5) To explain theoretically the "basal flux", the location of stellar temperature minima and the observed range of chromospheric activity for stars of the same spectral type; and (6) To construct self-consistent, time-dependent stellar wind models based on the momentum deposition by finite amplitude Alfven waves.

  11. Extending the trans-contextual model in physical education and leisure-time contexts: examining the role of basic psychological need satisfaction.

    PubMed

    Barkoukis, Vassilis; Hagger, Martin S; Lambropoulos, George; Tsorbatzoudis, Haralambos

    2010-12-01

    The trans-contextual model (TCM) is an integrated model of motivation that aims to explain the processes by which agentic support for autonomous motivation in physical education promotes autonomous motivation and physical activity in a leisure-time context. It is proposed that perceived support for autonomous motivation in physical education is related to autonomous motivation in physical education and leisure-time contexts. Furthermore, relations between autonomous motivation and the immediate antecedents of intentions to engage in physical activity behaviour and actual behaviour are hypothesized. The purpose of the present study was to incorporate the constructs of basic psychological need satisfaction in the TCM to provide a more comprehensive explanation of motivation and demonstrate the robustness of the findings of previous tests of the model that have not incorporated these constructs. Students (N=274) from Greek secondary schools. Participants completed self-report measures of perceived autonomy support, autonomous motivation, and basic psychological need satisfaction in physical education. Follow-up measures of these variables were taken in a leisure-time context along with measures of attitudes, subjective norms, perceived behavioural control (PBC), and intentions from the theory of planned behaviour 1 week later. Self-reported physical activity behaviour was measured 4 weeks later. Results supported TCM hypotheses. Basic psychological need satisfaction variables uniquely predicted autonomous motivation in physical education and leisure time as well as the antecedents of intention, namely, attitudes, and PBC. The basic psychological need satisfaction variables also mediated the effects of perceived autonomy support on autonomous motivation in physical education. Findings support the TCM and provide further information of the mechanisms in the model and integrated theories of motivation in physical education and leisure time.

  12. Additive schemes for certain operator-differential equations

    NASA Astrophysics Data System (ADS)

    Vabishchevich, P. N.

    2010-12-01

    Unconditionally stable finite difference schemes for the time approximation of first-order operator-differential systems with self-adjoint operators are constructed. Such systems arise in many applied problems, for example, in connection with nonstationary problems for the system of Stokes (Navier-Stokes) equations. Stability conditions in the corresponding Hilbert spaces for two-level weighted operator-difference schemes are obtained. Additive (splitting) schemes are proposed that involve the solution of simple problems at each time step. The results are used to construct splitting schemes with respect to spatial variables for nonstationary Navier-Stokes equations for incompressible fluid. The capabilities of additive schemes are illustrated using a two-dimensional model problem as an example.
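
As a schematic illustration of the class of schemes discussed (using generic notation that may differ from the paper's), a two-level weighted scheme for a first-order system du/dt + Au = f with a self-adjoint, nonnegative operator A can be written as:

```latex
\frac{u^{n+1} - u^{n}}{\tau} + A\left(\sigma u^{n+1} + (1 - \sigma)\, u^{n}\right) = f^{n}, \qquad n = 0, 1, \dots
```

For weights sigma >= 1/2 such a scheme is unconditionally stable in the norms generated by A. An additive (splitting) scheme then decomposes A = A_1 + A_2 and treats only one component implicitly in each substep, so that every time step reduces to the solution of simpler subproblems, as the abstract describes.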

  13. Development of a proficiency-based virtual reality simulation training curriculum for laparoscopic appendicectomy.

    PubMed

    Sirimanna, Pramudith; Gladman, Marc A

    2017-10-01

    Proficiency-based virtual reality (VR) training curricula improve intraoperative performance, but have not been developed for laparoscopic appendicectomy (LA). This study aimed to develop an evidence-based training curriculum for LA. A total of 10 experienced (>50 LAs), eight intermediate (10-30 LAs) and 20 inexperienced (<10 LAs) operators performed guided and unguided LA tasks on a high-fidelity VR simulator using internationally relevant techniques. The ability to differentiate levels of experience (construct validity) was measured using simulator-derived metrics. Learning curves were analysed. Proficiency benchmarks were defined by the performance of the experienced group. Intermediate and experienced participants completed a questionnaire to evaluate the realism (face validity) and relevance (content validity). Of 18 surgeons, 16 (89%) considered the VR model to be visually realistic and 17 (95%) believed that it was representative of actual practice. All 'guided' modules demonstrated construct validity (P < 0.05), with learning curves that plateaued between sessions 6 and 9 (P < 0.01). When comparing inexperienced to intermediates to experienced, the 'unguided' LA module demonstrated construct validity for economy of motion (5.00 versus 7.17 versus 7.84, respectively; P < 0.01) and task time (864.5 s versus 477.2 s versus 352.1 s, respectively, P < 0.01). Construct validity was also confirmed for number of movements, path length and idle time. Validated modules were used for curriculum construction, with proficiency benchmarks used as performance goals. A VR LA model was realistic and representative of actual practice and was validated as a training and assessment tool. Consequently, the first evidence-based internationally applicable training curriculum for LA was constructed, which facilitates skill acquisition to proficiency. © 2017 Royal Australasian College of Surgeons.

  14. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  15. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  16. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  17. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  18. 40 CFR 60.3042 - How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... averages into the appropriate averaging times and units? 60.3042 Section 60.3042 Protection of Environment... Construction On or Before December 9, 2004 Model Rule-Monitoring § 60.3042 How do I convert my 1-hour arithmetic averages into the appropriate averaging times and units? (a) Use Equation 1 in § 60.3076 to...

  19. Simulation Model for Scenario Optimization of the Ready-Mix Concrete Delivery Problem

    NASA Astrophysics Data System (ADS)

    Galić, Mario; Kraus, Ivan

    2016-12-01

    This paper introduces a discrete simulation model for solving routing and network material flow problems in construction projects. Before the description of the model, a detailed literature review is provided. The model is verified using a case study solving the ready-mix concrete network flow and routing problem in a metropolitan area in Croatia. Within this study, real-time input parameters were taken into account. The simulation model is structured in Enterprise Dynamics simulation software and Microsoft Excel linked with Google Maps. The model is dynamic, easily managed, and adjustable, and also provides good estimates for minimizing costs and realization time in solving discrete routing and material network flow problems.

  20. Modelling endurance and resumption times for repetitive one-hand pushing.

    PubMed

    Rose, Linda M; Beauchemin, Catherine A A; Neumann, W Patrick

    2018-07-01

    This study's objective was to develop models of endurance time (ET) as a function of load level (LL), and of resumption time (RT) after loading as a function of both LL and loading time (LT), for repeated loadings. Ten male participants with experience in construction work each performed 15 different one-handed repeated pushing tasks at shoulder height with varied exerted force and duration. These data were used to create regression models predicting ET and RT. It is concluded that power-law relationships are most appropriate for modelling ET and RT. While the data the equations are based on are limited regarding the number of participants, gender, postures, and the magnitude and type of exerted force, the paper suggests how this kind of modelling can be used in job design and in further research. Practitioner Summary: Adequate muscular recovery during work shifts is important to create sustainable jobs. This paper describes mathematical modelling and presents models for endurance times and resumption times (an aspect of recovery need), based on data from an empirical study. The models can be used to help manage fatigue levels in job design.
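
As a sketch of the power-law form referred to above, a model of the type ET = a * LL^b can be fitted by linear regression in log-log space; the load levels and endurance times below are synthetic values for illustration, not the study's data.

```python
import numpy as np

# Hypothetical illustration: fit a power-law endurance-time model
# ET = a * LL**b by linear regression in log-log space. The load
# levels and endurance times below are synthetic, not the study's data.
ll = np.array([20.0, 30.0, 40.0, 50.0, 60.0])    # load level (% of max force)
et = np.array([420.0, 150.0, 75.0, 45.0, 30.0])  # endurance time (s)

b, log_a = np.polyfit(np.log(ll), np.log(et), 1)  # slope = power-law exponent
a = np.exp(log_a)

def predict_et(load):
    """Predicted endurance time (s) for a given load level."""
    return a * load ** b
```

The same log-log regression applies to a resumption-time model with two predictors (LL and LT), in which case a multiple linear regression on the log-transformed variables would be used instead.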

  1. A performance evaluation of ACO and SA TSP in a supply chain network

    NASA Astrophysics Data System (ADS)

    Rao, T. Srinivas

    2017-07-01

    Supply chain management and e-commerce business solutions are prominent areas of active research. In this paper we model a supply chain in which the requirements of all manufacturers are aggregated and the products are supplied to all manufacturers through a common vehicle routing algorithm. A travelling salesman problem (TSP) is constructed over all the manufacturers to determine the shortest route through which the aggregated material can be supplied in the shortest possible time. The shortest route is solved by constructing a simulated annealing (SA) algorithm and an ant colony optimization (ACO) algorithm, and their performance is evaluated.
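
A minimal simulated-annealing sketch for a small TSP instance is shown below; the city coordinates, move type (segment reversal), and cooling schedule are invented for illustration and this is not the paper's implementation.

```python
import math
import random

# Generic simulated-annealing sketch for a small, synthetic TSP instance.
random.seed(1)
cities = [(random.random(), random.random()) for _ in range(12)]

def tour_length(tour):
    """Total length of a closed tour visiting every city once."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal(t0=1.0, cooling=0.995, steps=10000):
    tour = list(range(len(cities)))
    cur_len = tour_length(tour)
    best, best_len, t = tour[:], cur_len, t0
    for _ in range(steps):
        i, j = sorted(random.sample(range(len(cities)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt reversal
        cand_len = tour_length(cand)
        # Accept improvements always; accept worse tours with Boltzmann probability.
        if cand_len < cur_len or random.random() < math.exp((cur_len - cand_len) / t):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        t *= cooling
    return best, best_len

best_tour, best_len = anneal()
```

An ACO variant would replace the candidate-move loop with pheromone-weighted tour construction by artificial ants, which is how the two algorithms' performance can be compared on the same instance.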

  2. How Do Novice and Expert Learners Represent, Understand, and Discuss Geologic Time?

    NASA Astrophysics Data System (ADS)

    Layow, Erica Amanda

    This dissertation examined the representations novice and expert learners constructed for the geologic timescale. Learners engaged in a three-part activity. The purpose was to compare novice learners' representations to those of expert learners. This provided insight into the similarities and differences between their strategies for event ordering and for assigning values and scale to the geologic timescale model, as well as their language and practices to complete the model. With a qualitative approach to data analysis informed by an expert-novice theoretical framework grounded in phenomenography, learner responses comprised the data analyzed. These data highlighted learners' metacognitive thoughts that might not otherwise be shared through lectures or laboratory activities. Learners' responses were analyzed using a discourse framework that positioned learners as knowers. Novice and expert learners both excelled at ordering and discussing events before the Phanerozoic, but were challenged by events during the Phanerozoic. Novice learners had difficulty assigning values to events and establishing a scale for their models. Expert learners expressed difficulty with determining a scale because of the size of the model, yet eventually used anchor points and unitized the model to establish a scale. Despite challenges constructing their models, novice learners spoke confidently, using claims and few hedging phrases, indicating their confidence in the statements made. Experts used more hedges than novices; however, the hedging comments concerned more complex conceptions. Using both phenomenographic and discourse analysis approaches foregrounded learners' discussions of how they perceived geologic time and their ways of knowing and doing.
This research is intended to enhance the geoscience community's understanding of the ways novice and expert learners think and discuss conceptions of geologic time, including the events and values of time, and the strategies used to determine accuracy of scale. This knowledge will provide a base from which to support geoscience curriculum development at the university level, specifically to design activities that will not only engage and express learners' metacognitive scientific practices, but to encourage their construction of scientific identities and membership in the geoscience community.

  3. Evaluation of Autogenous Engineered Septal Cartilage Grafts in Rabbits- A Minimally Invasive Preclinical Model.

    PubMed

    Kushnaryov, Anton; Yamaguchi, Tomonoro; Briggs, Kristen K; Wong, Van W; Reuther, Marsha; Neuman, Monica; Lin, Victor; Sah, Robert L; Masuda, Koichi; Watson, Deborah

    2014-07-23

    Evaluate safety of autogenous engineered septal neocartilage grafts. Compare properties of implanted grafts versus in vitro controls. Prospective, basic science. Research laboratory. Constructs were fabricated from septal cartilage and serum harvested from adult rabbits and then cultured in vitro or implanted on the nasal dorsum as autogenous grafts for 30 or 60 days. Rabbits were monitored for local and systemic complications. Histological, biochemical and biomechanical properties of implanted and in vitro constructs were evaluated and compared. No systemic or serious local complications were observed. After 30 and 60 days, implanted constructs contained more DNA (p<0.01) and less sGAG per DNA (p<0.05) when compared with in vitro controls. Confined compressive aggregate moduli were also higher in implanted constructs when compared with in vitro controls (p<0.05) and increased with longer in vivo incubation time (p<0.01). Implanted constructs displayed resorption rates of 20-45 percent. Calcium deposition in implanted constructs was observed using alizarin red histochemistry and microtomographic analyses. Autogenous engineered septal cartilage grafts were well tolerated. As seen in experiments with athymic mice, implanted constructs accumulated more DNA and less sGAG when compared with in vitro controls. Confined compressive aggregate moduli were also higher in implanted constructs. Implanted constructs displayed resorption rates similar to previously published studies using autogenous implants of native cartilage. The basis for observed calcification in implanted constructs and its effect on long-term graft efficacy is unknown at this time and will be a focus of future studies.

  4. Development and validation of a measure of workplace climate for healthy weight maintenance.

    PubMed

    Sliter, Katherine A

    2013-07-01

    Due to the obesity epidemic, an increasing amount of research is being conducted to better understand the antecedents and consequences of excess employee weight. One construct often of interest to researchers in this area is organizational climate. Unfortunately, a viable measure of climate, as related to employee weight, does not exist. The purpose of this study was to remedy this by developing and validating a concise, psychometrically sound measure of climate for healthy weight. An item pool was developed based on surveys of full-time employees, and a sorting task was used to eliminate ambiguous items. Items were pilot tested by a sample of 338 full-time employees, and the item pool was reduced through item response theory (IRT) and reliability analyses. Finally, the retained 14 items, comprising 3 subscales, were completed by a sample of 360 full-time employees, representing 26 different organizations from across the United States. Multilevel modeling indicated that sufficient variance was explained by group membership to support aggregation, and confirmatory factor analysis (CFA) supported the hypothesized model of 3 subscale factors and an overall climate factor. Nine hypotheses specific to construct validation were tested. Scores on the new scale correlated significantly with individual-level reports of psychological constructs (e.g., health motivation, general leadership support for health) and physiological phenomena (e.g., body mass index [BMI], physical health problems) to which they should theoretically relate, supporting construct validity. Implications for the use of this scale in both applied and research settings are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  5. Research on Zheng Classification Fusing Pulse Parameters in Coronary Heart Disease

    PubMed Central

    Guo, Rui; Wang, Yi-Qin; Xu, Jin; Yan, Hai-Xia; Yan, Jian-Jun; Li, Fu-Feng; Xu, Zhao-Xia; Xu, Wen-Jie

    2013-01-01

    This study was conducted to illustrate that nonlinear dynamic variables of Traditional Chinese Medicine (TCM) pulse can improve the performances of TCM Zheng classification models. Pulse recordings of 334 coronary heart disease (CHD) patients and 117 normal subjects were collected in this study. Recurrence quantification analysis (RQA) was employed to acquire nonlinear dynamic variables of pulse. TCM Zheng models in CHD were constructed, and predictions using a novel multilabel learning algorithm based on different datasets were carried out. Datasets were designed as follows: dataset1, TCM inquiry information including inspection information; dataset2, time-domain variables of pulse and dataset1; dataset3, RQA variables of pulse and dataset1; and dataset4, major principal components of RQA variables and dataset1. The performances of the different models for Zheng differentiation were compared. The model for Zheng differentiation based on RQA variables integrated with inquiry information had the best performance, whereas that based only on inquiry had the worst performance. Meanwhile, the model based on time-domain variables of pulse integrated with inquiry fell between the above two. This result showed that RQA variables of pulse can be used to construct models of TCM Zheng and improve the performance of Zheng differentiation models. PMID:23737839
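
Recurrence quantification analysis starts from a recurrence matrix; a minimal sketch on a toy waveform (not the study's pulse recordings, and with an arbitrary threshold) is:

```python
import numpy as np

# Minimal recurrence-quantification sketch: build a recurrence matrix
# for a toy signal and compute the recurrence rate (RR). The signal and
# threshold are illustrative only, not the pulse data used in the study.
t = np.linspace(0, 4 * np.pi, 200)
x = np.sin(t)                       # toy "pulse" waveform

eps = 0.1                           # recurrence threshold
dist = np.abs(x[:, None] - x[None, :])
rec = (dist < eps).astype(int)      # recurrence matrix R_ij

rr = rec.sum() / rec.size           # recurrence rate: fraction of recurrent pairs
```

Further RQA variables (e.g. determinism, laminarity) are derived from the diagonal and vertical line structures of this matrix; those line statistics are what such nonlinear variables feed into a Zheng classification model as additional features.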

  6. On the feasibility of a transient dynamic design analysis

    NASA Astrophysics Data System (ADS)

    Cunniff, Patrick F.; Pohland, Robert D.

    1993-05-01

    The Dynamic Design Analysis Method has been used for the past 30 years as part of the Navy's efforts to shock-harden heavy shipboard equipment. This method, which has been validated several times, employs normal mode theory and design shock values. This report examines the degree of success that may be achieved by using simple equipment-vehicle models that produce time history responses which are equivalent to the responses that would be achieved using spectral design values employed by the Dynamic Design Analysis Method. These transient models are constructed by attaching the equipment's modal oscillators to the vehicle, which is composed of rigid masses and elastic springs. Two methods have been developed for constructing these transient models. Each method generates the parameters of the vehicles so as to approximate the required damaging effects, such that the transient model is excited by an idealized impulse applied to the vehicle mass to which the equipment modal oscillators are attached. The first method, called the Direct Modeling Method, is limited to equipment with at most three degrees of freedom, and the vehicle consists of a single lumped mass and spring. The Optimization Modeling Method, which is based on the simplex method for optimization, has been used successfully with a variety of vehicle models and equipment sizes.

  7. Choosing colors for map display icons using models of visual search.

    PubMed

    Shive, Joshua; Francis, Gregory

    2013-04-01

    We show how to choose colors for icons on maps to minimize search time using predictions of a model of visual search. The model analyzes digital images of a search target (an icon on a map) and a search display (the map containing the icon) and predicts search time as a function of target-distractor color distinctiveness and target eccentricity. We parameterized the model using data from a visual search task and performed a series of optimization tasks to test the model's ability to choose colors for icons to minimize search time across icons. Map display designs made by this procedure were tested experimentally. In a follow-up experiment, we examined the model's flexibility to assign colors in novel search situations. The model fits human performance, performs well on the optimization tasks, and can choose colors for icons on maps with novel stimuli to minimize search time without requiring additional model parameter fitting. Models of visual search can suggest color choices that produce search time reductions for display icons. Designers should consider constructing visual search models as a low-cost method of evaluating color assignments.

  8. Reduced-Order Modeling for Flutter/LCO Using Recurrent Artificial Neural Network

    NASA Technical Reports Server (NTRS)

    Yao, Weigang; Liou, Meng-Sing

    2012-01-01

    The present study demonstrates the efficacy of a recurrent artificial neural network to provide a high fidelity time-dependent nonlinear reduced-order model (ROM) for flutter/limit-cycle oscillation (LCO) modeling. An artificial neural network is a relatively straightforward nonlinear method for modeling an input-output relationship from a set of known data, for which we use the radial basis function (RBF) with its parameters determined through a training process. The resulting RBF neural network, however, is only static and is not yet adequate for application to problems of a dynamic nature. The recurrent neural network method [1] is applied to construct a reduced order model resulting from a series of high-fidelity time-dependent data of aero-elastic simulations. Once the RBF neural network ROM is constructed properly, an accurate approximate solution can be obtained at a fraction of the cost of a full-order computation. The method derived during the study has been validated for predicting nonlinear aerodynamic forces in transonic flow and is capable of accurate flutter/LCO simulations. The obtained results indicate that the present recurrent RBF neural network is accurate and efficient for nonlinear aero-elastic system analysis.
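
The static RBF building block mentioned above can be sketched as follows; the target function, centers, and Gaussian width are illustrative assumptions rather than the study's configuration.

```python
import numpy as np

# Static RBF approximation of a nonlinear input-output map: the basic
# building block of the ROM described above. The target function,
# centers, and Gaussian width are illustrative assumptions.
x_train = np.linspace(-1.0, 1.0, 10)
y_train = np.tanh(3.0 * x_train)            # stand-in nonlinear response

centers = x_train                            # one Gaussian center per sample
width = 0.25

def phi(x, c):
    """Gaussian radial basis function."""
    return np.exp(-((x - c) ** 2) / (2.0 * width ** 2))

# Training: solve the interpolation system Phi @ w = y for the weights.
Phi = phi(x_train[:, None], centers[None, :])
w = np.linalg.solve(Phi, y_train)

def rbf_model(x):
    """Approximate response at new inputs x."""
    return phi(np.atleast_1d(x)[:, None], centers[None, :]) @ w

err = np.max(np.abs(rbf_model(x_train) - y_train))
```

A recurrent variant, as in the abstract, would feed delayed outputs of `rbf_model` back into its inputs so the network can represent time-dependent aero-elastic behavior rather than a static map.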

  9. Characterizing system dynamics with a weighted and directed network constructed from time series data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Xiaoran, E-mail: sxr0806@gmail.com; School of Mathematics and Statistics, The University of Western Australia, Crawley WA 6009; Small, Michael, E-mail: michael.small@uwa.edu.au

    In this work, we propose a novel method to transform a time series into a weighted and directed network. For a given time series, we first generate a set of segments via a sliding window, and then use a doubly symbolic scheme to characterize every windowed segment by combining absolute amplitude information with an ordinal pattern characterization. Based on this construction, a network can be directly constructed from the given time series: segments corresponding to different symbol-pairs are mapped to network nodes and the temporal succession between nodes is represented by directed links. With this conversion, the dynamics underlying the time series has been encoded into the network structure. We illustrate the potential of our networks with a well-studied dynamical model as a benchmark example. Results show that network measures for characterizing global properties can detect the dynamical transitions in the underlying system. Moreover, we employ a random walk algorithm to sample loops in our networks, and find that time series with different dynamics exhibit distinct cycle structure. That is, the relative prevalence of loops with different lengths can be used to identify the underlying dynamics.
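
The windowing-and-symbolization idea can be sketched as follows; this simplified version uses only the ordinal-pattern symbol and omits the amplitude half of the doubly symbolic scheme described in the abstract.

```python
import numpy as np
from collections import Counter

# Simplified sketch of the time-series-to-network construction: each
# sliding window is labelled by its ordinal pattern (the ranking of its
# values), and consecutive windows define weighted, directed edges.
# The amplitude part of the doubly symbolic scheme is omitted here.
def series_to_network(x, w=3):
    symbols = [tuple(np.argsort(x[i:i + w])) for i in range(len(x) - w + 1)]
    edges = Counter(zip(symbols[:-1], symbols[1:]))  # directed edge -> weight
    return symbols, edges

rng = np.random.default_rng(42)
x = rng.standard_normal(500)   # toy series; any scalar time series works
symbols, edges = series_to_network(x)
```

Network nodes are the distinct symbols and `edges` holds the weighted directed links, on which global network measures or loop-sampling random walks can then be computed.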

  10. Adaptive optimal stochastic state feedback control of resistive wall modes in tokamaks

    NASA Astrophysics Data System (ADS)

    Sun, Z.; Sen, A. K.; Longman, R. W.

    2006-01-01

    An adaptive optimal stochastic state feedback control is developed to stabilize the resistive wall mode (RWM) instability in tokamaks. The extended least-square method with exponential forgetting factor and covariance resetting is used to identify (experimentally determine) the time-varying stochastic system model. A Kalman filter is used to estimate the system states. The estimated system states are passed on to an optimal state feedback controller to construct control inputs. The Kalman filter and the optimal state feedback controller are periodically redesigned online based on the identified system model. This adaptive controller can stabilize the time-dependent RWM in a slowly evolving tokamak discharge. This is accomplished within a time delay of roughly four times the inverse of the growth rate for the time-invariant model used.
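
The identification step can be illustrated with a generic recursive-least-squares update with exponential forgetting; the scalar regression model and parameter values below are invented for illustration and are not the tokamak model.

```python
import numpy as np

# Sketch of recursive least squares with an exponential forgetting
# factor, the identification step described above. The scalar ARX-type
# "plant" and its parameters are invented for illustration.
rng = np.random.default_rng(3)
theta_true = np.array([0.8, 0.5])   # assumed plant parameters

lam = 0.98                          # forgetting factor
theta = np.zeros(2)                 # parameter estimate
P = np.eye(2) * 1000.0              # covariance (this is the value restored on resetting)

y_prev, u_prev = 0.0, 0.0
for _ in range(300):
    u = rng.standard_normal()       # exciting input
    y = theta_true @ np.array([y_prev, u_prev]) + 0.01 * rng.standard_normal()
    reg = np.array([y_prev, u_prev])          # regressor
    k = P @ reg / (lam + reg @ P @ reg)       # gain
    theta = theta + k * (y - reg @ theta)     # update estimate
    P = (P - np.outer(k, reg @ P)) / lam      # update covariance
    y_prev, u_prev = y, u
```

In the adaptive controller, the identified parameters would then be handed to a periodically redesigned Kalman filter and optimal state feedback law, as the abstract describes.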

  11. Adaptive Optimal Stochastic State Feedback Control of Resistive Wall Modes in Tokamaks

    NASA Astrophysics Data System (ADS)

    Sun, Z.; Sen, A. K.; Longman, R. W.

    2007-06-01

    An adaptive optimal stochastic state feedback control is developed to stabilize the resistive wall mode (RWM) instability in tokamaks. The extended least square method with exponential forgetting factor and covariance resetting is used to identify the time-varying stochastic system model. A Kalman filter is used to estimate the system states. The estimated system states are passed on to an optimal state feedback controller to construct control inputs. The Kalman filter and the optimal state feedback controller are periodically redesigned online based on the identified system model. This adaptive controller can stabilize the time dependent RWM in a slowly evolving tokamak discharge. This is accomplished within a time delay of roughly four times the inverse of the growth rate for the time-invariant model used.

  12. Predictive Effects of Good Self-Control and Poor Regulation on Alcohol-Related Outcomes: Do Protective Behavioral Strategies Mediate?

    PubMed Central

    Pearson, Matthew R.; Kite, Benjamin A.; Henson, James M.

    2016-01-01

    In the present study, we examined whether use of protective behavioral strategies mediated the relationship between self-control constructs and alcohol-related outcomes. According to the two-mode model of self-control, good self-control (planfulness; measured with Future Time Perspective, Problem Solving, and Self-Reinforcement) and poor regulation (impulsivity; measured with Present Time Perspective, Poor Delay of Gratification, Distractibility) are theorized to be relatively independent constructs rather than opposite ends of a single continuum. The analytic sample consisted of 278 college student drinkers (68% women) who responded to a battery of surveys at a single time point. Using a structural equation model based on the two-mode model of self-control, we found that good self-control predicted increased use of three types of protective behavioral strategies (Manner of Drinking, Limiting/Stopping Drinking, and Serious Harm Reduction). Poor regulation was unrelated to use of protective behavioral strategies, but had direct effects on alcohol use and alcohol problems. Further, protective behavioral strategies mediated the relationship between good self-control and alcohol use. The clinical implications of these findings are discussed. PMID:22663345

  13. Mathematical Modeling and Dynamic Simulation of Metabolic Reaction Systems Using Metabolome Time Series Data.

    PubMed

    Sriyudthsak, Kansuporn; Shiraishi, Fumihide; Hirai, Masami Yokota

    2016-01-01

    The high-throughput acquisition of metabolome data is greatly anticipated for the complete understanding of cellular metabolism in living organisms. A variety of analytical technologies have been developed to acquire large-scale metabolic profiles under different biological or environmental conditions. Time series data are useful for predicting the most likely metabolic pathways because they provide important information regarding the accumulation of metabolites, which implies causal relationships in the metabolic reaction network. Considerable effort has been undertaken to utilize these data for constructing a mathematical model merging system properties and quantitatively characterizing a whole metabolic system in toto. However, there remains a technical gap between the provision and the utilization of such data. Although hundreds of metabolites can be measured, providing information on the metabolic reaction system, the simultaneous measurement of thousands of metabolites is still challenging. In addition, it is nontrivial to logically predict the dynamic behaviors of unmeasurable metabolite concentrations without sufficient information on the metabolic reaction network. Consolidating the advantages of advances in both metabolomics and mathematical modeling thus remains to be accomplished. This review outlines the conceptual basis of, and recent advances in, technologies in both research fields. It also highlights the potential for constructing a large-scale mathematical model by estimating model parameters from time series metabolome data in order to comprehensively understand metabolism at the systems level.
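
Estimating model parameters from time series data, as discussed above, can be sketched with a toy two-metabolite mass-action pathway; the network, rate laws, and parameter values below are illustrative assumptions, not taken from the review.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Toy pathway A -> B -> (out) with mass-action rate constants k1, k2.
def rhs(t, x, k1, k2):
    a, b = x
    return [-k1 * a, k1 * a - k2 * b]

def simulate(k, t_eval):
    sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), [1.0, 0.0],
                    t_eval=t_eval, args=tuple(k), rtol=1e-8)
    return sol.y

t_obs = np.linspace(0, 5, 20)
k_true = [1.2, 0.4]
data = simulate(k_true, t_obs)        # stands in for metabolome time series

def residuals(k):
    # Difference between simulated and "measured" concentrations.
    return (simulate(k, t_obs) - data).ravel()

fit = least_squares(residuals, x0=[0.5, 0.5], bounds=(0, 10))
```

For noiseless synthetic data the fit recovers the true rate constants; real metabolome data would add noise, unmeasured species, and identifiability questions.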

  14. Factor structure and longitudinal measurement invariance of the demand control support model: an evidence from the Swedish Longitudinal Occupational Survey of Health (SLOSH).

    PubMed

    Chungkham, Holendro Singh; Ingre, Michael; Karasek, Robert; Westerlund, Hugo; Theorell, Töres

    2013-01-01

    To examine the factor structure and to evaluate the longitudinal measurement invariance of the demand-control-support questionnaire (DCSQ), using the Swedish Longitudinal Occupational Survey of Health (SLOSH). Confirmatory factor analysis (CFA) and multi-group confirmatory factor analysis (MGCFA) models within the framework of structural equation modeling (SEM) were used to examine the factor structure and invariance across time. Four factors, psychological demand, skill discretion, decision authority, and social support, were confirmed by CFA at baseline, with the best fit obtained by removing the item repetitive work from skill discretion. A measurement error correlation (0.42) between work fast and work intensively for psychological demands was also detected. Acceptable composite reliability measures were obtained except for skill discretion (0.68). Invariance of the same factor structure was established, but caution in comparing mean levels of factors over time is warranted, as a lack of intercept invariance was evident. However, partial intercept invariance was established for work intensively. Our findings indicate that skill discretion and decision authority represent two distinct constructs in the retained model. However, removing the item repetitive work along with either work fast or work intensively would improve model fit. Care should also be taken when making comparisons in the constructs across time. Further research should investigate invariance across occupations or socio-economic classes.

  15. Factor Structure and Longitudinal Measurement Invariance of the Demand Control Support Model: An Evidence from the Swedish Longitudinal Occupational Survey of Health (SLOSH)

    PubMed Central

    Chungkham, Holendro Singh; Ingre, Michael; Karasek, Robert; Westerlund, Hugo; Theorell, Töres

    2013-01-01

    Objectives To examine the factor structure and to evaluate the longitudinal measurement invariance of the demand-control-support questionnaire (DCSQ), using the Swedish Longitudinal Occupational Survey of Health (SLOSH). Methods Confirmatory factor analysis (CFA) and multi-group confirmatory factor analysis (MGCFA) models within the framework of structural equation modeling (SEM) were used to examine the factor structure and invariance across time. Results Four factors, psychological demand, skill discretion, decision authority, and social support, were confirmed by CFA at baseline, with the best fit obtained by removing the item repetitive work from skill discretion. A measurement error correlation (0.42) between work fast and work intensively for psychological demands was also detected. Acceptable composite reliability measures were obtained except for skill discretion (0.68). Invariance of the same factor structure was established, but caution in comparing mean levels of factors over time is warranted, as a lack of intercept invariance was evident. However, partial intercept invariance was established for work intensively. Conclusion Our findings indicate that skill discretion and decision authority represent two distinct constructs in the retained model. However, removing the item repetitive work along with either work fast or work intensively would improve model fit. Care should also be taken when making comparisons in the constructs across time. Further research should investigate invariance across occupations or socio-economic classes. PMID:23950957

  16. A Stochastic Differential Equation Model for the Spread of HIV amongst People Who Inject Drugs.

    PubMed

    Liang, Yanfeng; Greenhalgh, David; Mao, Xuerong

    2016-01-01

    We introduce stochasticity into the deterministic differential equation model for the spread of HIV amongst people who inject drugs (PWIDs) studied by Greenhalgh and Hay (1997). This was based on the original model constructed by Kaplan (1989), which analyses the behaviour of HIV/AIDS amongst a population of PWIDs. We derive a stochastic differential equation (SDE) for the fraction of PWIDs who are infected with HIV at time t. The stochasticity is introduced using the well-known standard technique of parameter perturbation. We first prove that the resulting SDE for the fraction of infected PWIDs has a unique solution in (0, 1) provided that some infected PWIDs are initially present, and next construct the conditions required for extinction and persistence. Furthermore, we show that there exists a stationary distribution for the persistence case. Simulations using realistic parameter values are then constructed to illustrate and support our theoretical results. Our results provide new insight into the spread of HIV amongst PWIDs. They show that the introduction of stochastic noise into a model for the spread of HIV amongst PWIDs can cause the disease to die out in scenarios where deterministic models predict disease persistence.
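
The parameter-perturbation technique mentioned above can be illustrated with a minimal Euler-Maruyama simulation of a generic SIS-type SDE for an infected fraction I(t); the drift, diffusion, and parameter values below are illustrative assumptions, not the exact Kaplan/Greenhalgh model.

```python
import numpy as np

def euler_maruyama(i0, beta, mu, sigma, dt=1e-3, n_steps=50_000, seed=1):
    """Simulate dI = [beta*I*(1-I) - mu*I] dt + sigma*I*(1-I) dW.

    A generic SIS-type SDE where the transmission rate is perturbed by
    white noise; illustrative only, not the exact model in the paper.
    """
    rng = np.random.default_rng(seed)
    i = i0
    path = np.empty(n_steps + 1)
    path[0] = i
    for k in range(n_steps):
        dw = rng.standard_normal() * np.sqrt(dt)
        i += (beta * i * (1 - i) - mu * i) * dt + sigma * i * (1 - i) * dw
        i = min(max(i, 0.0), 1.0)   # keep the infected fraction in [0, 1]
        path[k + 1] = i
    return path

# Extinction regime: effective reproduction ratio beta/mu < 1.
path = euler_maruyama(i0=0.3, beta=0.2, mu=0.5, sigma=0.1)
```

In this regime the infected fraction decays toward zero, mirroring the extinction scenarios discussed in the abstract; raising beta above mu would instead produce persistence around a stationary distribution.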

  17. Simscape Modeling of a Custom Closed-Volume Tank

    NASA Technical Reports Server (NTRS)

    Fischer, Nathaniel P.

    2015-01-01

    The library for Mathworks Simscape does not currently contain a model for a closed volume fluid tank where the ullage pressure is variable. In order to model a closed-volume variable ullage pressure tank, it was necessary to consider at least two separate cases: a vertical cylinder, and a sphere. Using library components, it was possible to construct a rough model for the cylindrical tank. It was not possible to construct a model for a spherical tank, using library components, due to the variable area. It was decided that, for these cases, it would be preferable to create a custom library component to represent each case, using the Simscape language. Once completed, the components were added to models, where filling and draining the tanks could be simulated. When the models were performing as expected, it was necessary to generate code from the models and run them in Trick (a real-time simulation program). The data output from Trick was then compared to the output from Simscape and found to be within acceptable limits.

  18. 40 CFR 60.1605 - What if I do not meet an increment of progress?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Times for Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule... increment of progress, you must submit a notification to the Administrator postmarked within 10 business...

  19. 40 CFR Table 1 to Subpart Ffff of... - Model Rule-Compliance Schedule

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before December 9... devices so that, when the incineration unit is brought on line, all process changes and air pollution...

  20. 40 CFR Table 1 to Subpart Ffff of... - Model Rule-Compliance Schedule

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before December 9... devices so that, when the incineration unit is brought on line, all process changes and air pollution...

  1. Parametric geometric model and hydrodynamic shape optimization of a flying-wing structure underwater glider

    NASA Astrophysics Data System (ADS)

    Wang, Zhen-yu; Yu, Jian-cheng; Zhang, Ai-qun; Wang, Ya-xing; Zhao, Wen-tao

    2017-12-01

    Combining high-precision numerical analysis methods with optimization algorithms to explore a design space systematically has become an important topic in modern design methods. During the design of an underwater glider's flying-wing structure, a surrogate model is introduced to decrease the computation time of high-precision analyses, effectively resolving the contradiction between precision and efficiency. Based on parametric geometry modeling, mesh generation, and computational fluid dynamics analysis, a surrogate model is constructed using design of experiment (DOE) theory to solve the multi-objective design optimization problem of the underwater glider. The procedure of surrogate model construction is presented, and the Gaussian kernel function is specifically discussed. The Particle Swarm Optimization (PSO) algorithm is applied to the hydrodynamic design optimization. The hydrodynamic performance of the optimized flying-wing underwater glider increases by 9.1%.
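
A minimal sketch of this surrogate workflow: a Gaussian (RBF) kernel interpolant is fitted to DOE samples of an "expensive" objective and then searched with a basic PSO loop. The 1-D objective, kernel width, and PSO constants are illustrative assumptions, not the glider model.

```python
import numpy as np

rng = np.random.default_rng(42)

def expensive_model(x):
    # Stand-in for a CFD evaluation (illustrative drag-like objective).
    return np.sin(3 * x) + 0.5 * x**2

# --- Surrogate: Gaussian (RBF) kernel interpolation on DOE samples ---
X = np.linspace(-2, 2, 12)                 # design-of-experiment points
y = expensive_model(X)
eps = 1.5                                  # kernel width (assumed)
K = np.exp(-eps * (X[:, None] - X[None, :])**2)
w = np.linalg.solve(K + 1e-10 * np.eye(len(X)), y)

def surrogate(x):
    # Cheap approximation of the expensive objective.
    return np.exp(-eps * (x - X)**2) @ w

# --- Particle Swarm Optimization on the cheap surrogate ---
n, iters = 30, 150
pos = rng.uniform(-2, 2, n)
vel = np.zeros(n)
pbest = pos.copy()
pval = np.array([surrogate(p) for p in pos])
g = pbest[pval.argmin()]                   # global best position
for _ in range(iters):
    r1, r2 = rng.random(n), rng.random(n)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = np.clip(pos + vel, -2, 2)
    val = np.array([surrogate(p) for p in pos])
    better = val < pval
    pbest[better], pval[better] = pos[better], val[better]
    g = pbest[pval.argmin()]
```

Every PSO evaluation hits the surrogate rather than the expensive model, which is the efficiency gain the abstract describes; only the DOE samples require full evaluations.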

  2. Toxicity testing of four silver nanoparticle-coated dental castings in 3-D LO2 cell cultures.

    PubMed

    Zhao, Yi-Ying; Chu, Qiang; Shi, Xu-Er; Zheng, Xiao-Dong; Shen, Xiao-Ting; Zhang, Yan-Zhen

    To address the controversial issue of the toxicity of dental alloys and silver nanoparticles in medical applications, an in vivo-like LO2 3-D model was constructed within polyvinylidene fluoride hollow fiber materials to mimic the microenvironment of liver tissue. The use of microscopy methods and the measurement of liver-specific functions optimized the model for best cell performances and also proved the superiority of the 3-D LO2 model when compared with the traditional monolayer model. Toxicity tests were conducted using the newly constructed model, finding that four dental castings coated with silver nanoparticles were toxic to human hepatocytes after cell viability assays. In general, the toxicity of both the castings and the coated silver nanoparticles aggravated as time increased, yet the nanoparticles attenuated the general toxicity by preventing metal ion release, especially at high concentrations.

  3. Wall Shear Stress Distribution in a Patient-Specific Cerebral Aneurysm Model using Reduced Order Modeling

    NASA Astrophysics Data System (ADS)

    Han, Suyue; Chang, Gary Han; Schirmer, Clemens; Modarres-Sadeghi, Yahya

    2016-11-01

    We construct a reduced-order model (ROM) to study Wall Shear Stress (WSS) distributions in image-based, patient-specific aneurysm models. The magnitude of WSS has been shown to be a critical factor in the growth and rupture of human aneurysms. We start the process by running a training case using a Computational Fluid Dynamics (CFD) simulation with time-varying flow parameters, such that these parameters cover the range of parameters of interest. The method of snapshot Proper Orthogonal Decomposition (POD) is utilized to construct the reduced-order bases from the training CFD simulation. The resulting ROM enables us to study the flow patterns and the WSS distributions over a range of system parameters very efficiently with a relatively small number of modes. This enables comprehensive analysis of the model system across a range of physiological conditions without the need to re-run the simulation for small changes in the system parameters.
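
Snapshot POD as used above can be sketched with a plain SVD on synthetic snapshot data; the synthetic "flow", grid, and 99% energy cutoff below are illustrative assumptions, not the patient-specific CFD data.

```python
import numpy as np

# Synthetic snapshot matrix: two coherent travelling structures plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 200)          # spatial points
t = np.linspace(0, 10, 80)                  # snapshot times
snapshots = (np.outer(np.sin(x), np.cos(2 * t))
             + 0.3 * np.outer(np.sin(3 * x), np.sin(5 * t))
             + 0.01 * rng.standard_normal((200, 80)))

# Snapshot POD: SVD of the mean-subtracted snapshot matrix.
mean = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

energy = s**2 / np.sum(s**2)                # modal energy fractions
n_modes = int(np.searchsorted(np.cumsum(energy), 0.99) + 1)
rom_basis = U[:, :n_modes]                  # reduced-order basis

# Project a snapshot onto the basis and reconstruct it.
a = rom_basis.T @ (snapshots[:, 0:1] - mean)
recon = mean + rom_basis @ a
err = (np.linalg.norm(recon - snapshots[:, 0:1])
       / np.linalg.norm(snapshots[:, 0:1]))
```

Here two modes capture over 99% of the energy, so the reconstruction error is essentially the noise floor, which is the "small number of modes" property the abstract relies on.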

  4. Global exponential stability of positive periodic solution of the n-species impulsive Gilpin-Ayala competition model with discrete and distributed time delays.

    PubMed

    Zhao, Kaihong

    2018-12-01

    In this paper, we study the n-species impulsive Gilpin-Ayala competition model with discrete and distributed time delays. The existence of positive periodic solution is proved by employing the fixed point theorem on cones. By constructing appropriate Lyapunov functional, we also obtain the global exponential stability of the positive periodic solution of this system. As an application, an interesting example is provided to illustrate the validity of our main results.

  5. Numerical Study of the Plasticity-Induced Stabilization Effect on Martensitic Transformations in Shape Memory Alloys

    NASA Astrophysics Data System (ADS)

    Junker, Philipp; Hempel, Philipp

    2017-12-01

    It is well known that plastic deformations in shape memory alloys stabilize the martensitic phase. Furthermore, the knowledge concerning the plastic state is crucial for a reliable sustainability analysis of construction parts. Numerical simulations serve as a tool for the realistic investigation of the complex interactions between phase transformations and plastic deformations. To account also for irreversible deformations, we expand an energy-based material model by including a non-linear isotropic hardening plasticity model. An implementation of this material model into commercial finite element programs, e.g., Abaqus, offers the opportunity to analyze entire structural components at low costs and fast computation times. Along with the theoretical derivation and expansion of the model, several simulation results for various boundary value problems are presented and interpreted for improved construction designing.

  6. Evaluation of the hydrological flow paths in a gravel bed filter modeling a horizontal subsurface flow wetland by using a multi-tracer experiment.

    PubMed

    Birkigt, Jan; Stumpp, Christine; Małoszewski, Piotr; Nijenhuis, Ivonne

    2018-04-15

    In recent years, constructed wetland systems have come into focus as a means of cost-efficient organic contaminant management. Wetland systems provide a highly reactive environment in which several removal pathways of organic chemicals may be present at the same time; however, specific elimination processes and hydraulic conditions are usually investigated separately and are thus not fully understood. The flow system in a three-dimensional pilot-scale horizontal subsurface constructed wetland was investigated by applying a multi-tracer test combined with a mathematical model to evaluate the flow and transport processes. The results indicate the existence of a multiple flow system, with two distinct flow paths through the gravel bed and a preferential flow at the bottom transporting 68% of the tracer mass, resulting from the inflow design of the model wetland system. There, removal of the main contaminant, chlorobenzene, was up to 52%, based on different calculation approaches. With determined retention times in the range of 22 d to 32.5 d, the wetland has a heterogeneous flow pattern. Differences between simulated and measured tracer concentrations in the upper sediment indicate diffusion-dominated processes due to stagnant water zones. The tracer study, combining experimental evaluation with mathematical modeling, demonstrated the complexity of flow and transport processes in constructed wetlands, which needs to be taken into account when interpreting the determining attenuation processes. Copyright © 2017 Elsevier B.V. All rights reserved.
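
Retention times like those reported above are commonly estimated as the first temporal moment of a tracer breakthrough curve. A minimal sketch, assuming a synthetic tanks-in-series breakthrough shape rather than the study's measured data:

```python
import math
import numpy as np

t = np.linspace(0, 200, 4001)                # time, days
dt = t[1] - t[0]
tau_true = 27.0                              # assumed mean residence time, d
n = 4                                        # tanks-in-series shape factor
theta = tau_true / n
# Gamma-shaped outlet concentration for a unit pulse of tracer:
c = t**(n - 1) * np.exp(-t / theta) / (theta**n * math.factorial(n - 1))

m0 = np.sum(c) * dt                          # zeroth moment (recovered mass)
m1 = np.sum(t * c) * dt                      # first moment
mrt = m1 / m0                                # mean residence time estimate
```

The moment ratio recovers the assumed 27-day residence time; on measured curves the same calculation is sensitive to tail truncation and incomplete mass recovery.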

  7. Approaches to emergent spacetime in gauge/gravity duality

    NASA Astrophysics Data System (ADS)

    Sully, James Kenneth

    2013-08-01

    In this thesis we explore approaches to emergent local spacetime in gauge/gravity duality. We first conjecture that every CFT with a large-N type limit and a parametrically large gap in the spectrum of single-trace operators has a local bulk dual. We defend this conjecture by counting consistent solutions to the four-point function in simple scalar models and matching to the number of local interaction terms in the bulk. Next, we proceed to explicitly construct local bulk operators using smearing functions. We argue that this construction allows one to probe inside black hole horizons for only short times. We then suggest that the failure to construct bulk operators inside a black hole at late times is indicative of a break-down of local effective field theory at the black hole horizon. We argue that the postulates of black hole complementarity are inconsistent and cannot be realized within gauge/gravity duality. We argue that the most conservative solution is a firewall at the black hole horizon and we critically explore alternative resolutions. We then examine the CGHS model of two-dimensional gravity to look for dynamical formation of firewalls. We find that the CGHS model does not exhibit firewalls, but rather contains long-lived remnants. We argue that, while this is consistent for the CGHS model, it cannot be so in higher-dimensional theories of gravity. Lastly, we turn to F-theory, and detail local and global obstructions to writing elliptic fibrations in Tate form. We determine more general possible forms.

  8. Higher-dimensional generalizations of the Watanabe–Strogatz transform for vector models of synchronization

    NASA Astrophysics Data System (ADS)

    Lohe, M. A.

    2018-06-01

    We generalize the Watanabe-Strogatz (WS) transform, which acts on the Kuramoto model in d = 2 dimensions, to a higher-dimensional vector transform which operates on vector oscillator models of synchronization in any dimension d, for the case of identical frequency matrices. These models have conserved quantities constructed from the cross ratios of inner products of the vector variables, which are invariant under the vector transform, and have trajectories which lie on the unit sphere S^(d-1). Application of the vector transform leads to a partial integration of the equations of motion, leaving independent equations to be solved, for any number of nodes N. We discuss properties of complete synchronization and use the reduced equations to derive a stability condition for completely synchronized trajectories on S^(d-1). We further generalize the vector transform to a mapping which acts in R^d and in particular preserves the unit ball, and leaves invariant the cross ratios constructed from inner products of vectors in the unit ball. This mapping can be used to partially integrate a system of vector oscillators with trajectories in the unit ball, and for d = 2 leads to an extension of the Kuramoto system to a system of oscillators with time-dependent amplitudes and trajectories in the unit disk. We find an inequivalent generalization of the Möbius map which also preserves the unit ball but leaves invariant a different set of cross ratios, this time constructed from the vector norms. This leads to a different extension of the Kuramoto model with trajectories in the complex plane that can be partially integrated by means of fractional linear transformations.
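
For the d = 2 case that the WS transform acts on, a minimal Kuramoto simulation with identical frequencies exhibits the complete synchronization discussed above; the values of N, the coupling K, and the explicit Euler integration are illustrative choices.

```python
import numpy as np

# Kuramoto model with identical frequencies: d(theta_i)/dt =
# omega + (K/N) * sum_j sin(theta_j - theta_i), written via the mean field.
rng = np.random.default_rng(3)
N, K, omega, dt, steps = 50, 1.0, 0.5, 0.01, 5000
theta = rng.uniform(-np.pi, np.pi, N)

def order_parameter(theta):
    # r = |<e^{i theta}>|; r = 1 means complete synchronization.
    return abs(np.exp(1j * theta).mean())

r0 = order_parameter(theta)
for _ in range(steps):
    mean_field = np.exp(1j * theta).mean()
    r, psi = abs(mean_field), np.angle(mean_field)
    theta += (omega + K * r * np.sin(psi - theta)) * dt

r_final = order_parameter(theta)
```

With identical frequencies and positive coupling, the order parameter grows from its small random initial value to essentially 1, the completely synchronized state whose stability the paper analyzes on S^(d-1).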

  9. Tidal-flow, circulation, and flushing changes caused by dredge and fill in Hillsborough Bay, Florida

    USGS Publications Warehouse

    Goodwin, Carl R.

    1991-01-01

    Hillsborough Bay, Florida, underwent extensive physical changes between 1880 and 1972 because of the construction of islands, channels, and shoreline fills. These changes resulted in a progressive reduction in the quantity of tidal water that enters and leaves the bay. Dredging and filling also changed the magnitude and direction of tidal flow in most of the bay. A two-dimensional, finite-difference hydrodynamic model was used to simulate flood, ebb, and residual water transport for physical conditions in Hillsborough Bay and the northeastern part of Middle Tampa Bay during 1880, 1972, and 1985. The calibrated and verified model was used to evaluate cumulative water-transport changes resulting from construction in the study area between 1880 and 1972. The model also was used to evaluate water-transport changes as a result of a major Federal dredging project completed in 1985. The model indicates that transport changes resulting from the Federal dredging project are much less areally extensive than the corresponding transport changes resulting from construction between 1880 and 1972. Dredging-caused changes of more than 50 percent in flood and ebb water transport were computed to occur over only about 8 square miles of the 65-square-mile study area between 1972 and 1985. Model results indicate that construction between 1880 and 1972 caused changes of similar magnitude over about 23 square miles. Dredging-caused changes of more than 50 percent in residual water transport were computed to occur over only 17 square miles between 1972 and 1985. Between 1880 and 1972, changes of similar magnitude were computed to occur over an area of 45 square miles. Model results also reveal historical tide-induced circulation patterns. The patterns consist of a series of about 8 interconnected circulatory features in 1880 and as many as 15 in 1985. 
Dredging- and construction-caused changes in number, size, position, shape, and intensity of the circulatory features increase tide-induced circulation throughout the bay. Circulation patterns for 1880, 1972, and 1985 levels of development differ in many details, but all exhibit residual landward flow of water in the deep, central part of the bay and residual seaward flow in the shallows along the bay margins. This general residual flow pattern is confirmed by both computed transport of a hypothetical constituent and long-term salinity observations in Hillsborough Bay. The concept has been used to estimate the average time it takes a particle to move from the head to the mouth of the bay. The mean transit time was computed to be 58 days in 1880 and 29 days in 1972 and 1985. This increase in circulation and decrease in transit time since 1880 is estimated to have caused an increase in average salinity of Hillsborough Bay of about 2 parts per thousand. Dredge and fill construction is concluded to have significantly increased circulation and flushing between 1880 and 1972. Little circulation or flushing change is attributed to dredging activity since 1972.

  10. Regional variations in seismic boundaries

    NASA Astrophysics Data System (ADS)

    Shumlyanska, Ludmila

    2010-05-01

    Division of the Earth into zones within the framework of a one-dimensional velocity model was proposed by Jeffreys and Gutenberg in the first half of the XX century. They distinguished the following zones: A - the crust; B - zone in the depth interval 33-413 km; C - zone 413-984 km; D - zone 984-2898 km; E - 2898-4982 km; F - 4982-5121 km; G - 5121-6371 km (centre of the Earth). These zones differ in their seismic properties. Later, zone D was divided into the regions D' (984-2700 km) and D" (2700-2900 km). At present, this scheme has been significantly modified, and only the layer D" remains in wide use. The more seismological studies are carried out, the more seismic boundaries appear. Boundaries at 410, 520, 670, and 2900 km, at which the increase in the velocity of seismic waves is particularly noticeable, are considered to be of global significance. Moreover, there are indications of the existence of geophysical boundaries at 800, 1200-1300, 1700, and 1900-2000 km. Using a 3-D P-velocity model of the mantle, based on the Taylor approximation method for solving the multi-dimensional inverse kinematic seismic problem, we have obtained seismic boundaries for the area covering 20-55° E × 40-55° N. Data on the times of first arrivals of P waves from earthquakes and nuclear explosions recorded at ISC stations during 1964-2002 were used as input to construct the 3-D model. The model has two a priori limitations: 1) the velocity is a continuous function of the spatial coordinates; 2) the function v(r)/r, where r is the radius in the spherical coordinate system r, φ, λ, decreases with depth. The first limitation is forced, since velocity jumps cannot be reliably recovered from first-arrival times; the second follows from the nature of the observed data.
Results are presented as horizontal sections of the actual velocity every 25 km in the depth interval 850-2850 km, and as longitudinal and latitudinal sections of the deviation from the 1-D reference model, obtained by solving the inversion problem at 1° resolution in the same depth interval [1, 2]. The general approach to solving the seismic tomography problem by the method of Taylor approximation is as follows: construction of a generalized field of mid-point arrival times of waves at the observation stations; construction of mid-point travel-time curves, i.e., cross-sections of the generalized field of mid-point arrival times; and inversion of each mid-point travel-time curve into a velocity curve. Due to the imposed limitations, there are no abrupt velocity jumps in the model. First derivatives of the velocity were calculated for each curve, and points of local extrema were identified in order to determine the seismic boundaries. Maps of the depths of occurrence of the seismic boundaries at about 410 km, 670 km, 1700 km, and 2800 km were constructed. In general, there are deviations from the generally accepted values beneath regions with different geodynamic regimes. The behaviour of the 410 km and 670 km boundaries correlates with the observed heat-flow anomalies and the gravitational field. [1] V. Geyko, T. Tsvetkova, L. Shymlanskaya, I. Bugaienko, L. Zaets. Regional 3-D velocity model of the mantle of Sarmatia (south-west of the East European Platform). Geophysical Journal, 2005, iss. 6, P. 927-939. (In Russian) [2] V. Geyko, L. Shymlanskaya, T. Tsvetkova, I. Bugaenko, L. Zaets. Three-dimensional model of the upper mantle of Ukraine constructed from the times of P waves arrival. Geophysical Journal, 2006, iss. 1, P. 3-16. (In Russian)
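
The boundary-detection step, locating local extrema of the first derivative of a velocity curve, can be sketched as follows; the smooth synthetic v(z) profile with a gradient zone near 670 km is an illustrative assumption, not the model data.

```python
import numpy as np

z = np.linspace(400, 900, 501)               # depth, km (1 km step)
# Synthetic smooth velocity profile with a steep-gradient zone near 670 km:
v = 9.0 + 0.001 * z + 0.5 * np.tanh((z - 670.0) / 20.0)

dv = np.gradient(v, z)                        # first derivative dv/dz
# Interior local maxima of the gradient mark candidate seismic boundaries.
peaks = [i for i in range(1, len(dv) - 1)
         if dv[i] > dv[i - 1] and dv[i] >= dv[i + 1]]
boundary_depth = (float(z[peaks[int(np.argmax(dv[peaks]))]])
                  if peaks else None)
```

Because the model velocity is continuous by construction, boundaries appear not as jumps but as gradient maxima, which is exactly what this extremum search recovers.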

  11. Gender Differences in College Leisure Time Physical Activity: Application of the Theory of Planned Behavior and Integrated Behavioral Model

    ERIC Educational Resources Information Center

    Beville, Jill M.; Umstattd Meyer, M. Renée; Usdan, Stuart L.; Turner, Lori W.; Jackson, John C.; Lian, Brad E.

    2014-01-01

    Objective: National data consistently report that males participate in leisure time physical activity (LTPA) at higher rates than females. This study expanded previous research to examine gender differences in LTPA of college students using the theory of planned behavior (TPB) by including 2 additional constructs, descriptive norm and…

  12. One Year Program to Train Developers in Public Education Systems. Final Report.

    ERIC Educational Resources Information Center

    New York Univ., NY. Inst. of Afro-American Affairs.

    The purpose of this program to train developers in public education systems was to construct and test a viable model that would fulfill its training goals in one year and which could also be replicated under similar conditions by comparable institutions. The model involved a part-time program which provided theoretical and experiential training…

  13. Skin-Friction Measurements at Subsonic and Transonic Mach Numbers with Embedded-Wire Gages

    DTIC Science & Technology

    1981-01-01

    Boundary-Layer Rake Installation on EBOR Model ... A boundary-layer total pressure rake eliminates this bulky mechanism and the long data acquisition time, but it introduces interferences which affect the ... its construction. Further, boundary-layer rakes are restricted to measurements in thick boundary layers. Surface pressure probes such as Stanton tubes ...

  14. Longitudinal Construct Validity of Brief Symptom Inventory Subscales in Schizophrenia

    ERIC Educational Resources Information Center

    Long, Jeffrey D.; Harring, Jeffrey R.; Brekke, John S.; Test, Mary Ann; Greenberg, Jan

    2007-01-01

    Longitudinal validity of Brief Symptom Inventory subscales was examined in a sample (N = 318) with schizophrenia-related illness measured at baseline and every 6 months for 3 years. Nonlinear factor analysis of items was used to test graded response models (GRMs) for subscales in isolation. The models varied in their within-time and between-times…

  15. Simulating initial attack with two fire containment models

    Treesearch

    Romain M. Mees

    1985-01-01

    Given a variable rate of fireline construction and an elliptical fire growth model, two methods for estimating the required number of resources, time to containment, and the resulting fire area were compared. Five examples illustrate some of the computational differences between the simple and the complex methods. The equations for the two methods can be used and...

  16. One-dimensional velocity model of the Middle Kura Depresion from local earthquakes data of Azerbaijan

    NASA Astrophysics Data System (ADS)

    Yetirmishli, G. C.; Kazimova, S. E.; Kazimov, I. E.

    2011-09-01

    We present a method for determining the velocity model of the Earth's crust and the parameters of earthquakes in the Middle Kura Depression from the data of the telemetry network of Azerbaijan. Application of this method allowed us to recalculate the main parameters of the earthquake hypocenters, to compute corrections to the arrival times of P and S waves at the observation stations, and to significantly improve the accuracy of the earthquake coordinates. The model was constructed using the VELEST program, which calculates one-dimensional minimum velocity models from the travel times of seismic waves.

  17. Urban dust in the Guanzhong basin of China, part II: A case study of urban dust pollution using the WRF-Dust model.

    PubMed

    Li, Nan; Long, Xin; Tie, Xuexi; Cao, Junji; Huang, Rujin; Zhang, Rong; Feng, Tian; Liu, Suixin; Li, Guohui

    2016-01-15

    We developed a regional dust dynamical model (WRF-Dust) to simulate surface dust concentrations in the Guanzhong (GZ) basin of China during two typical dust cases (19th Aug. and 26th Nov., 2013), and compared model results with surface measurements at 17 urban and rural sites. The important improvement of the model is to employ multiple high-resolution (0.5-500 m) remote sensing data to construct dust sources. The new data include the geographic information of construction sites, croplands, and barrens over the GZ basin in the summer and winter of 2013. For the first time, detailed construction dust emissions have been introduced in a regional dust model for large cities of China. Our results show that by including the detailed dust sources, model performance in simulating dust pollution in the GZ basin is significantly improved. For example, the simulated dust concentration averaged over the 17 sites increases from 28 μg m⁻³ to 59 μg m⁻³, close to the measured concentration of 66 μg m⁻³. In addition, the correlation coefficient (r) between the calculated and measured dust concentrations improves from 0.17 to 0.57, suggesting that our model better represents the spatial variation. Further analysis shows that urban construction activities are the crucial source controlling urban dust pollution, which should be considered by policy makers for mitigating particulate air pollution in many Chinese cities. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiangjiang; Li, Weixuan; Lin, Guang

    In decision-making for groundwater management and contamination remediation, it is important to accurately evaluate the probability of the occurrence of a failure event. For small failure probability analysis, a large number of model evaluations are needed in the Monte Carlo (MC) simulation, which is impractical for CPU-demanding models. One approach to alleviate the computational cost caused by the model evaluations is to construct a computationally inexpensive surrogate model instead. However, using a surrogate approximation can cause an extra error in the failure probability analysis. Moreover, constructing accurate surrogates is challenging for high-dimensional models, i.e., models containing many uncertain input parameters. To address these issues, we propose an efficient two-stage MC approach for small failure probability analysis in high-dimensional groundwater contaminant transport modeling. In the first stage, a low-dimensional representation of the original high-dimensional model is sought with Karhunen–Loève expansion and sliced inverse regression jointly, which allows for the easy construction of a surrogate with polynomial chaos expansion. Then a surrogate-based MC simulation is implemented. In the second stage, the small number of samples that are close to the failure boundary are re-evaluated with the original model, which corrects the bias introduced by the surrogate approximation. The proposed approach is tested with a numerical case study and is shown to be 100 times faster than the traditional MC approach in achieving the same level of estimation accuracy.
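The two-stage idea above can be sketched in a few lines: screen many samples with a cheap surrogate, then re-evaluate only the samples near the failure boundary with the "expensive" model. The toy model, surrogate, threshold, and band width below are illustrative assumptions, not the paper's groundwater model or its polynomial chaos surrogate.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):          # stand-in for a CPU-demanding simulator
    return x**2 + 0.05 * np.sin(5 * x)

def surrogate(x):                # crude, cheap approximation of the model
    return x**2

threshold = 4.0                  # "failure" = model output exceeds threshold
x = rng.normal(0.0, 1.0, 100_000)

g = surrogate(x)                              # stage 1: cheap screening
fail = g > threshold
band = np.abs(g - threshold) < 0.5            # stage 2: samples near boundary
fail[band] = expensive_model(x[band]) > threshold   # correct surrogate verdicts

p_fail = fail.mean()             # small failure probability estimate
```

Only the samples in `band` ever touch the expensive model, which is the source of the speed-up the abstract reports.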

  19. Factors Influencing Implementation of OHSAS 18001 in Indian Construction Organizations: Interpretive Structural Modeling Approach

    PubMed Central

    Rajaprasad, Sunku Venkata Siva; Chalapathi, Pasupulati Venkata

    2015-01-01

    Background Construction activity has made considerable breakthroughs in the past two decades on the back of increases in development activities, government policies, and public demand. At the same time, occupational health and safety issues have become a major concern to construction organizations. The unsatisfactory safety performance of the construction industry has always been highlighted, since the safety management system is a neglected area and not implemented systematically in Indian construction organizations. Due to a lack of enforcement of the applicable legislation, most construction organizations are forced to opt for the implementation of Occupational Health Safety Assessment Series (OHSAS) 18001 to improve safety performance. Methods In order to better understand factors influencing the implementation of OHSAS 18001, an interpretive structural modeling approach has been applied and the factors have been classified using matrice d'impacts croisés-multiplication appliquée à un classement (MICMAC) analysis. The study proposes an underlying theoretical framework to identify factors and to help management of Indian construction organizations understand the interaction among factors influencing the implementation of OHSAS 18001. Results Safety culture, continual improvement, morale of employees, and safety training have been identified as dependent variables. Safety performance, sustainable construction, and conducive working environment have been identified as linkage variables. Management commitment and safety policy have been identified as the driver variables. Conclusion Management commitment has the maximum driving power, and the most influential factor is safety policy, which states clearly the commitment of top management towards occupational safety and health. PMID:26929828

  20. On supervised graph Laplacian embedding CA model & kernel construction and its application

    NASA Astrophysics Data System (ADS)

    Zeng, Junwei; Qian, Yongsheng; Wang, Min; Yang, Yongzhong

    2017-01-01

    There are many methods to construct a kernel from given data attribute information. The Gaussian radial basis function (RBF) kernel is one of the most popular ways to construct a kernel. The key observation is that in real-world data, besides the data attribute information, data label information also exists, which indicates the data class. In order to make use of both data attribute information and data label information, in this work we propose a supervised kernel construction method. Supervised information from training data is integrated into the standard kernel construction process to improve the discriminative property of the resulting kernel. A supervised Laplacian embedding cellular automaton model is another key application, developed for two-lane heterogeneous traffic flow with a safe distance and large-scale trucks. Based on the properties of traffic flow in China, we re-calibrate the cell length, velocity, random slowing mechanism and lane-change conditions, and use simulation tests to study the relationships among speed, density and flux. The numerical results show that large-scale trucks have great effects on the traffic flow, which depend on the proportion of large-scale trucks, the random slowing rate and the frequency of lane changes.
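A minimal sketch of the supervised-kernel idea described above: start from a Gaussian RBF kernel on the attributes and up-weight entries whose labels agree. The particular combination rule and the weight `beta` are assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Standard Gaussian RBF kernel on attribute vectors."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def supervised_kernel(X, y, gamma=1.0, beta=0.5):
    """Blend attribute similarity with label agreement (illustrative rule)."""
    K = rbf_kernel(X, gamma)
    agree = (y[:, None] == y[None, :]).astype(float)
    return (1 - beta) * K + beta * agree * K   # emphasize same-class pairs

X = np.array([[0.0, 0.0], [0.1, 0.0], [2.0, 2.0]])
y = np.array([0, 0, 1])
K = supervised_kernel(X, y)
```

The result is still symmetric with unit diagonal, but cross-class similarities are suppressed relative to same-class ones, which is the discriminative property the abstract aims for.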

  1. ISSLS prize winner: integrating theoretical and experimental methods for functional tissue engineering of the annulus fibrosus.

    PubMed

    Nerurkar, Nandan L; Mauck, Robert L; Elliott, Dawn M

    2008-12-01

    Integrating theoretical and experimental approaches for annulus fibrosus (AF) functional tissue engineering. Apply a hyperelastic constitutive model to characterize the evolution of engineered AF via scalar model parameters. Validate the model and predict the response of engineered constructs to physiologic loading scenarios. There is need for a tissue engineered replacement for degenerate AF. When evaluating engineered replacements for load-bearing tissues, it is necessary to evaluate mechanical function with respect to the native tissue, including nonlinearity and anisotropy. Aligned nanofibrous poly-epsilon-caprolactone scaffolds with prescribed fiber angles were seeded with bovine AF cells and analyzed over 8 weeks, using experimental (mechanical testing, biochemistry, histology) and theoretical methods (a hyperelastic fiber-reinforced constitutive model). The linear region modulus for phi = 0 degrees constructs increased by approximately 25 MPa, and for phi = 90 degrees by approximately 2 MPa from 1 day to 8 weeks in culture. Infiltration and proliferation of AF cells into the scaffold and abundant deposition of s-GAG and aligned collagen was observed. The constitutive model had excellent fits to experimental data to yield matrix and fiber parameters that increased with time in culture. Correlations were observed between biochemical measures and model parameters. The model was successfully validated and used to simulate time-varying responses of engineered AF under shear and biaxial loading. AF cells seeded on nanofibrous scaffolds elaborated an organized, anisotropic AF-like extracellular matrix, resulting in improved mechanical properties. A hyperelastic fiber-reinforced constitutive model characterized the functional evolution of engineered AF constructs, and was used to simulate physiologically relevant loading configurations. 
Model predictions demonstrated that fibers resist shear even when the shearing direction does not coincide with the fiber direction. Further, the model suggested that the native AF fiber architecture is uniquely designed to support shear stresses encountered under multiple loading configurations.

  2. Negating Tissue Contracture Improves Volume Maintenance and Longevity of In Vivo Engineered Tissues.

    PubMed

    Lytle, Ian F; Kozlow, Jeffrey H; Zhang, Wen X; Buffington, Deborah A; Humes, H David; Brown, David L

    2015-10-01

    Engineering large, complex tissues in vivo requires robust vascularization to optimize survival, growth, and function. Previously, the authors used a "chamber" model that promotes intense angiogenesis in vivo as a platform for functional three-dimensional muscle and renal engineering. A silicone membrane used to define the structure and to contain the constructs is successful in the short term. However, over time, generated tissues contract and decrease in size in a manner similar to capsular contracture seen around many commonly used surgical implants. The authors hypothesized that modification of the chamber structure or internal surface would promote tissue adherence and maintain construct volume. Three chamber configurations were tested against volume maintenance. Previously studied, smooth silicone surfaces were compared to chambers modified for improved tissue adherence, with multiple transmembrane perforations or lined with a commercially available textured surface. Tissues were allowed to mature long term in a rat model, before analysis. On explantation, average tissue masses were 49, 102, and 122 mg; average volumes were 74, 158 and 176 μl; and average cross-sectional areas were 1.6, 6.7, and 8.7 mm² for the smooth, perforated, and textured groups, respectively. Both perforated and textured designs demonstrated significantly greater measures than the smooth-surfaced constructs in all respects. By modifying the design of chambers supporting vascularized, three-dimensional, in vivo tissue engineering constructs, generated tissue mass, volume, and area can be maintained over a long time course. Successful progress in the scale-up of construct size should follow, leading to improved potential for development of increasingly complex engineered tissues.

  3. Measurement Invariance Conventions and Reporting: The State of the Art and Future Directions for Psychological Research

    PubMed Central

    Putnick, Diane L.; Bornstein, Marc H.

    2016-01-01

    Measurement invariance assesses the psychometric equivalence of a construct across groups or across time. Measurement noninvariance suggests that a construct has a different structure or meaning to different groups or on different measurement occasions in the same group, and so the construct cannot be meaningfully tested or construed across groups or across time. Hence, prior to testing mean differences across groups or measurement occasions (e.g., boys and girls, pretest and posttest), or differential relations of the construct across groups, it is essential to assess the invariance of the construct. Conventions and reporting on measurement invariance are still in flux, and researchers are often left with limited understanding and inconsistent advice. Measurement invariance is tested and established in different steps. This report surveys the state of measurement invariance testing and reporting, and details the results of a literature review of studies that tested invariance. Most tests of measurement invariance include configural, metric, and scalar steps; a residual invariance step is reported for fewer tests. Alternative fit indices (AFIs) are reported as model fit criteria for the vast majority of tests; χ2 is reported as the single index in a minority of invariance tests. Reporting AFIs is associated with higher levels of achieved invariance. Partial invariance is reported for about one-third of tests. In general, sample size, number of groups compared, and model size are unrelated to the level of invariance achieved. Implications for the future of measurement invariance testing, reporting, and best practices are discussed. PMID:27942093
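The stepwise testing the survey describes (configural → metric → scalar → residual) is conventionally decided with alternative fit indices; a widely cited rule retains a step if CFI drops by no more than 0.01 from the previous step (the Cheung & Rensvold cutoff). A sketch of that decision logic follows; the CFI values are hypothetical.

```python
def invariance_level(cfis, steps, delta=0.01):
    """Return the last invariance step whose CFI decrease stays within delta.

    cfis:  fit indices for the nested models, in testing order
    steps: corresponding step names, starting with the configural baseline
    """
    achieved = steps[0]                      # configural model as baseline
    for prev, curr, step in zip(cfis, cfis[1:], steps[1:]):
        if prev - curr > delta:              # fit worsened too much: stop
            break
        achieved = step
    return achieved

steps = ["configural", "metric", "scalar", "residual"]
cfis = [0.962, 0.958, 0.955, 0.931]          # hypothetical model fits
level = invariance_level(cfis, steps)        # partial invariance outcome
```

With these hypothetical values the scalar step holds but the residual step fails, matching the survey's observation that residual invariance is reported (and achieved) less often.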

  4. Can a Mediator Moderate? Considering the Role of Time and Change in the Mediator-Moderator Distinction.

    PubMed

    Karazsia, Bryan T; Berlin, Kristoffer S

    2018-01-01

    The concepts of mediation and moderation are important for specifying ways in which psychological treatments work and for whom they are most beneficial. Historically, the terms were confused and used interchangeably, so a rich body of scholarly literature makes clear their distinction. Researchers are also becoming increasingly aware that mediation and moderation can be integrated and that such integration can advance theory development and testing. One question that has not received sufficient attention is whether a mediator can simultaneously moderate. We tackle this question in this paper, and in doing so we expand on the MacArthur conceptualizations of mediation and moderation. The result is a presentation of a meta-theoretical model that illustrates how a construct that is initially a mediator can, not simultaneously but over time, evolve into a construct that moderates. When this occurs, a construct that changed for the better as a result of an intervention can later promote more positive change during a later intervention. Various implications of this novel paradigm for future research are discussed, including the importance of this model in the emerging context of managed health care. Copyright © 2017. Published by Elsevier Ltd.

  5. Data-Driven Modeling of Complex Systems by means of a Dynamical ANN

    NASA Astrophysics Data System (ADS)

    Seleznev, A.; Mukhin, D.; Gavrilov, A.; Loskutov, E.; Feigin, A.

    2017-12-01

    The data-driven methods for modeling and prognosis of complex dynamical systems become more and more popular in various fields due to the growth of high-resolution data. We distinguish two basic steps in such an approach: (i) determining the phase subspace of the system, or embedding, from available time series, and (ii) constructing an evolution operator acting in this reduced subspace. In this work we suggest a novel approach combining these two steps by means of the construction of an artificial neural network (ANN) with a special topology. The proposed ANN-based model, on the one hand, projects the data onto a low-dimensional manifold and, on the other hand, models a dynamical system on this manifold. In effect, this is a recurrent multilayer ANN which has internal dynamics and is capable of generating time series. A very important point of the proposed methodology is the optimization of the model, allowing us to avoid overfitting: we use a Bayesian criterion to optimize the ANN structure and estimate both the degree of evolution operator nonlinearity and the complexity of the nonlinear manifold onto which the data are projected. The proposed modeling technique will be applied to the analysis of high-dimensional dynamical systems: the Lorenz'96 model of atmospheric turbulence, producing high-dimensional space-time chaos, and a quasi-geostrophic three-layer model of the Earth's atmosphere with natural orography, describing the dynamics of synoptic vortexes as well as mesoscale blocking systems. The possibility of applying the proposed methodology to analyze real measured data is also discussed. The study was supported by the Russian Science Foundation (grant #16-12-10198).

  6. A Tree Based Broadcast Scheme for (m, k)-firm Real-Time Stream in Wireless Sensor Networks.

    PubMed

    Park, HoSung; Kim, Beom-Su; Kim, Kyong Hoon; Shah, Babar; Kim, Ki-Il

    2017-11-09

    Recently, various unicast routing protocols have been proposed to deliver measured data from the sensor node to the sink node within a predetermined deadline in wireless sensor networks. In parallel with these approaches, some applications demand a specific service based on broadcast to all nodes within the deadline, a feasible real-time traffic model, and improvements in energy efficiency. However, current protocols based on either flooding or one-to-one unicast cannot meet the above requirements entirely. Moreover, as far as the authors know, there is no study yet of a real-time broadcast protocol supporting an application-specific traffic model in WSNs. Based on the above analysis, in this paper we propose a new (m, k)-firm-based Real-time Broadcast Protocol (FRBP) that constructs a broadcast tree to satisfy the (m, k)-firm model, which is applicable to real-time traffic in resource-constrained WSNs. The broadcast tree in FRBP is constructed by a distance-based priority scheme, whereas energy efficiency is improved by selecting as few nodes on the tree as possible. To overcome unstable network environments, the recovery scheme invokes rapid partial tree reconstruction in order to designate another node as the parent on the tree, according to the measured (m, k)-firm real-time condition and local state monitoring. Finally, simulation results are given to demonstrate the superiority of FRBP compared to existing schemes in terms of average deadline missing ratio, average throughput and energy consumption.
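A distance-based priority tree of the kind FRBP is described as building can be sketched as a shortest-path tree rooted at the sink: each node attaches to the neighbor offering the smallest total distance back to the sink (Dijkstra-style). The topology and distances below are hypothetical, and this omits FRBP's (m, k)-firm accounting and recovery scheme.

```python
import heapq

def build_tree(neighbors, dist, sink):
    """Shortest-path broadcast tree rooted at the sink.

    neighbors: adjacency dict {node: [neighbor, ...]}
    dist:      {frozenset((u, v)): edge length}
    Returns a parent map defining the tree.
    """
    parent = {sink: None}
    cost = {sink: 0.0}
    heap = [(0.0, sink)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > cost.get(u, float("inf")):
            continue                          # stale heap entry
        for v in neighbors[u]:
            nd = d + dist[frozenset((u, v))]
            if nd < cost.get(v, float("inf")):
                cost[v], parent[v] = nd, u    # closer route to the sink
                heapq.heappush(heap, (nd, v))
    return parent

neighbors = {"S": ["A", "B"], "A": ["S", "B", "C"],
             "B": ["S", "A", "C"], "C": ["A", "B"]}
dist = {frozenset(("S", "A")): 1.0, frozenset(("S", "B")): 2.0,
        frozenset(("A", "B")): 0.5, frozenset(("A", "C")): 2.5,
        frozenset(("B", "C")): 1.0}
parent = build_tree(neighbors, dist, "S")
```

In this toy network node B relays through A rather than reaching the sink directly, illustrating how the priority scheme can keep the relay set small.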

  7. Reading Time as Evidence for Mental Models in Understanding Physics

    NASA Astrophysics Data System (ADS)

    Brookes, David T.; Mestre, José; Stine-Morrow, Elizabeth A. L.

    2007-11-01

    We present results of a reading study that show the usefulness of probing physics students' cognitive processing by measuring reading time. According to contemporary discourse theory, when people read a text, a network of associated inferences is activated to create a mental model. If the reader encounters an idea in the text that conflicts with existing knowledge, the construction of a coherent mental model is disrupted and reading times are prolonged, as measured using a simple self-paced reading paradigm. We used this effect to study how "non-Newtonian" and "Newtonian" students create mental models of conceptual systems in physics as they read texts related to the ideas of Newton's third law, energy, and momentum. We found significant effects of prior knowledge state on patterns of reading time, suggesting that students attempt to actively integrate physics texts with their existing knowledge.

  8. PROGRESS REPORT: COFIRING PROJECTS FOR WILLOW ISLAND AND ALBRIGHT GENERATING STATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    K. Payette; D. Tillman

    During the period April 1, 2001--June 30, 2001, Allegheny Energy Supply Co., LLC (Allegheny) accelerated construction of the Willow Island cofiring project and completed the installation of foundations for the fuel storage facility, the fuel receiving facility, and the processing building. Allegheny received all processing equipment to be installed at Willow Island. Allegheny completed the combustion modeling for the Willow Island project. During this time period construction of the Albright Generating Station cofiring facility was completed, with few items left for final action. The facility was dedicated at a ceremony on June 29. Initial testing of cofiring at the facility commenced. This report summarizes the activities associated with the Designer Opportunity Fuel program, and demonstrations at Willow Island and Albright Generating Stations. It details the construction activities at both sites along with the combustion modeling at the Willow Island site.

  9. Constructing a Time-Invariant Measure of the Socio-economic Status of U.S. Census Tracts.

    PubMed

    Miles, Jeremy N; Weden, Margaret M; Lavery, Diana; Escarce, José J; Cagney, Kathleen A; Shih, Regina A

    2016-02-01

    Contextual research on time and place requires a consistent measurement instrument for neighborhood conditions in order to make unbiased inferences about neighborhood change. We develop such a time-invariant measure of neighborhood socio-economic status (NSES) using exploratory and confirmatory factor analyses fit to census data at the tract level from the 1990 and 2000 U.S. Censuses and the 2008-2012 American Community Survey. A single factor model fit the data well at all three time periods, and factor loadings--but not indicator intercepts--could be constrained to equality over time without decrement to fit. After addressing remaining longitudinal measurement bias, we found that NSES increased from 1990 to 2000, and then--consistent with the timing of the "Great Recession"--declined in 2008-2012 to a level approaching that of 1990. Our approach for evaluating and adjusting for time-invariance is not only instructive for studies of NSES but also more generally for longitudinal studies in which the variable of interest is a latent construct.

  10. A temporal-spatial postprocessing model for probabilistic run-off forecast. With a case study from Ulla-Førre with five catchments and ten lead times

    NASA Astrophysics Data System (ADS)

    Engeland, K.; Steinsland, I.

    2012-04-01

    This work is driven by the needs of next-generation short-term optimization methodology for hydro power production. Stochastic optimization is about to be introduced, i.e., optimizing when available resources (water) and utility (prices) are uncertain. In this paper we focus on the available resources, i.e., water, where uncertainty mainly comes from uncertainty in future runoff. When optimizing a water system, all catchments and several lead times have to be considered simultaneously. Depending on the system of hydropower reservoirs, it might be a set of headwater catchments, a system of upstream/downstream reservoirs where water used from one catchment/dam arrives in a lower catchment maybe days later, or a combination of both. The aim of this paper is therefore to construct a simultaneous probabilistic forecast for several catchments and lead times, i.e., to provide a predictive distribution for the forecasts. Stochastic optimization methods need samples/ensembles of run-off forecasts as input; hence, it should also be possible to sample from our probabilistic forecast. A post-processing approach is taken, and an error model based on a Box-Cox power transformation and a temporal-spatial copula model is used. It accounts for both between-catchment and between-lead-time dependencies. In operational use it is straightforward to sample run-off ensembles from this model, which inherits the catchment and lead time dependencies. The methodology is tested and demonstrated in the Ulla-Førre river system, and simultaneous probabilistic forecasts for five catchments and ten lead times are constructed. The methodology has enough flexibility to model operationally important features in this case study such as heteroscedasticity, lead-time-varying temporal dependency and lead-time-varying inter-catchment dependency. Our model is evaluated using CRPS for the marginal predictive distributions and the energy score for the joint predictive distribution.
It is tested against a deterministic run-off forecast, a climatology forecast and a persistence forecast, and is found to be the better probabilistic forecast for lead times greater than two. From an operational point of view the results are interesting, as the between-catchment dependency gets stronger with longer lead times.
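The Box-Cox step in such an error model maps positive runoff values to a space where errors are closer to Gaussian, and samples are mapped back afterwards. A minimal sketch of the forward and inverse transforms is below; the lambda value and discharge numbers are illustrative, not fitted to the Ulla-Førre data.

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox power transform of positive data."""
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

def inv_boxcox(z, lam):
    """Inverse transform, mapping Gaussian-space samples back to runoff."""
    return np.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)

runoff = np.array([0.5, 1.0, 4.0, 9.0])      # hypothetical discharge values
z = boxcox(runoff, 0.3)                      # transformed (error-model) space
back = inv_boxcox(z, 0.3)                    # round-trips to the original data
```

In the post-processing scheme, the temporal-spatial copula is fitted to errors in the transformed space, and ensemble members drawn there are back-transformed with `inv_boxcox`.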

  11. Cascading Oscillators in Decoding Speech: Reflection of a Cortical Computation Principle

    DTIC Science & Technology

    2016-09-06

    Combining an experimental paradigm based on Ghitza and Greenberg (2009) for speech with the approach of Farbood et al. (2013) to timing in key...Fuglsang, 2015). A model was developed which uses modulation spectrograms to construct an oscillating time-series synchronized with the slowly varying...

  12. [Spatiotemporal differentiation of construction land expansion in a typical town of south Jiangsu Province].

    PubMed

    Zhou, Rui; Li, Yue-hui; Hu, Yuan-man; Su, Hai-long; Wang, Jin-nian

    2011-03-01

    Choosing Xinzhuang Town in south Jiangsu Province as the study area, and by using 1980, 1991, 2001, and 2009 high-resolution remote sensing images and GIS spatial analysis technology, an integrated expansion degree index model was established based on existing indicators of construction land expansion, and the general and spatiotemporal differentiation characteristics of construction land expansion in the town over three time periods within 1980-2009 were quantitatively analyzed. In 1980-2009, with the acceleration of rural urbanization and industrialization, the area of construction land in the town increased significantly, by 19.24 km²; in 2001-2009 in particular, the expanded area, expansion contribution rate, and expansion intensity reached their maxima. The construction land expansion had an obvious spatial differentiation characteristic. In 1980-1991, the newly added construction land was mainly concentrated in the town center. After 1991, the focus of construction land gradually spread to villages with developed industries. Most of the added construction land was converted from paddy fields and dry land, accounting for 88.1% of the total added area, while the contribution from other land types was relatively small.

  13. [Characteristics of fugitive dust emission from paved road near construction activities].

    PubMed

    Tian, Gang; Fan, Shou-Bin; Li, Gang; Qin, Jian-Ping

    2007-11-01

    Because of the mud/dirt carryout from construction activities, the silt loading of nearby paved roads is higher and the fugitive dust emission is stronger. By sampling and laboratory analysis of road surface dust samples, we obtained the silt loading (mass of material equal to or less than 75 micrometers in physical diameter per unit area of travel surface) of paved roads near construction activities. The results show that the silt loading of roads near construction activities is higher than that of "normal" roads, and that silt loading is negatively correlated with distance from the construction site entrance. According to the AP-42 emission factor model for fugitive dust from roads, the emission factor of an influenced road is 2-10 times that of a "normal" road, and the fugitive dust emission influenced by one construction activity is "equivalent" to an additional road length of approximately 422-3 800 m at the baseline silt loading. Based on the spatial and temporal distribution of construction activities, in 2002 the PM10 emission influenced by construction activities in Beijing city areas accounted for 59% of fugitive dust from roads.
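The AP-42 paved-road calculation referenced above can be sketched with one published form of the emission-factor equation, E = k · sL^0.91 · W^1.02, where E is in grams per vehicle-kilometer traveled, sL is silt loading in g/m², and W is mean vehicle weight in tons. The constant k and the inputs below are assumptions for illustration; consult the current AP-42 Section 13.2.1 before any real use.

```python
# Sketch of an AP-42-style paved-road emission factor (one published form).
# k, sL, and W values are illustrative assumptions, not measured data.

def paved_road_emission_factor(sL, W, k=0.62):
    """E = k * sL^0.91 * W^1.02, g per vehicle-kilometer traveled (assumed form)."""
    return k * sL ** 0.91 * W ** 1.02

normal = paved_road_emission_factor(sL=0.4, W=2.4)     # baseline silt loading
near_site = paved_road_emission_factor(sL=2.0, W=2.4)  # elevated silt loading
ratio = near_site / normal
```

Because the equation is a power law in sL, a five-fold increase in silt loading raises the emission factor by roughly 5^0.91 ≈ 4.3 times, consistent with the 2-10× range reported in the abstract.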

  14. 40 CFR 60.1880 - When must I submit the annual report?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of each year that follows the calendar year in which you collected the data. If you have an operating... Times for Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule...

  15. Construction of a Fluid-Solid Coupling Model with Improved Richards-BP and Its Engineering Application

    NASA Astrophysics Data System (ADS)

    Xie, Chengyu; Jia, Nan; Shi, Dongping; Lu, Hao

    2017-10-01

    In order to study the slurry diffusion law during grouting, the Richards unsaturated-saturated model was introduced and the grouting model was clearly defined. The Richards model control equation was established, the BP neural network was introduced, and an improved fluid-solid coupling model was constructed. Using the saturated-unsaturated seepage flow model, together with an iterative solution of the overflow boundary under mixed boundary conditions, the free surface was calculated. Taking an engineering project as an example, the diffusion law of the slurry was simulated numerically with the aid of multi-field coupling analysis software. The results show that the slurry diffusion law is affected by the grouting material, initial pressure and other factors. When injection starts, the slurry flows in the cracks along the upper side of the grouting hole; when the pressure gradient is reduced to the critical pressure, it flows to the lower side; and when the diffusion stabilizes, its shape ultimately resembles a figure eight. Overall, the slurry spreads evenly outward from the grouting mouth, gradually reaching saturation from an unsaturated state rather than being a purely saturated flow; once the slurry has spread and reached a saturated state, the diffusion time equals the engineering grouting time.

  16. Use of formwork systems in high-rise construction

    NASA Astrophysics Data System (ADS)

    Kurakova, Oksana

    2018-03-01

    Erection of high-quality buildings and structures within a reasonable time frame is a crucial factor for the competitiveness of any construction organization. The main material used in high-rise construction is in-situ reinforced concrete, and the technology of its use is directly related to formwork systems. Formwork systems and formwork technologies largely determine the speed of construction and the labor intensity of concreting operations. Therefore, construction time and the labor intensity of works can also be reduced by improving the technology of formwork system use. Currently there are unresolved issues in the implementation of monolithic technology projects, chief among them the selection of a formwork technology, the high labor intensity of works, and the poor quality of materials and structures. The article presents organizational and technological measures whose introduction can shorten the duration of construction. A comparison of operations performed during formwork installation according to the conventional technology and taking into account the implemented organizational and technological measures is presented. The results of a comparative analysis of economic efficiency assessments are also presented for a specific construction project before and after the implementation of the above-mentioned measures. The study showed that introduction of the proposed organizational and technological model, taking into account optimization of reinforcing and concreting works, significantly improves the efficiency of a high-rise construction project, and that further improvement of technologies for the use of in-situ reinforced concrete is a promising direction in the construction of high-rise buildings.

  17. Quality prediction modeling for sintered ores based on mechanism models of sintering and extreme learning machine based error compensation

    NASA Astrophysics Data System (ADS)

    Tiebin, Wu; Yunlian, Liu; Xinjun, Li; Yi, Yu; Bin, Zhang

    2018-06-01

    To address the difficulty of quality prediction for sintered ores, a hybrid prediction model is established based on mechanism models of sintering and time-weighted error compensation built on the extreme learning machine (ELM). First, mechanism models of drum index, total iron, and alkalinity are constructed according to the chemical reaction mechanism and conservation of matter in the sintering process. Because the process is simplified in the mechanism models, these models cannot describe its strong nonlinearity, so errors are inevitable. For this reason, a time-weighted ELM-based error compensation model is established. Simulation results verify that the hybrid model has high accuracy and can meet the requirements of industrial applications.
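The error-compensation idea can be sketched numerically: a simplified "mechanism" model leaves a systematic residual, and an ELM (random hidden layer, least-squares output weights) learns that residual. The toy process and all parameter values below are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "process": the true quality index is nonlinear in two inputs.
X = rng.uniform(-1, 1, size=(200, 2))
y_true = 2.0 * X[:, 0] + np.sin(3 * X[:, 1])

# Simplified mechanism model: captures only the linear part, so it
# leaves a systematic nonlinear error.
y_mech = 2.0 * X[:, 0]
residual = y_true - y_mech

# Extreme learning machine: random hidden layer, output weights fitted
# by least squares (no iterative training).
n_hidden = 50
W = rng.normal(size=(2, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                        # hidden activations
beta, *_ = np.linalg.lstsq(H, residual, rcond=None)

# Hybrid prediction = mechanism model + ELM error compensation.
y_hybrid = y_mech + H @ beta

rmse_mech = np.sqrt(np.mean((y_true - y_mech) ** 2))
rmse_hybrid = np.sqrt(np.mean((y_true - y_hybrid) ** 2))
```

The compensation step only ever adds a correction on top of the mechanism prediction, which is what keeps the hybrid interpretable.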

  18. Stochastic Car-Following Model for Explaining Nonlinear Traffic Phenomena

    NASA Astrophysics Data System (ADS)

    Meng, Jianping; Song, Tao; Dong, Liyun; Dai, Shiqiang

    Many car-following models share a time parameter representing the sensitivity or the lag (response) time of drivers. From the viewpoint of traffic psychology, this parameter can be interpreted as the perception-response time (PRT). Previous models generally set this parameter to a constant. However, PRT is actually not a constant but a random variable described by the lognormal distribution. Probability can therefore be introduced naturally into car-following models by restoring the stochastic nature of PRT. To demonstrate this idea, a specific stochastic model is constructed based on the optimal velocity model. Simulations under periodic boundary conditions show that important traffic phenomena, such as hysteresis and phantom traffic jams, can be reproduced more realistically. In particular, an interesting experimental feature of traffic jams, namely two moving jams propagating in parallel with constant speed stably and sustainably, is successfully captured by the present model.
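A minimal sketch of the idea, assuming a Bando-type optimal-velocity function, explicit Euler integration, and a per-driver sensitivity equal to the inverse of a lognormally distributed PRT; all parameter values are illustrative, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Optimal-velocity function (Bando et al. form).
def V(h):
    return np.tanh(h - 2.0) + np.tanh(2.0)

N, L = 30, 60.0            # cars on a ring road, mean headway 2.0
dt, steps = 0.05, 4000

# Perception-response time: lognormal across drivers, clipped so the
# explicit Euler update stays stable (a * dt < 1).
tau = np.clip(rng.lognormal(mean=np.log(0.4), sigma=0.2, size=N), 0.2, 1.0)
a = 1.0 / tau              # sensitivity = inverse response time

x = np.linspace(0.0, L, N, endpoint=False) + rng.normal(0, 0.05, N)
v = np.full(N, V(2.0))     # start at the uniform-flow velocity

for _ in range(steps):
    headway = (np.roll(x, -1) - x) % L      # distance to the car ahead
    v = v + a * (V(headway) - v) * dt       # relax toward optimal velocity
    x = (x + v * dt) % L
```

Because the update is a convex combination of `v` and `V(headway)`, velocities stay in the physically sensible band even when jams form.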

  19. High-Dimensional Bayesian Geostatistics

    PubMed Central

    Banerjee, Sudipto

    2017-01-01

    With the growing capabilities of Geographic Information Systems (GIS) and user-friendly software, statisticians today routinely encounter geographically referenced data containing observations from a large number of spatial locations and time points. Over the last decade, hierarchical spatiotemporal process models have become widely deployed statistical tools for researchers to better understand the complex nature of spatial and temporal variability. However, fitting hierarchical spatiotemporal models often involves expensive matrix computations with complexity increasing in cubic order for the number of spatial locations and temporal points. This renders such models infeasible for large data sets. This article offers a focused review of two methods for constructing well-defined highly scalable spatiotemporal stochastic processes. Both these processes can be used as “priors” for spatiotemporal random fields. The first approach constructs a low-rank process operating on a lower-dimensional subspace. The second approach constructs a Nearest-Neighbor Gaussian Process (NNGP) that ensures sparse precision matrices for its finite realizations. Both processes can be exploited as a scalable prior embedded within a rich hierarchical modeling framework to deliver full Bayesian inference. These approaches can be described as model-based solutions for big spatiotemporal datasets. The models ensure that the algorithmic complexity is ~n floating point operations (flops) per iteration, where n is the number of spatial locations. We compare these methods and provide some insight into their methodological underpinnings. PMID:29391920

  20. High-Dimensional Bayesian Geostatistics.

    PubMed

    Banerjee, Sudipto

    2017-06-01

    With the growing capabilities of Geographic Information Systems (GIS) and user-friendly software, statisticians today routinely encounter geographically referenced data containing observations from a large number of spatial locations and time points. Over the last decade, hierarchical spatiotemporal process models have become widely deployed statistical tools for researchers to better understand the complex nature of spatial and temporal variability. However, fitting hierarchical spatiotemporal models often involves expensive matrix computations with complexity increasing in cubic order for the number of spatial locations and temporal points. This renders such models infeasible for large data sets. This article offers a focused review of two methods for constructing well-defined highly scalable spatiotemporal stochastic processes. Both these processes can be used as "priors" for spatiotemporal random fields. The first approach constructs a low-rank process operating on a lower-dimensional subspace. The second approach constructs a Nearest-Neighbor Gaussian Process (NNGP) that ensures sparse precision matrices for its finite realizations. Both processes can be exploited as a scalable prior embedded within a rich hierarchical modeling framework to deliver full Bayesian inference. These approaches can be described as model-based solutions for big spatiotemporal datasets. The models ensure that the algorithmic complexity is ~n floating point operations (flops) per iteration, where n is the number of spatial locations. We compare these methods and provide some insight into their methodological underpinnings.

  1. The Classification of Universes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bjorken, J

    2004-04-09

    We define a universe as the contents of a spacetime box with comoving walls, large enough to contain essentially all phenomena that can be conceivably measured. The initial time is taken as the epoch when the lowest CMB modes undergo horizon crossing, and the final time is taken when the wavelengths of CMB photons are comparable with the Hubble scale, i.e. with the nominal size of the universe. This allows the definition of a local ensemble of similarly constructed universes, using only modest extrapolations of the observed behavior of the cosmos. We then assume that further out in spacetime, similar universes can be constructed but containing different standard model parameters. Within this multiverse ensemble, it is assumed that the standard model parameters are strongly correlated with size, i.e. with the value of the inverse Hubble parameter at the final time, in a manner as previously suggested. This allows an estimate of the range of sizes which allow life as we know it, and invites a speculation regarding the most natural distribution of sizes. If small sizes are favored, this in turn allows some understanding of the hierarchy problems of particle physics. Subsequent sections of the paper explore other possible implications. In all cases, the approach is as bottom-up and as phenomenological as possible, and suggests that theories of the multiverse so constructed may in fact lay some claim to being scientific.

  2. A cellular automata model of Ebola virus dynamics

    NASA Astrophysics Data System (ADS)

    Burkhead, Emily; Hawkins, Jane

    2015-11-01

    We construct a stochastic cellular automaton (SCA) model for the spread of the Ebola virus (EBOV). We make substantial modifications to an existing SCA model used for HIV, introduced by others and studied by the authors. We give a rigorous analysis of the similarities between models due to the spread of virus and the typical immune response to it, and the differences which reflect the drastically different timing of the course of EBOV. We demonstrate output from the model and compare it with clinical data.
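The flavor of such a stochastic cellular automaton can be sketched on a toy grid: each cell is healthy, infected, or dead, and transitions fire with fixed probabilities per step. The states and probabilities below are illustrative placeholders, not the paper's EBOV parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

HEALTHY, INFECTED, DEAD = 0, 1, 2
P_INFECT = 0.25   # per-infected-neighbor infection probability
P_DEATH = 0.1     # per-step death probability of an infected cell

n = 50
grid = np.full((n, n), HEALTHY)
grid[n // 2, n // 2] = INFECTED   # single initial infected cell

def step(grid):
    # count infected von Neumann neighbours (toroidal boundary)
    inf = (grid == INFECTED).astype(int)
    neigh = (np.roll(inf, 1, 0) + np.roll(inf, -1, 0)
             + np.roll(inf, 1, 1) + np.roll(inf, -1, 1))
    new = grid.copy()
    # a healthy cell escapes each infected neighbour independently,
    # so it becomes infected with probability 1 - (1 - p)^k
    p = 1.0 - (1.0 - P_INFECT) ** neigh
    infect = (grid == HEALTHY) & (rng.random(grid.shape) < p)
    new[infect] = INFECTED
    # infected cells die with a fixed per-step probability
    die = (grid == INFECTED) & (rng.random(grid.shape) < P_DEATH)
    new[die] = DEAD
    return new

for _ in range(40):
    grid = step(grid)

n_infected = int((grid == INFECTED).sum())
n_dead = int((grid == DEAD).sum())
```

Changing the transition probabilities and adding recovered/immune states is what distinguishes an EBOV-like timing from the HIV model the authors started from.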

  3. Modeling asset price processes based on mean-field framework

    NASA Astrophysics Data System (ADS)

    Ieda, Masashi; Shiino, Masatoshi

    2011-12-01

    We propose a model of the dynamics of financial assets based on the mean-field framework. This framework allows us to construct a model which includes the interaction among the financial assets, reflecting the market structure. Our study takes a microscopic approach to modeling the financial market. To demonstrate the effectiveness of our model concretely, we provide a case study: the pricing problem of the European call option with short-time memory noise.

  4. Processing on weak electric signals by the autoregressive model

    NASA Astrophysics Data System (ADS)

    Ding, Jinli; Zhao, Jiayin; Wang, Lanzhou; Li, Qiao

    2008-10-01

    An autoregressive (AR) model of weak electric signals in two plants was set up for the first time. The AR model forecasts 10 future values of the weak electric signals well. This work will help construct a standard set of AR model coefficients relating the plant electric signal to environmental factors, which can serve as presets for an intelligent auto-control system based on the adaptive characteristics of plants, to achieve energy savings in agricultural production.
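A minimal AR sketch, with a synthetic signal standing in for the plant data (the AR order and coefficients are illustrative): fit the coefficients by least squares, then roll the model forward 10 steps.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "weak electric signal": a noisy AR(2) process.
n = 500
x = np.zeros(n)
for t in range(2, n):
    x[t] = 1.5 * x[t - 1] - 0.7 * x[t - 2] + rng.normal(scale=0.1)

def fit_ar(x, p):
    """Least-squares estimate of AR(p) coefficients."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast(x, coef, steps):
    """Iterated multi-step forecast from the end of the series."""
    hist = list(x[-len(coef):])
    out = []
    for _ in range(steps):
        nxt = sum(c * h for c, h in zip(coef, reversed(hist)))
        out.append(nxt)
        hist = hist[1:] + [nxt]
    return np.array(out)

coef = fit_ar(x, p=2)
pred = forecast(x, coef, steps=10)
```

With 500 samples and mild noise the estimated coefficients land close to the generating values (1.5, -0.7).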

  5. Adapting Gel Wax into an Ultrasound-Guided Pericardiocentesis Model at Low Cost

    PubMed Central

    Daly, Robert; Planas, Jason H.; Edens, Mary Ann

    2017-01-01

    Cardiac tamponade is a life-threatening emergency for which pericardiocentesis may be required. Real-time bedside ultrasound has obviated the need for routine blind procedures in cardiac arrest, and the number of pericardiocenteses being performed has declined. Despite this fact, pericardiocentesis remains an essential skill in emergency medicine. While commercially available training models exist, cost, durability, and lack of anatomical landmarks limit their usefulness. We sought to create a pericardiocentesis model that is realistic, simple to build, reusable, and cost efficient. We constructed the model using a red dye-filled ping pong ball (simulating the right ventricle) and a 250cc normal saline bag (simulating the effusion) encased in an artificial rib cage and held in place by gel wax. The inner saline bag was connected to a 1L saline bag outside of the main assembly to act as a fluid reservoir for repeat uses. The entire construction process takes approximately 16–20 hours, most of which is attributed to cooling of the gel wax. Actual construction time is approximately four hours at a cost of less than $200. The model was introduced to emergency medicine residents and medical students during a procedure simulation lab and compared to a model previously described by dell’Orto.1 The learners performed ultrasound-guided pericardiocentesis using both models. Learners who completed a survey comparing realism of the two models felt our model was more realistic than the previously described model. On a scale of 1–9, with 9 being very realistic, the previous model was rated a 4.5. Our model was rated a 7.8. There was also a marked improvement in the perceived recognition of the pericardium, the heart, and the pericardial sac. Additionally, 100% of the students were successful at performing the procedure using our model. In simulation, our model provided both palpable and ultrasound landmarks and held up to several months of repeated use. 
It was less expensive than commercial models ($200 vs up to $16,500) while being more realistic in simulation than other described “do-it-yourself models.” This model can be easily replicated to teach the necessary skill of pericardiocentesis. PMID:28116020

  6. Slightly uneven electric field trigatron employed in tens of microseconds charging time.

    PubMed

    Lin, Jiajin; Yang, Jianhua; Zhang, Jiande; Zhang, Huibo; Yang, Xiao

    2014-09-01

    To solve the issue of operational instability for the trigatron switch in applications with tens of microseconds or even less charging time, a novel trigatron spark gap with a slightly uneven electric field was presented. Compared with the conventional trigatron, the novel trigatron was constructed with an obvious field enhancement on the edge of the opposite electrode. The selection of the field enhancement was analyzed based on the theory introduced by Martin. A low-voltage trigatron model was constructed and tested on the tens-of-microseconds charging time platform. The results show that the relative operating range was improved while the triggering characteristics still held at a high level. This slightly-uneven-electric-field trigatron is expected to be employed in the Tesla transformer - pulse forming line system.

  7. The construction of next-generation matrices for compartmental epidemic models.

    PubMed

    Diekmann, O; Heesterbeek, J A P; Roberts, M G

    2010-06-06

    The basic reproduction number ℛ0 is arguably the most important quantity in infectious disease epidemiology. The next-generation matrix (NGM) is the natural basis for the definition and calculation of ℛ0 where finitely many different categories of individuals are recognized. We clear up confusion that has been around in the literature concerning the construction of this matrix, specifically for the most frequently used so-called compartmental models. We present a detailed easy recipe for the construction of the NGM from basic ingredients derived directly from the specifications of the model. We show that two related matrices exist which we define to be the NGM with large domain and the NGM with small domain. The three matrices together reflect the range of possibilities encountered in the literature for the characterization of ℛ0. We show how they are connected and how their construction follows from the basic model ingredients, and establish that they have the same non-zero eigenvalues, the largest of which is the basic reproduction number ℛ0. Although we present formal recipes based on linear algebra, we encourage the construction of the NGM by way of direct epidemiological reasoning, using the clear interpretation of the elements of the NGM and of the model ingredients. We present a selection of examples as a practical guide to our methods. In the appendix we present an elementary but complete proof that ℛ0 defined as the dominant eigenvalue of the NGM for compartmental systems and the Malthusian parameter r, the real-time exponential growth rate in the early phase of an outbreak, are connected by the properties that ℛ0 > 1 if and only if r > 0, and ℛ0 = 1 if and only if r = 0.

  8. The construction of next-generation matrices for compartmental epidemic models

    PubMed Central

    Diekmann, O.; Heesterbeek, J. A. P.; Roberts, M. G.

    2010-01-01

    The basic reproduction number ℛ0 is arguably the most important quantity in infectious disease epidemiology. The next-generation matrix (NGM) is the natural basis for the definition and calculation of ℛ0 where finitely many different categories of individuals are recognized. We clear up confusion that has been around in the literature concerning the construction of this matrix, specifically for the most frequently used so-called compartmental models. We present a detailed easy recipe for the construction of the NGM from basic ingredients derived directly from the specifications of the model. We show that two related matrices exist which we define to be the NGM with large domain and the NGM with small domain. The three matrices together reflect the range of possibilities encountered in the literature for the characterization of ℛ0. We show how they are connected and how their construction follows from the basic model ingredients, and establish that they have the same non-zero eigenvalues, the largest of which is the basic reproduction number ℛ0. Although we present formal recipes based on linear algebra, we encourage the construction of the NGM by way of direct epidemiological reasoning, using the clear interpretation of the elements of the NGM and of the model ingredients. We present a selection of examples as a practical guide to our methods. In the appendix we present an elementary but complete proof that ℛ0 defined as the dominant eigenvalue of the NGM for compartmental systems and the Malthusian parameter r, the real-time exponential growth rate in the early phase of an outbreak, are connected by the properties that ℛ0 > 1 if and only if r > 0, and ℛ0 = 1 if and only if r = 0. PMID:19892718
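The recipe can be illustrated on a standard SEIR-type infected subsystem (a textbook example, not necessarily one of the paper's worked cases): split the linearization into a transmission matrix F and a transition matrix V, form the NGM with large domain F V⁻¹, and read ℛ0 off as its dominant eigenvalue. The threshold property from the appendix (ℛ0 > 1 iff r > 0) can then be checked directly. Parameter values are illustrative:

```python
import numpy as np

# Infected subsystem of an SEIR model at the disease-free equilibrium
# (S0 = 1): x = (E, I). New infections: beta * I flowing into E.
# Transitions: E -> I at rate sigma, I -> removed at rate gamma.
beta, sigma, gamma = 0.6, 0.25, 0.2

F = np.array([[0.0, beta],     # transmissions (new infections)
              [0.0, 0.0]])
V = np.array([[sigma, 0.0],    # transitions between infected states
              [-sigma, gamma]])

# NGM with large domain: K_L = F V^{-1}; R0 is its spectral radius.
K = F @ np.linalg.inv(V)
R0 = max(abs(np.linalg.eigvals(K)))

# Malthusian parameter r: dominant real part of the eigenvalues of the
# linearized infected subsystem F - V.
r = max(np.linalg.eigvals(F - V).real)
```

For this model ℛ0 reduces analytically to beta/gamma = 3, and since ℛ0 > 1 the early growth rate r is positive, as the appendix's threshold result requires.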

  9. PSO-MISMO modeling strategy for multistep-ahead time series prediction.

    PubMed

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi

    2014-05-01

    Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and remains under active research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages over the two currently dominating strategies, the iterated and the direct strategies. Built on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode, with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons as in the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate corresponding sub-models, providing considerable flexibility in model construction. The strategy has been validated with simulated and real datasets.
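The PSO component can be sketched in its generic form. This is plain particle swarm optimization on a toy objective, not the paper's self-adaptive PSO-MISMO encoding; the swarm size, inertia, and acceleration constants are conventional defaults:

```python
import numpy as np

rng = np.random.default_rng(4)

def sphere(x):
    return float(np.sum(x ** 2))

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimise f over [-5, 5]^dim with a basic global-best PSO."""
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()                           # personal bests
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()     # global best
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # inertia + cognitive pull + social pull
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        val = np.array([f(p) for p in pos])
        improved = val < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = val[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(f(gbest))

best, best_val = pso(sphere, dim=3)
```

In the paper's setting the particle position would instead encode the partition of the prediction horizon into sub-models, with forecasting error as the objective.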

  10. Scrub Typhus Incidence Modeling with Meteorological Factors in South Korea.

    PubMed

    Kwak, Jaewon; Kim, Soojun; Kim, Gilho; Singh, Vijay P; Hong, Seungjin; Kim, Hung Soo

    2015-06-29

    Since its recurrence in 1986, scrub typhus has occurred annually and is considered one of the most prevalent diseases in Korea. Scrub typhus is a 3rd-grade nationally notifiable disease whose incidence has greatly increased in Korea since 2000. The objective of this study is to construct a disease incidence model for prediction and quantification of the incidence of scrub typhus. Using data from 2001 to 2010, an incidence Artificial Neural Network (ANN) model, which considers the time-lag between scrub typhus and minimum temperature, precipitation, and average wind speed based on Granger causality and spectral analysis, is constructed and tested on data from 2011 to 2012. Results show reliable simulation of scrub typhus incidence with the selected predictors, and indicate that the seasonality in meteorological data should be considered.

  11. Scrub Typhus Incidence Modeling with Meteorological Factors in South Korea

    PubMed Central

    Kwak, Jaewon; Kim, Soojun; Kim, Gilho; Singh, Vijay P.; Hong, Seungjin; Kim, Hung Soo

    2015-01-01

    Since its recurrence in 1986, scrub typhus has occurred annually and is considered one of the most prevalent diseases in Korea. Scrub typhus is a 3rd-grade nationally notifiable disease whose incidence has greatly increased in Korea since 2000. The objective of this study is to construct a disease incidence model for prediction and quantification of the incidence of scrub typhus. Using data from 2001 to 2010, an incidence Artificial Neural Network (ANN) model, which considers the time-lag between scrub typhus and minimum temperature, precipitation, and average wind speed based on Granger causality and spectral analysis, is constructed and tested on data from 2011 to 2012. Results show reliable simulation of scrub typhus incidence with the selected predictors, and indicate that the seasonality in meteorological data should be considered. PMID:26132479

  12. Hybridizing Gravitational Waveforms of Inspiralling Binary Neutron Star Systems

    NASA Astrophysics Data System (ADS)

    Cullen, Torrey; LIGO Collaboration

    2016-03-01

    Gravitational waves, ripples in space and time predicted by Albert Einstein, are produced by astrophysical systems such as binary neutron stars. These are key targets for the Laser Interferometer Gravitational-Wave Observatory (LIGO), which uses template waveforms to find weak signals. The simplified template models are known to break down at high frequency, so I wrote code that constructs hybrid waveforms from numerical simulations to accurately cover a large range of frequencies. These hybrid waveforms use post-Newtonian template models at low frequencies and numerical data from simulations at high frequencies. They are constructed by reading in existing post-Newtonian models with the same masses as the simulated stars, reading in the numerical data from simulations, and finding the ideal frequency and alignment to ``stitch'' these waveforms together.
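The stitching step can be sketched with toy chirps standing in for the post-Newtonian and numerical waveforms: recover the unknown time offset of the "numerical" segment by cross-correlation over an overlap window, then blend across it. Sampling rate and chirp parameters are arbitrary placeholders:

```python
import numpy as np

# Toy stand-in: a low-frequency "analytic" chirp and a late-time
# "numerical" segment cut from the same underlying signal with an
# unknown offset that the hybridisation must recover.
fs = 4096.0
t = np.arange(0.0, 4.0, 1.0 / fs)
phase = 2 * np.pi * (30.0 * t + 10.0 * t ** 2)   # frequency sweep
full = np.sin(phase)

analytic = full[t < 2.5]                 # low-frequency model
true_offset = 8192                       # = 2.0 s into the signal
numerical = full[true_offset:]           # late-time "simulation"

# Recover the offset by maximising cross-correlation of the
# overlapping stretch against the analytic waveform.
overlap_len = 2048
seg = numerical[:overlap_len]
scores = [np.dot(analytic[s:s + overlap_len], seg)
          for s in range(len(analytic) - overlap_len + 1)]
best = int(np.argmax(scores))

# Stitch: analytic before the overlap, linear blend across it,
# numerical after.
w = np.linspace(0.0, 1.0, overlap_len)
blend = (1 - w) * analytic[best:best + overlap_len] + w * seg
hybrid = np.concatenate([analytic[:best], blend, numerical[overlap_len:]])
```

Because a chirp decorrelates quickly away from exact alignment, the correlation peak picks out the true offset, and the blended hybrid reproduces the full signal.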

  13. Diurnal Transcriptome and Gene Network Represented through Sparse Modeling in Brachypodium distachyon.

    PubMed

    Koda, Satoru; Onda, Yoshihiko; Matsui, Hidetoshi; Takahagi, Kotaro; Yamaguchi-Uehara, Yukiko; Shimizu, Minami; Inoue, Komaki; Yoshida, Takuhiro; Sakurai, Tetsuya; Honda, Hiroshi; Eguchi, Shinto; Nishii, Ryuei; Mochida, Keiichi

    2017-01-01

    We report the comprehensive identification of periodic genes and their network inference, based on a gene co-expression analysis and an Auto-Regressive eXogenous (ARX) model with a group smoothly clipped absolute deviation (SCAD) method using a time-series transcriptome dataset in a model grass, Brachypodium distachyon. To reveal the diurnal changes in the transcriptome in B. distachyon, we performed RNA-seq analysis of its leaves sampled through a diurnal cycle of over 48 h at 4 h intervals using three biological replications, and identified 3,621 periodic genes through our wavelet analysis. The expression data are feasible for inferring network sparsity based on ARX models. We found that genes involved in biological processes such as transcriptional regulation, protein degradation, post-transcriptional modification, and photosynthesis are significantly enriched in the periodic genes, suggesting that these processes might be regulated by circadian rhythm in B. distachyon. On the basis of the time-series expression patterns of the periodic genes, we constructed a chronological gene co-expression network and identified putative transcription-factor-encoding genes that might be involved in the time-specific regulatory transcriptional network. Moreover, we inferred a transcriptional network composed of the periodic genes in B. distachyon, aiming to identify genes associated with other genes through variable selection by grouping time points for each gene. Based on the ARX model with the group SCAD regularization using our time-series expression datasets of the periodic genes, we constructed gene networks and found that the networks represent a typical scale-free structure. Our findings demonstrate that the diurnal changes in the transcriptome in B. distachyon leaves have a sparse network structure, demonstrating the spatiotemporal gene regulatory network over the cyclic phase transitions in B. distachyon diurnal growth.

  14. Ocean Models and Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Salas-de-Leon, D. A.

    2007-05-01

    The increase in computational power and the better understanding of mathematical and physical systems have resulted in a growing number of ocean models. Long ago, modelers were like a secret society, recognizing each other by codes and languages that only a select group of people could understand. Access to computational systems was limited: on one hand, equipment and computing time were expensive and restricted; on the other, the advanced programming languages required were ones not everybody wanted to learn. Nowadays most college freshmen own a personal computer (PC or laptop) and/or have access to more sophisticated computational systems than those available for research in the early 80's. This availability of resources has resulted in major access to all kinds of models. Today computer speed, time, and algorithms do not seem to be a problem, even though some models take days to run on small computational systems. Almost every oceanographic institution has its own model; what is more, within the same institution different offices may use different models for the same phenomena, developed by different researchers, yet the results do not differ substantially, since the equations are the same and the solution algorithms are similar. The algorithms, and the grids constructed with them, can be found in textbooks and/or on the Internet. Every year more sophisticated models are constructed. Proper Orthogonal Decomposition is a technique that reduces the number of variables to solve while keeping the model properties, which makes it a very useful tool for diminishing the processes that have to be solved using "small" computational systems, making sophisticated models available to a greater community.
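POD itself is a short computation: take the SVD of a snapshot matrix (each column one state of the field in time) and keep the few modes that capture most of the energy. A self-contained sketch with a synthetic low-dimensional field; grid sizes and noise level are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

# Snapshot matrix: the field is built from 3 spatial modes with random
# time amplitudes plus small noise, so its effective dimension is tiny.
nx, nt = 200, 80
xgrid = np.linspace(0, 1, nx)
modes = np.column_stack([np.sin((k + 1) * np.pi * xgrid) for k in range(3)])
amps = rng.normal(size=(3, nt))
snapshots = modes @ amps + 1e-3 * rng.normal(size=(nx, nt))

# POD = SVD of the snapshot matrix; singular values rank the energy
# captured by each orthogonal spatial mode.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)

r = int(np.searchsorted(energy, 0.99) + 1)     # modes for 99% energy
reduced = U[:, :r] @ np.diag(s[:r]) @ Vt[:r]   # rank-r reconstruction
err = np.linalg.norm(snapshots - reduced) / np.linalg.norm(snapshots)
```

The reduced basis `U[:, :r]` is what a Galerkin-projected model would evolve in place of the full grid, which is where the computational savings come from.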

  15. Probabilistic arithmetic automata and their applications.

    PubMed

    Marschall, Tobias; Herms, Inke; Kaltenbach, Hans-Michael; Rahmann, Sven

    2012-01-01

    We present a comprehensive review on probabilistic arithmetic automata (PAAs), a general model to describe chains of operations whose operands depend on chance, along with two algorithms to numerically compute the distribution of the results of such probabilistic calculations. PAAs provide a unifying framework to approach many problems arising in computational biology and elsewhere. We present five different applications, namely 1) pattern matching statistics on random texts, including the computation of the distribution of occurrence counts, waiting times, and clump sizes under hidden Markov background models; 2) exact analysis of window-based pattern matching algorithms; 3) sensitivity of filtration seeds used to detect candidate sequence alignments; 4) length and mass statistics of peptide fragments resulting from enzymatic cleavage reactions; and 5) read length statistics of 454 and IonTorrent sequencing reads. The diversity of these applications indicates the flexibility and unifying character of the presented framework. While the construction of a PAA depends on the particular application, we single out a frequently applicable construction method: We introduce deterministic arithmetic automata (DAAs) to model deterministic calculations on sequences, and demonstrate how to construct a PAA from a given DAA and a finite-memory random text model. This procedure is used for all five discussed applications and greatly simplifies the construction of PAAs. Implementations are available as part of the MoSDi package. Its application programming interface facilitates the rapid development of new applications based on the PAA framework.
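The DAA-to-PAA construction can be miniaturized: an automaton state tracks how much of a pattern the text suffix matches, a count is accumulated arithmetically, and an i.i.d. random text model drives the transitions, yielding the exact distribution of occurrence counts rather than just an expectation. The pattern, alphabet, and text length below are toy choices:

```python
from collections import defaultdict

pattern = "ab"
alphabet = {"a": 0.5, "b": 0.5}   # i.i.d. uniform random text model
n = 10                            # text length

# state = length of the longest pattern prefix matching the text suffix
def next_state(state, ch):
    s = pattern[:state] + ch
    while not pattern.startswith(s):
        s = s[1:]                 # KMP-style fallback on mismatch
    return len(s)

# DP over (automaton state, accumulated count) -> probability
dist = defaultdict(float)
dist[(0, 0)] = 1.0
for _ in range(n):
    new = defaultdict(float)
    for (state, count), p in dist.items():
        for ch, pc in alphabet.items():
            ns = next_state(state, ch)
            hit = 1 if ns == len(pattern) else 0
            new[(ns, count + hit)] += p * pc
    dist = new

# Marginalise out the automaton state to get the count distribution.
count_dist = defaultdict(float)
for (state, count), p in dist.items():
    count_dist[count] += p

mean = sum(c * p for c, p in count_dist.items())
```

By linearity, the expected number of "ab" occurrences in a length-10 uniform text is 9 x 0.25 = 2.25, which the exact distribution reproduces.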

  16. Predicting sugar-sweetened behaviours with theory of planned behaviour constructs: Outcome and process results from the SIPsmartER behavioural intervention

    PubMed Central

    Zoellner, Jamie M.; Porter, Kathleen J.; Chen, Yvonnes; Hedrick, Valisa E.; You, Wen; Hickman, Maja; Estabrooks, Paul A.

    2017-01-01

    Objective Guided by the theory of planned behaviour (TPB) and health literacy concepts, SIPsmartER is a six-month multicomponent intervention effective at improving SSB behaviours. Using SIPsmartER data, this study explores prediction of SSB behavioural intention (BI) and behaviour from TPB constructs using: (1) cross-sectional and prospective models and (2) 11 single-item assessments from interactive voice response (IVR) technology. Design Quasi-experimental design, including pre- and post-outcome data and repeated-measures process data of 155 intervention participants. Main Outcome Measures Validated multi-item TPB measures, single-item TPB measures, and self-reported SSB behaviours. Hypothesised relationships were investigated using correlation and multiple regression models. Results TPB constructs explained 32% of the variance in BI cross-sectionally and 20% prospectively, and explained 13–20% of the variance in behaviour cross-sectionally and 6% prospectively. Single-item scale models were significant, yet explained less variance. All IVR models predicting BI (average 21%, range 6–38%) and behaviour (average 30%, range 6–55%) were significant. Conclusion Findings are interpreted in the context of other cross-sectional, prospective and experimental TPB health and dietary studies. Findings advance experimental application of the TPB, including understanding constructs at outcome and process time points and applying theory in all intervention development, implementation and evaluation phases. PMID:28165771

  17. A longitudinal study of socioeconomic status, family processes, and child adjustment from preschool until early elementary school: the role of social competence.

    PubMed

    Hosokawa, Rikuya; Katsura, Toshiki

    2017-01-01

    Using a short-term longitudinal design, this study examined the concurrent and longitudinal relationships among familial socioeconomic status (SES; i.e., family income and maternal and paternal education levels), marital conflict (i.e., constructive and destructive marital conflict), parenting practices (i.e., positive and negative parenting practices), child social competence (i.e., social skills), and child behavioral adjustment (i.e., internalizing and externalizing problems) in a comprehensive model. The sample included a total of 1604 preschoolers aged 5 years at Time 1 and first graders aged 6 years at Time 2 (51.5% male). Parents completed a self-reported questionnaire regarding their SES, marital conflict, parenting practices, and their children's behavioral adjustment. Teachers also evaluated the children's social competence. The path analysis results revealed that Time 1 family income and maternal and paternal education levels were respectively related to Time 1 social skills and Time 2 internalizing and externalizing problems, both directly and indirectly, through their influence on destructive and constructive marital conflict, as well as negative and positive parenting practices. Notably, after controlling for Time 1 behavioral problems as mediating mechanisms in the link between family factors (i.e., SES, marital conflict, and parenting practices) and behavioral adjustment, Time 1 social skills significantly and inversely influenced both the internalization and externalization of problems at Time 2. The merit of examining SES, marital conflict, and parenting practices as multidimensional constructs is discussed in relation to an understanding of processes and pathways within families that affect child mental health functioning. 
The results suggest social competence, which is influenced by the multidimensional constructs of family factors, may prove protective in reducing the risk of child maladjustment, especially for children who are socioeconomically disadvantaged.

  18. Analysis of precision and accuracy in a simple model of machine learning

    NASA Astrophysics Data System (ADS)

    Lee, Julian

    2017-12-01

    Machine learning is a procedure whereby a model of the world is constructed from a training set of examples. It is important that the model capture relevant features of the training set and, at the same time, make correct predictions for examples not included in the training set. I consider polynomial regression, the simplest method of learning, and analyze the accuracy and precision for different levels of model complexity.
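The analysis can be reproduced in miniature: repeated fits of polynomials of different degrees to noisy samples separate accuracy (squared bias of the mean prediction) from precision (variance across training sets). The target function, noise level, and degrees below are illustrative, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(6)

def experiment(degree, n_train=20, n_trials=200):
    """Bias^2 and variance of degree-d polynomial fits on held-out points."""
    xs_test = np.linspace(-1, 1, 50)
    preds = []
    for _ in range(n_trials):
        x = rng.uniform(-1, 1, n_train)
        y = np.sin(np.pi * x) + rng.normal(scale=0.2, size=n_train)
        coef = np.polyfit(x, y, degree)
        preds.append(np.polyval(coef, xs_test))
    preds = np.array(preds)
    truth = np.sin(np.pi * xs_test)
    bias2 = np.mean((preds.mean(axis=0) - truth) ** 2)   # accuracy
    variance = np.mean(preds.var(axis=0))                # precision
    return bias2, variance

b1, v1 = experiment(degree=1)      # underfit: high bias
b5, v5 = experiment(degree=5)      # flexible enough: low bias
b15, v15 = experiment(degree=15)   # overfit: high variance
```

The low-degree fit is precise but inaccurate, the high-degree fit is accurate on average but wildly imprecise, which is the trade-off the abstract describes.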

  19. A new biochromatography model based on DNA origami assembled PPARγ: construction and evaluation.

    PubMed

    Zhou, Jie; Meng, Lingchang; Sun, Chong; Chen, Shanshan; Sun, Fang; Luo, Pei; Zhao, Yongxing

    2017-05-01

    As drug targets, receptors can be used to screen drugs. Silica is an attractive support for immobilizing receptors; however, its lack of biocompatibility makes it easy for receptors to lose bioactivity, which remains an obstacle to its widespread use. With the advantage of biocompatibility, DNA origami can be used as a biological carrier to improve the biocompatibility of silica and to assemble receptors. In this study, a new biochromatography model based on DNA origami was constructed. Because a large quantity of M13ssDNA is needed as a scaffold, leading to significant costs, M13ssDNA was self-produced from bacteriophage particles. The approach is demonstrated using the ligand binding domain of the gamma isoform of the peroxisome proliferator-activated receptor (PPARγ-LBD) as a research object. PPARγ-LBD was assembled on the DNA origami carrier and then coupled to the surface of silica. The products were packed into a column as the stationary phase to construct a biochromatography system able to recognize drugs. The affinity and specificity of the biochromatography model were evaluated by HPLC. The final results showed that the biochromatography could specifically recognize rosiglitazone, which further proved that the model could screen chemical compounds that interact with PPARγ. This is the first time DNA origami has been used to assemble PPARγ for biochromatography. The new biochromatography model has the advantages of being efficient, convenient, and high-throughput. This method affords a new way to rapidly and conveniently screen active ingredients from complex plant extract samples and natural product-like libraries.

  20. Identifying Potential Norovirus Epidemics in China via Internet Surveillance

    PubMed Central

    Chen, Bin; Jiang, Tao; Cai, Gaofeng; Jiang, Zhenggang; Chen, Yongdi; Wang, Zhengting; Gu, Hua; Chai, Chengliang

    2017-01-01

    Background: Norovirus is a common virus that causes acute gastroenteritis worldwide, but a monitoring system for norovirus is unavailable in China. Objective: We aimed to identify norovirus epidemics through Internet surveillance and construct an appropriate model to predict potential norovirus infections. Methods: The norovirus-related data of a selected outbreak in Jiaxing Municipality, Zhejiang Province of China, in 2014 were collected from an immediate epidemiological investigation, and the Internet search volume, as indicated by the Baidu Index, was acquired from the Baidu search engine. All correlated search keywords in relation to norovirus were captured, screened, and composited to establish the composite Baidu Index at different time lags by Spearman rank correlation. The optimal model was chosen, and maps of possible epidemics in Zhejiang Province were presented using ArcGIS software. Results: The combination of two vital keywords at a time lag of 1 day was ultimately identified as optimal (ρ=.924, P<.001). An exponential curve model was constructed to fit the trend of this epidemic, suggesting that a one-unit increase in the mean composite Baidu Index contributed to an increase of norovirus infections by 2.15 times during the outbreak. In addition to Jiaxing Municipality, Hangzhou Municipality might have had some potential epidemics during the study period according to the predicted model. Conclusions: Although there are limitations with early warning and unavoidable biases, Internet surveillance may still be useful for the monitoring of norovirus epidemics when a monitoring system is unavailable. PMID:28790023
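The lag-screening step can be sketched in pure Python: compute the Spearman rank correlation between the search series and the case series at several candidate lags and keep the best. The search/case numbers below are made up for illustration; the study uses the composite Baidu Index and investigated case counts.

```python
def ranks(v):
    # Rank positions 1..n (ties ignored for brevity; real data needs tie handling).
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = rank + 1.0
    return r

def spearman(x, y):
    # Spearman rho = Pearson correlation of the rank vectors.
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def lagged_correlation(search_index, cases, lag):
    # Correlate today's search volume with case counts `lag` days later.
    return spearman(search_index[:len(search_index) - lag], cases[lag:])

# Synthetic example: cases trail the search signal by one day.
search = [3, 8, 15, 40, 70, 55, 30, 12, 6, 4]
cases  = [1, 2,  7, 14, 38, 69, 52, 28, 11, 5]
best_lag = max(range(4), key=lambda L: lagged_correlation(search, cases, L))
```

With this synthetic series the lag-1 correlation is perfect, so `best_lag` is 1, mirroring the study's finding that a 1-day lag was optimal.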

  1. Wavelets and spacetime squeeze

    NASA Technical Reports Server (NTRS)

    Han, D.; Kim, Y. S.; Noz, Marilyn E.

    1993-01-01

    It is shown that the wavelet is the natural language for the Lorentz-covariant description of localized light waves. A model for covariant superposition is constructed for light waves with different frequencies. It is therefore possible to construct a wave function for light waves carrying a covariant probability interpretation. It is shown that the time-energy uncertainty relation Δt·Δω ≈ 1 for light waves is a Lorentz-invariant relation. The connection between photons and localized light waves is examined critically.

  2. Fast mix table construction for material discretization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, S. R.

    2013-07-01

    An effective hybrid Monte Carlo-deterministic implementation typically requires the approximation of a continuous geometry description with a discretized piecewise-constant material field. The inherent geometry discretization error can be reduced somewhat by using material mixing, where multiple materials inside a discrete mesh voxel are homogenized. Material mixing requires the construction of a 'mix table,' which stores the volume fractions in every mixture so that multiple voxels with similar compositions can reference the same mixture. Mix table construction is a potentially expensive serial operation for large problems with many materials and voxels. We formulate an efficient algorithm to construct a sparse mix table in O(number of voxels × log number of mixtures) time. The new algorithm is implemented in ADVANTG and used to discretize continuous geometries onto a structured Cartesian grid. When applied to an end-of-life MCNP model of the High Flux Isotope Reactor with 270 distinct materials, the new method improves the material mixing time by a factor of 100 compared to a naive mix table implementation. (authors)
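One way to obtain the deduplication behavior the abstract describes is to hash a quantized composition key, so that voxels with (near-)identical volume fractions share one mixture ID. This is an illustrative sketch under that assumption; ADVANTG's actual algorithm is not detailed in the abstract and may differ.

```python
def build_mix_table(voxel_fractions, tol=1e-4):
    """Map each voxel to a mixture ID, sharing IDs between voxels whose
    material volume fractions agree within `tol`."""
    mix_ids = {}       # canonical composition key -> mixture ID
    mixtures = []      # mixture ID -> composition dict
    voxel_to_mix = []
    for fracs in voxel_fractions:
        # Canonical key: sorted (material, fraction) pairs, with fractions
        # quantized so near-identical compositions hash identically.
        key = tuple(sorted((m, round(f / tol)) for m, f in fracs.items()))
        if key not in mix_ids:           # O(1) expected hash lookup
            mix_ids[key] = len(mixtures)
            mixtures.append(dict(fracs))
        voxel_to_mix.append(mix_ids[key])
    return voxel_to_mix, mixtures

# Four voxels, two distinct mixtures (third differs only below tolerance).
voxels = [
    {"steel": 1.0},
    {"steel": 0.30, "water": 0.70},
    {"steel": 0.30001, "water": 0.69999},  # same mixture within tol
    {"steel": 1.0},
]
voxel_to_mix, mixtures = build_mix_table(voxels)
```

The hash map makes the pass roughly linear in the number of voxels; a sorted-structure variant would give the O(voxels × log mixtures) bound stated in the abstract.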

  3. A three-component model of future time perspective across adulthood.

    PubMed

    Rohr, Margund K; John, Dennis T; Fung, Helene H; Lang, Frieder R

    2017-11-01

    Although extensive findings underscore the relevance of future time perspective (FTP) in the process of aging, the assumption of FTP as a unifactorial construct has been challenged. The present study explores the factorial structure of the FTP scale (Carstensen & Lang, 1996) as one of the most widely used measures (Ntotal = 2,170). Results support that FTP reflects a higher-order construct that consists of 3 interrelated components: Opportunity, Extension, and Constraint. It is suggested that the flexible usage of the FTP scale as an all-encompassing 10-item measure or with a focus on specific components depends on the concrete research question. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. BioNSi: A Discrete Biological Network Simulator Tool.

    PubMed

    Rubinstein, Amir; Bracha, Noga; Rudner, Liat; Zucker, Noga; Sloin, Hadas E; Chor, Benny

    2016-08-05

    Modeling and simulation of biological networks is an effective and widely used research methodology. The Biological Network Simulator (BioNSi) is a tool for modeling biological networks and simulating their discrete-time dynamics, implemented as a Cytoscape App. BioNSi includes a visual representation of the network that enables researchers to construct, set the parameters, and observe network behavior under various conditions. To construct a network instance in BioNSi, only partial, qualitative biological data suffices. The tool is aimed for use by experimental biologists and requires no prior computational or mathematical expertise. BioNSi is freely available at http://bionsi.wix.com/bionsi , where a complete user guide and a step-by-step manual can also be found.

  5. Real-time characterization of partially observed epidemics using surrogate models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safta, Cosmin; Ray, Jaideep; Lefantzi, Sophia

    We present a statistical method, predicated on the use of surrogate models, for the 'real-time' characterization of partially observed epidemics. Observations consist of counts of symptomatic patients, diagnosed with the disease, that may be available in the early epoch of an ongoing outbreak. Characterization, in this context, refers to estimation of epidemiological parameters that can be used to provide short-term forecasts of the ongoing epidemic, as well as to provide gross information on the dynamics of the etiologic agent in the affected population, e.g., the time-dependent infection rate. The characterization problem is formulated as a Bayesian inverse problem, and epidemiological parameters are estimated as distributions using a Markov chain Monte Carlo (MCMC) method, thus quantifying the uncertainty in the estimates. In some cases, the inverse problem can be computationally expensive, primarily due to the epidemic simulator used inside the inversion algorithm. We present a method, based on replacing the epidemiological model with computationally inexpensive surrogates, that can reduce the computational time to minutes, without a significant loss of accuracy. The surrogates are created by projecting the output of an epidemiological model on a set of polynomial chaos bases; thereafter, computations involving the surrogate model reduce to evaluations of a polynomial. We find that the epidemic characterizations obtained with the surrogate models are very close to those obtained with the original model. We also find that the number of projections required to construct a surrogate model is O(10)–O(10²) less than the number of samples required by the MCMC to construct a stationary posterior distribution; thus, depending upon the epidemiological models in question, it may be possible to omit the offline creation and caching of surrogate models prior to their use in an inverse problem. The technique is demonstrated on synthetic data as well as observations from the 1918 influenza pandemic collected at Camp Custer, Michigan.
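The surrogate idea can be sketched in one dimension: replace an expensive forward model with a least-squares polynomial fit over a few training evaluations, then evaluate only the polynomial inside the sampler. This toy uses plain monomials where a real polynomial chaos expansion would use orthogonal polynomials, and `expensive_model` is an invented stand-in, not the paper's epidemic simulator.

```python
import math

def expensive_model(theta):
    # Stand-in for a costly simulator: one scalar parameter in, one observable out.
    return math.exp(0.8 * theta) * math.sin(theta)

def solve(A, b):
    # Gaussian elimination with partial pivoting for the normal equations.
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_surrogate(f, nodes, degree):
    # Least-squares projection of f onto a polynomial basis -- the same idea
    # as a polynomial chaos expansion, with monomials for brevity.
    basis = lambda t: [t ** k for k in range(degree + 1)]
    X = [basis(t) for t in nodes]
    A = [[sum(X[i][r] * X[i][c] for i in range(len(nodes)))
          for c in range(degree + 1)] for r in range(degree + 1)]
    b = [sum(X[i][r] * f(nodes[i]) for i in range(len(nodes)))
         for r in range(degree + 1)]
    coeffs = solve(A, b)
    return lambda t: sum(c * t ** k for k, c in enumerate(coeffs))

nodes = [i / 10 for i in range(11)]           # training samples in [0, 1]
surrogate = fit_surrogate(expensive_model, nodes, degree=4)
max_err = max(abs(surrogate(t) - expensive_model(t)) for t in nodes)
```

Once fitted from O(10) model runs, every subsequent MCMC evaluation costs only a polynomial evaluation, which is the source of the speedup described in the abstract.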

  6. High‐resolution trench photomosaics from image‐based modeling: Workflow and error analysis

    USGS Publications Warehouse

    Reitman, Nadine G.; Bennett, Scott E. K.; Gold, Ryan D.; Briggs, Richard; Duross, Christopher

    2015-01-01

    Photomosaics are commonly used to construct maps of paleoseismic trench exposures, but the conventional process of manually using image‐editing software is time consuming and produces undesirable artifacts and distortions. Herein, we document and evaluate the application of image‐based modeling (IBM) for creating photomosaics and 3D models of paleoseismic trench exposures, illustrated with a case‐study trench across the Wasatch fault in Alpine, Utah. Our results include a structure‐from‐motion workflow for the semiautomated creation of seamless, high‐resolution photomosaics designed for rapid implementation in a field setting. Compared with conventional manual methods, the IBM photomosaic method provides a more accurate, continuous, and detailed record of paleoseismic trench exposures in approximately half the processing time and 15%–20% of the user input time. Our error analysis quantifies the effect of the number and spatial distribution of control points on model accuracy. For this case study, an ~87 m² exposure of a benched trench photographed at viewing distances of 1.5–7 m yields a model with <2 cm root mean square error (RMSE) with as few as six control points. RMSE decreases as more control points are implemented, but the gains in accuracy are minimal beyond 12 control points. Spreading control points throughout the target area helps to minimize error. We propose that 3D digital models and corresponding photomosaics should be standard practice in paleoseismic exposure archiving. The error analysis serves as a guide for future investigations that seek balance between speed and accuracy during photomosaic and 3D model construction.

  7. Highly efficient classification and identification of human pathogenic bacteria by MALDI-TOF MS.

    PubMed

    Hsieh, Sen-Yung; Tseng, Chiao-Li; Lee, Yun-Shien; Kuo, An-Jing; Sun, Chien-Feng; Lin, Yen-Hsiu; Chen, Jen-Kun

    2008-02-01

    Accurate and rapid identification of pathogenic microorganisms is of critical importance in disease treatment and public health. Conventional workflows are time-consuming, and procedures are multifaceted. MS can be an alternative but is limited by low efficiency for amino acid sequencing as well as low reproducibility for spectrum fingerprinting. We systematically analyzed the feasibility of applying MS for rapid and accurate bacterial identification. Directly applying bacterial colonies without further protein extraction to MALDI-TOF MS analysis revealed rich peak contents and high reproducibility. The MS spectra derived from 57 isolates comprising six human pathogenic bacterial species were analyzed using both unsupervised hierarchical clustering and supervised model construction via the Genetic Algorithm. Hierarchical clustering analysis categorized the spectra into six groups precisely corresponding to the six bacterial species. Precise classification was also maintained in an independently prepared set of bacteria even when the numbers of m/z values were reduced to six. In parallel, classification models were constructed via Genetic Algorithm analysis. A model containing 18 m/z values accurately classified independently prepared bacteria and identified those species originally not used for model construction. Moreover, bacteria at fewer than 10⁴ cells and different species in bacterial mixtures were identified using the classification model approach. In conclusion, the application of MALDI-TOF MS in combination with a suitable model construction provides a highly accurate method for bacterial classification and identification. The approach can identify bacteria with low abundance even in mixed flora, suggesting that rapid and accurate bacterial identification using MS techniques even before culture can be attained in the near future.
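A reduced-feature classification model of the kind described, operating on a handful of selected m/z intensities, can be sketched as a nearest-centroid classifier. The intensity values and the two species labels below are toy data; the paper selects its 18 m/z features with a genetic algorithm rather than by hand.

```python
def centroid(vectors):
    # Component-wise mean of the training spectra for one species.
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(len(vectors[0]))]

def distance(a, b):
    # Euclidean distance between two intensity vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def train(spectra_by_species):
    # One centroid per species over the selected m/z intensity features.
    return {sp: centroid(vs) for sp, vs in spectra_by_species.items()}

def classify(model, spectrum):
    # Assign the species whose centroid is nearest to the query spectrum.
    return min(model, key=lambda sp: distance(model[sp], spectrum))

# Toy intensities at three hypothetical selected m/z values.
training = {
    "E. coli":   [[9, 1, 4], [8, 2, 5], [10, 1, 3]],
    "S. aureus": [[2, 8, 1], [1, 9, 2], [2, 7, 1]],
}
model = train(training)
label = classify(model, [9, 2, 4])
```

The point of the sketch is that once discriminative peaks are chosen, even a very small feature vector separates the classes, matching the abstract's observation that six m/z values sufficed for precise classification.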

  8. Relationships between structure, process and outcome to assess quality of integrated chronic disease management in a rural South African setting: applying a structural equation model.

    PubMed

    Ameh, Soter; Gómez-Olivé, Francesc Xavier; Kahn, Kathleen; Tollman, Stephen M; Klipstein-Grobusch, Kerstin

    2017-03-23

    South Africa faces a complex dual burden of chronic communicable and non-communicable diseases (NCDs). In response, the Integrated Chronic Disease Management (ICDM) model was initiated in primary health care (PHC) facilities in 2011 to leverage the HIV/ART programme to scale-up services for NCDs, achieve optimal patient health outcomes and improve the quality of medical care. However, little is known about the quality of care in the ICDM model. The objectives of this study were to: i) assess patients' and operational managers' satisfaction with the dimensions of ICDM services; and ii) evaluate the quality of care in the ICDM model using Avedis Donabedian's theory of relationships between structure (resources), process (clinical activities) and outcome (desired result of healthcare) constructs as a measure of quality of care. A cross-sectional study was conducted in 2013 in seven PHC facilities in the Bushbuckridge municipality of Mpumalanga Province, north-east South Africa - an area underpinned by a robust Health and Demographic Surveillance System (HDSS). The patient satisfaction questionnaire (PSQ-18), with measures reflecting structure/process/outcome (SPO) constructs, was adapted and administered to 435 chronic disease patients and the operational managers of all seven PHC facilities. The adapted questionnaire contained 17 dimensions of care, including eight dimensions identified as priority areas in the ICDM model - critical drugs, equipment, referral, defaulter tracing, prepacking of medicines, clinic appointments, waiting time, and coherence. A structural equation model was fit to operationalise Donabedian's theory, using unidirectional, mediation, and reciprocal pathways. The mediation pathway showed that the relationships between structure, process and outcome represented quality systems in the ICDM model. Structure correlated with process (0.40) and outcome (0.75). Given structure, process correlated with outcome (0.88). 
Of the 17 dimensions of care in the ICDM model, three structure (equipment, critical drugs, accessibility), three process (professionalism, friendliness and attendance to patients) and three outcome (competence, confidence and coherence) dimensions reflected their intended constructs. Of the priority dimensions, referrals, defaulter tracing, prepacking of medicines, appointments, and patient waiting time did not reflect their intended constructs. Donabedian's theoretical framework can be used to provide evidence of quality systems in the ICDM model.

  9. 40 CFR 60.1860 - What reports must I submit and in what form?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Times for Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule... §§ 60.1870, 60.1880, and 60.1895. If the Administrator agrees, you may submit electronic reports. (c...

  10. 40 CFR 60.1860 - What reports must I submit and in what form?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Times for Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule... §§ 60.1870, 60.1880, and 60.1895. If the Administrator agrees, you may submit electronic reports. (c...

  11. 40 CFR 60.1860 - What reports must I submit and in what form?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Times for Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule... §§ 60.1870, 60.1880, and 60.1895. If the Administrator agrees, you may submit electronic reports. (c...

  12. 40 CFR 60.1860 - What reports must I submit and in what form?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Times for Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule... §§ 60.1870, 60.1880, and 60.1895. If the Administrator agrees, you may submit electronic reports. (c...

  13. 40 CFR 60.1860 - What reports must I submit and in what form?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Times for Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule... §§ 60.1870, 60.1880, and 60.1895. If the Administrator agrees, you may submit electronic reports. (c...

  14. Using constructive alignment theory to develop nursing skills curricula.

    PubMed

    Joseph, Sundari; Juwah, Charles

    2012-01-01

    Constructive alignment theory has been used to underpin the development of curricula in higher education for some time (Biggs and Tang, 2007); however, its use to inform and determine skills curricula in nursing is less well documented. This paper explores the use of constructive alignment theory within a study of undergraduate student nurses undertaking clinical skill acquisition in the final year of a BSc (Hons) Nursing course. Students were followed up as newly qualified nurses (NQNs) (n = 58) to ascertain the impact of skill acquisition in this way. Comparisons were made with newly qualified nurses who did not participate in a constructively aligned curriculum. This mixed-methods study reported skill identification within the immediate post-registration period and evaluated the constructively aligned curriculum as having positive benefits for NQNs in terms of confidence to practice. This was supported by preceptors' views. The study recommends two process models for nursing skills curriculum development and reports that constructive alignment is a useful theoretical framework for nurse educators. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Analysis of loss of time value during road maintenance project

    NASA Astrophysics Data System (ADS)

    Sudarsana, Dewa Ketut; Sanjaya, Putu Ari

    2017-06-01

    Lane closures are frequently required during the execution of road maintenance projects. They have a negative impact on road users, such as increased vehicle operating costs and loss of time value. Nevertheless, the loss of time value in Indonesia has not previously been analyzed. The time value for road users was estimated using the city/region minimum wage. Vehicle speeds before construction were obtained by observation, while speeds during the road maintenance project were predicted by multiplying the pre-construction speed by a speed adjustment factor. For national road maintenance projects on two-lane, two-way urban and interurban roads in Bali Province in fiscal year 2015, the loss of time value averaged IDR 12,789,000 per day per road link. The relationship between traffic volume and road users' loss of time value was described by a logarithmic model.
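The loss computation implied by the abstract, pricing the extra travel time caused by the reduced work-zone speed at the minimum wage, can be sketched as follows. The function, its parameters, and the numbers are illustrative assumptions, not the paper's actual formula or data.

```python
def time_value_loss(length_km, v_before_kmh, speed_factor, volume_vph,
                    hours, wage_per_hour, occupancy=1.0):
    """Daily loss of time value on one work-zone link (illustrative formula).
    Extra travel time per vehicle is priced at the hourly minimum wage."""
    v_during = v_before_kmh * speed_factor        # reduced speed in the work zone
    extra_hours_per_vehicle = length_km / v_during - length_km / v_before_kmh
    vehicles = volume_vph * hours                 # vehicles passing during work hours
    return vehicles * occupancy * extra_hours_per_vehicle * wage_per_hour

# Hypothetical link: 2 km zone, speed halved, 800 veh/h for 8 h, IDR 12,000/h wage.
loss = time_value_loss(length_km=2.0, v_before_kmh=40.0, speed_factor=0.5,
                       volume_vph=800, hours=8, wage_per_hour=12000.0)
```

With these assumed inputs the link loses 6,400 vehicle passages × 0.05 extra hours × IDR 12,000, i.e. IDR 3.84 million per day, the same order of magnitude as the averages reported in the study.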

  16. Predicting future forestland area: a comparison of econometric approaches.

    Treesearch

    SoEun Ahn; Andrew J. Plantinga; Ralph J. Alig

    2000-01-01

    Predictions of future forestland area are an important component of forest policy analyses. In this article, we test the ability of econometric land use models to accurately forecast forest area. We construct a panel data set for Alabama consisting of county and time-series observation for the period 1964 to 1992. We estimate models using restricted data sets-namely,...

  17. Planned Missing Designs to Optimize the Efficiency of Latent Growth Parameter Estimates

    ERIC Educational Resources Information Center

    Rhemtulla, Mijke; Jia, Fan; Wu, Wei; Little, Todd D.

    2014-01-01

    We examine the performance of planned missing (PM) designs for correlated latent growth curve models. Using simulated data from a model where latent growth curves are fitted to two constructs over five time points, we apply three kinds of planned missingness. The first is item-level planned missingness using a three-form design at each wave such…

  18. Longitudinal Examination of Procrastination and Anxiety, and Their Relation to Self-Efficacy for Self- Regulated Learning: Latent Growth Curve Modeling

    ERIC Educational Resources Information Center

    Yerdelen, Sündüs; McCaffrey, Adam; Klassen, Robert M.

    2016-01-01

    This study investigated the longitudinal association between students' anxiety and procrastination and the relation of self-efficacy for self-regulation to these constructs. Latent Growth Curve Modeling was used to analyze data gathered from 182 undergraduate students (134 female, 48 male) at 4 times during a semester. Our results showed that…

  19. Work Climate, Organizational Commitment, and Highway Safety in the Trucking Industry: Toward Causal Modeling of Large Truck Crashes

    ERIC Educational Resources Information Center

    Graham, Carroll M.; Scott, Aaron J.; Nafukho, Fredrick M.

    2008-01-01

    While theoretical models aimed at explaining or predicting employee turnover outcomes have been developed, minimal consideration has been given to the same task regarding safety, often measured as the probability of a crash in a given time frame. The present literature review identifies four constructs from turnover literature, which are believed…

  20. An Advanced N-body Model for Interacting Multiple Stellar Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brož, Miroslav

    We construct an advanced model for interacting multiple stellar systems in which we compute all trajectories with a numerical N-body integrator, namely the Bulirsch–Stoer integrator from the SWIFT package. We can then derive various observables: astrometric positions, radial velocities, minima timings (TTVs), eclipse durations, interferometric visibilities, closure phases, synthetic spectra, spectral energy distribution, and even complete light curves. We use a modified version of the Wilson–Devinney code for the latter, in which the instantaneous true phase and inclination of the eclipsing binary are governed by the N-body integration. If all of these types of observations are at one's disposal, a joint χ² metric and an optimization algorithm (a simplex or simulated annealing) allow one to search for a global minimum and construct very robust models of stellar systems. At the same time, our N-body model is free from artifacts that may arise if mutual gravitational interactions among all components are not self-consistently accounted for. Finally, we present a number of examples showing dynamical effects that can be studied with our code and we discuss how systematic errors may affect the results (and how to prevent this from happening).

  1. Efficient Quantum Pseudorandomness.

    PubMed

    Brandão, Fernando G S L; Harrow, Aram W; Horodecki, Michał

    2016-04-29

    Randomness is both a useful way to model natural systems and a useful tool for engineered systems, e.g., in computation, communication, and control. Fully random transformations require exponential time for either classical or quantum systems, but in many cases pseudorandom operations can emulate certain properties of truly random ones. Indeed, in the classical realm there is by now a well-developed theory regarding such pseudorandom operations. However, the construction of such objects turns out to be much harder in the quantum case. Here, we show that random quantum unitary time evolutions ("circuits") are a powerful source of quantum pseudorandomness. This gives for the first time a polynomial-time construction of quantum unitary designs, which can replace fully random operations in most applications, and shows that generic quantum dynamics cannot be distinguished from truly random processes. We discuss applications of our result to quantum information science, cryptography, and understanding the self-equilibration of closed quantum dynamics.

  2. Principles of Discrete Time Mechanics

    NASA Astrophysics Data System (ADS)

    Jaroszkiewicz, George

    2014-04-01

    1. Introduction; 2. The physics of discreteness; 3. The road to calculus; 4. Temporal discretization; 5. Discrete time dynamics architecture; 6. Some models; 7. Classical cellular automata; 8. The action sum; 9. Worked examples; 10. Lee's approach to discrete time mechanics; 11. Elliptic billiards; 12. The construction of system functions; 13. The classical discrete time oscillator; 14. Type 2 temporal discretization; 15. Intermission; 16. Discrete time quantum mechanics; 17. The quantized discrete time oscillator; 18. Path integrals; 19. Quantum encoding; 20. Discrete time classical field equations; 21. The discrete time Schrodinger equation; 22. The discrete time Klein-Gordon equation; 23. The discrete time Dirac equation; 24. Discrete time Maxwell's equations; 25. The discrete time Skyrme model; 26. Discrete time quantum field theory; 27. Interacting discrete time scalar fields; 28. Space, time and gravitation; 29. Causality and observation; 30. Concluding remarks; Appendix A. Coherent states; Appendix B. The time-dependent oscillator; Appendix C. Quaternions; Appendix D. Quantum registers; References; Index.

  3. A synthetic seismicity model for the Middle America Trench

    NASA Technical Reports Server (NTRS)

    Ward, Steven N.

    1991-01-01

    A novel iterative technique, based on the concept of fault segmentation and computed using 2D static dislocation theory, for building models of seismicity and fault interaction which are physically acceptable and geometrically and kinematically correct, is presented. The technique is applied in two steps to seismicity observed at the Middle America Trench. The first constructs generic models which randomly draw segment strengths and lengths from a 2D probability distribution. The second constructs predictive models in which segment lengths and strengths are adjusted to mimic the actual geography and timing of large historical earthquakes. Both types of models reproduce the statistics of seismicity over five units of magnitude and duplicate other aspects including foreshock and aftershock sequences, migration of foci, and the capacity to produce both characteristic and noncharacteristic earthquakes. Over a period of about 150 yr the complex interaction of fault segments and the nonlinear failure conditions conspire to transform an apparently deterministic model into a chaotic one.

  4. A New Minimum Trees-Based Approach for Shape Matching with Improved Time Computing: Application to Graphical Symbols Recognition

    NASA Astrophysics Data System (ADS)

    Franco, Patrick; Ogier, Jean-Marc; Loonis, Pierre; Mullot, Rémy

    We recently developed a model for shape description and matching. Based on minimum spanning tree construction and specific stages such as the mixture, it has many desirable properties. Recognition invariance under shift, rotation, and noise was verified through medium-scale tests on the GREC symbol reference database. Although extracting the topology of a shape by mapping the shortest paths connecting all its pixels is powerful, the graph construction incurs a high algorithmic cost. In this article we discuss ways to reduce the computation time. An alternative solution based on image compression concepts is provided and evaluated: the model no longer operates in image space but in a compact space, namely the discrete cosine space. The use of the block discrete cosine transform is discussed and justified. Experimental results on the GREC2003 database show that the proposed method has good discrimination power and real robustness to noise, with acceptable computation time.
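The topology-extraction step rests on a standard minimum spanning tree construction over foreground pixels, which can be sketched with Prim's algorithm (the paper's full pipeline, including the mixture stage and the DCT-space variant, is not reproduced here; the pixel set is a toy example).

```python
def minimum_spanning_tree(points):
    """Prim's algorithm over 2D points with squared Euclidean weights.
    Returns the MST as a list of (parent_index, child_index) edges."""
    def d2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    n = len(points)
    in_tree = [False] * n
    best = [float("inf")] * n     # cheapest connection cost into the tree
    parent = [-1] * n
    best[0] = 0.0
    edges = []
    for _ in range(n):
        # Pick the cheapest vertex not yet in the tree (O(n^2) overall).
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        if parent[u] >= 0:
            edges.append((parent[u], u))
        for v in range(n):
            if not in_tree[v] and d2(points[u], points[v]) < best[v]:
                best[v] = d2(points[u], points[v])
                parent[v] = u
    return edges

# Foreground pixels of a tiny "L"-shaped glyph.
pixels = [(0, 0), (0, 1), (0, 2), (1, 0), (2, 0)]
tree = minimum_spanning_tree(pixels)
```

The quadratic cost of this naive construction over every pixel is exactly the expense the article seeks to avoid by moving to a compact DCT representation.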

  5. Using Trust to Establish a Secure Routing Model in Cognitive Radio Network.

    PubMed

    Zhang, Guanghua; Chen, Zhenguo; Tian, Liqin; Zhang, Dongwen

    2015-01-01

    To counter the selective forwarding attack on routing in cognitive radio networks, this paper proposes a trust-based secure routing model. By monitoring nodes' forwarding behavior, node trusts are constructed to identify malicious nodes. Because route selection must be closely coordinated with spectrum allocation, a route request piggybacking the available spectrum opportunities is sent to non-malicious nodes. In the routing decision phase, node trusts are used to construct available path trusts, which are combined with delay measurements to make routing decisions. At the same time, nodes' service requests receive different responses according to their trust class. Adopting stricter punishment of malicious behavior by non-trusted nodes stimulates the cooperation of nodes in routing. Simulation results and analysis indicate that this model performs well in network throughput and end-to-end delay under the selective forwarding attack.
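The routing decision can be sketched as follows, assuming, as is common in trust-based routing, that a path's trust is the product of its intermediate nodes' trusts; the topology, trust values, and threshold below are hypothetical.

```python
def path_trust(path, node_trust):
    # Trust of a route = product of the intermediate nodes' trusts.
    t = 1.0
    for node in path[1:-1]:
        t *= node_trust[node]
    return t

def select_route(paths, node_trust, delays, trust_threshold=0.5):
    # Keep paths whose trust clears the threshold; among those, prefer
    # higher trust first, then lower end-to-end delay as a tie-break.
    ok = [p for p in paths if path_trust(p, node_trust) >= trust_threshold]
    return min(ok, key=lambda p: (-path_trust(p, node_trust), delays[tuple(p)]))

# Node D has been observed dropping packets, so its trust is low.
node_trust = {"A": 1.0, "B": 0.9, "C": 0.95, "D": 0.3, "E": 1.0}
paths = [["A", "B", "C", "E"], ["A", "D", "E"]]
delays = {tuple(p): d for p, d in zip(paths, [12.0, 5.0])}
route = select_route(paths, node_trust, delays)
```

Even though the path through D is faster, its low trust disqualifies it, which is how the model sidesteps selective forwarders.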

  6. Model of two infectious diseases in nettle caterpillar population

    NASA Astrophysics Data System (ADS)

    Firdausi, F. Z.; Nuraini, N.

    2016-04-01

    Palm oil is a vital commodity for the economy of Indonesia. The area of oil palm plantations in Indonesia has increased from year to year. However, the effectiveness of palm oil production is reduced by pest infestation. One of the pests that often infest oil palm plantations is the nettle caterpillar. The pest control considered in this study is biological control, viz. biological agents applied to oil palm trees. This paper describes a mathematical model of two infectious diseases in a nettle caterpillar population. The two infectious diseases arise from two biological agents, namely the bacterium Bacillus thuringiensis and a parasite that usually attacks nettle caterpillars. The model is formulated as a system of ordinary differential equations without time delay. The equilibrium points are analyzed: two of the three equilibrium points are stable if the Routh-Hurwitz criteria are fulfilled. In addition, this paper presents numerical simulations of the constructed model.
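A minimal member of the model class described, two pathogens spreading in one susceptible host population, can be sketched with forward-Euler integration; the equations and parameter values are illustrative assumptions, not the paper's.

```python
def simulate(S0, I1_0, I2_0, beta1, beta2, gamma1, gamma2, dt=0.01, steps=5000):
    """Forward-Euler integration of a minimal two-pathogen SI model:
       S'  = -b1*S*I1 - b2*S*I2
       I1' =  b1*S*I1 - g1*I1      # Bt-infected caterpillars
       I2' =  b2*S*I2 - g2*I2      # parasite-infected caterpillars
    Fractions of the caterpillar population; removed hosts are not tracked."""
    S, I1, I2 = S0, I1_0, I2_0
    for _ in range(steps):
        dS  = -beta1 * S * I1 - beta2 * S * I2
        dI1 =  beta1 * S * I1 - gamma1 * I1
        dI2 =  beta2 * S * I2 - gamma2 * I2
        S, I1, I2 = S + dt * dS, I1 + dt * dI1, I2 + dt * dI2
    return S, I1, I2

# Hypothetical rates: the bacterium spreads faster than the parasite.
S, I1, I2 = simulate(S0=0.99, I1_0=0.005, I2_0=0.005,
                     beta1=0.6, beta2=0.4, gamma1=0.1, gamma2=0.1)
```

Numerical runs like this one complement the equilibrium analysis: the susceptible fraction declines as both infections take hold, while all compartments stay non-negative for the small step size used.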

  7. Object-Oriented Technology-Based Software Library for Operations of Water Reclamation Centers

    NASA Astrophysics Data System (ADS)

    Otani, Tetsuo; Shimada, Takehiro; Yoshida, Norio; Abe, Wataru

    SCADA systems in water reclamation centers have been constructed based on hardware and software that each manufacturer produced according to its own design. Even though this approach used to be effective for realizing real-time and reliable execution, it is an obstacle to reducing the cost of system construction and maintenance. A promising solution to this problem is to set specifications that can be used commonly. In terms of software, the information model approach has been adopted in SCADA systems in other fields, such as telecommunications and power systems. An information model is a piece of software specification that describes a physical or logical object to be monitored. In this paper, we propose information models for the operations of water reclamation centers, which did not previously exist. In addition, we show the feasibility of the information models in terms of common use and processing performance.

  8. Input-output identification of controlled discrete manufacturing systems

    NASA Astrophysics Data System (ADS)

    Estrada-Vargas, Ana Paula; López-Mellado, Ernesto; Lesage, Jean-Jacques

    2014-03-01

    The automated construction of discrete event models from observations of a system's external behaviour is addressed. This problem, often referred to as system identification, allows obtaining models of ill-known (or even unknown) systems. In this article, an identification method for discrete event systems (DESs) controlled by a programmable logic controller is presented. The method can process large quantities of observed long sequences of input/output signals generated by the controller and yields an interpreted Petri net model describing the closed-loop behaviour of the automated DES. The proposed technique allows the identification of actual complex systems because it is sufficiently efficient and well adapted to cope with both the technological characteristics of industrial controllers and data collection requirements. Based on polynomial-time algorithms, the method is implemented as an efficient software tool which constructs and draws the model automatically; an overview of this tool is given through a case study dealing with an automated manufacturing system.
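The starting point of this kind of black-box identification, collecting the observed transition relation between successive I/O vectors, can be sketched simply. This is only the first, generic step; it is not the cited method's interpreted-Petri-net synthesis.

```python
# First step of black-box identification: collect the observed transition
# relation between successive I/O states. A simplified sketch, not the
# interpreted-Petri-net synthesis of the cited method.

def observed_transitions(sequences):
    """Map each observed I/O state to the set of successor states."""
    succ = {}
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            succ.setdefault(current, set()).add(nxt)
    return succ

# Each state is a tuple of controller input/output signal values.
logs = [
    [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)],
    [(0, 0), (1, 0), (1, 1), (1, 0)],
]
model = observed_transitions(logs)
print(model[(1, 0)])  # {(1, 1)}
print(model[(1, 1)])  # {(0, 1), (1, 0)}
```

A real identifier would then fold this relation into places and transitions of a Petri net; here the point is only that long signal logs reduce to a compact successor map.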

  9. Study on dynamic performance of SOFC

    NASA Astrophysics Data System (ADS)

    Zhan, Haiyang; Liang, Qianchao; Wen, Qiang; Zhu, Runkai

    2017-05-01

    In order to solve the problem of real-time matching of load and fuel cell power, it is urgent to study the dynamic response of the SOFC under load mutation. A mathematical model of the SOFC is constructed and its performance is simulated. The model considers influence factors such as the polarization effect and ohmic loss, and also takes into account the diffusion effect, thermal effect, energy exchange, and mass and momentum conservation. A one-dimensional dynamic mathematical model of the SOFC is constructed using a distributed lumped-parameter method. The simulation results show that the I-V characteristic curves are in good agreement with the experimental data, verifying the accuracy of the model. The voltage response, power response and efficiency curves are obtained in this way. This lays a solid foundation for research on dynamic performance and optimal control in power generation systems of high-power fuel cell stacks.

  10. Superior memory efficiency of quantum devices for the simulation of continuous-time stochastic processes

    NASA Astrophysics Data System (ADS)

    Elliott, Thomas J.; Gu, Mile

    2018-03-01

    Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.

  11. Construction of a three-dimensional interactive model of the skull base and cranial nerves.

    PubMed

    Kakizawa, Yukinari; Hongo, Kazuhiro; Rhoton, Albert L

    2007-05-01

    The goal was to develop an interactive three-dimensional (3-D) computerized anatomic model of the skull base for teaching microneurosurgical anatomy and for operative planning. The 3-D model was constructed using commercially available software (Maya 6.0 Unlimited; Alias Systems Corp., Delaware, MD), a personal computer, four cranial specimens, and six dry bones. Photographs from at least two angles of the superior and lateral views were imported to the 3-D software. Many photographs were needed to produce the model in anatomically complex areas. Careful dissection was needed to expose important structures in the two views. Landmarks, including foramen, bone, and dura mater, were used as reference points. The 3-D model of the skull base and related structures was constructed using more than 300,000 remodeled polygons. The model can be viewed from any angle. It can be rotated 360 degrees in any plane using any structure as the focal point of rotation. The model can be reduced or enlarged using the zoom function. Variable transparencies could be assigned to any structures so that the structures at any level can be seen. Anatomic labels can be attached to the structures in the 3-D model for educational purposes. This computer-generated 3-D model can be observed and studied repeatedly without the time limitations and stresses imposed by surgery. This model may offer the potential to create interactive surgical exercises useful in evaluating multiple surgical routes to specific target areas in the skull base.

  12. The practical use of simplicity in developing ground water models

    USGS Publications Warehouse

    Hill, M.C.

    2006-01-01

    The advantages of starting with simple models and building complexity slowly can be significant in the development of ground water models. In many circumstances, simpler models are characterized by fewer defined parameters and shorter execution times. In this work, the number of parameters is used as the primary measure of simplicity and complexity; the advantages of shorter execution times also are considered. The ideas are presented in the context of constructing ground water models but are applicable to many fields. Simplicity first is put in perspective as part of the entire modeling process using 14 guidelines for effective model calibration. It is noted that neither very simple nor very complex models generally produce the most accurate predictions and that determining the appropriate level of complexity is an ill-defined process. It is suggested that a thorough evaluation of observation errors is essential to model development. Finally, specific ways are discussed to design useful ground water models that have fewer parameters and shorter execution times.

  13. Empirical Investigation of Critical Transitions in Paleoclimate

    NASA Astrophysics Data System (ADS)

    Loskutov, E. M.; Mukhin, D.; Gavrilov, A.; Feigin, A.

    2016-12-01

    In this work we apply a new empirical method for the analysis of complex spatially distributed systems to paleoclimate data. The method consists of two general parts: (i) revealing the optimal phase-space variables and (ii) constructing an empirical prognostic model from observed time series. The phase-space variable construction is based on decomposing the data into nonlinear dynamical modes; it was successfully applied to the global SST field, where it clearly separated time scales and revealed a climate shift in the observed data interval [1]. The second part, a Bayesian approach to reconstructing the optimal evolution operator from time series, is based on representing the evolution operator as a nonlinear stochastic function realized by artificial neural networks [2,3]. In this work we focus on critical transitions - abrupt changes in climate dynamics - in much longer time scale processes. It is well known that there were a number of critical transitions on different time scales in the past. We demonstrate the first results of applying our empirical methods to the analysis of paleoclimate variability. In particular, we discuss the possibility of detecting, identifying and predicting such critical transitions by means of nonlinear empirical modeling using paleoclimate record time series. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS). 1. Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510 2. Molkov, Ya. I., Mukhin, D. N., Loskutov, E. M., Feigin, A. M. (2012). Random dynamical models from time series. Phys. Rev. E, 85(3). 3. Mukhin, D., Kondrashov, D., Loskutov, E., Gavrilov, A., Feigin, A., & Ghil, M. (2015). Predicting Critical Transitions in ENSO models. Part II: Spatially Dependent Models. Journal of Climate, 28(5), 1962-1976. http://doi.org/10.1175/JCLI-D-14-00240.1

  14. Modelling of a mecanum wheel taking into account the geometry of road rollers

    NASA Astrophysics Data System (ADS)

    Hryniewicz, P.; Gwiazda, A.; Banaś, W.; Sękala, A.; Foit, K.

    2017-08-01

    During process planning in a company, one of the basic factors associated with production costs is the operation time for particular technological jobs. The operation time consists of the time units associated with the machining tasks of a workpiece as well as the time associated with loading, unloading and transport operations of this workpiece between machining stands. Full automation of manufacturing in industrial companies aims at a maximal reduction in machine downtime, thereby simultaneously decreasing fixed costs. A new construction of wheeled vehicles, using Mecanum wheels, reduces the transport time of materials and workpieces between machining stands. These vehicles can move simultaneously along two axes, allowing more rapid positioning of the vehicle relative to the machining stand. The Mecanum wheel construction implies placing free rollers around the wheel, mounted at an angle of 45°, which allow movement of the vehicle not only along its axis but also perpendicular to it. Improper selection of the rollers can cause unwanted vertical movement of the vehicle, which may cause difficulty in positioning the vehicle in relation to the machining stand and the need for stabilisation. Hence proper design of the free rollers is essential in designing the whole Mecanum wheel construction: it allows avoiding the disadvantageous and unwanted vertical vibrations of a whole vehicle with these wheels. In the article the process of modelling the free rollers, in order to obtain the desired unchanging, horizontal trajectory of the vehicle, is presented. This shape depends on the desired diameter of the whole Mecanum wheel, together with the road rollers, and the width of the drive wheel. Further factors related to the curvature of the trajectory are the length of the road roller and the decrease of its diameter depending on the position with respect to its centre. An additional factor limiting the construction of the road rollers is their bearings. Depending on the load carried by the vehicle and the rotational speed of the drive wheel, the bearings themselves can greatly affect the diameter of the rollers and of the whole Mecanum wheels. The solution of this problem is presented in the paper and illustrated with virtual models elaborated in an advanced CAE-class program.

  15. A Filter-Mediated Communication Model for Design Collaboration in Building Construction

    PubMed Central

    Oh, Minho

    2014-01-01

    Multidisciplinary collaboration is an important aspect of modern engineering activities, arising from the growing complexity of artifacts whose design and construction require knowledge and skills that exceed the capacities of any one professional. However, current collaboration in the architecture, engineering, and construction industries often fails due to lack of shared understanding between different participants and limitations of their supporting tools. To achieve a high level of shared understanding, this study proposes a filter-mediated communication model. In the proposed model, participants retain their own data in the form most appropriate for their needs, with domain-specific filters that transform neutral representations into semantically rich ones, as needed by the participants. Conversely, the filters can translate semantically rich, domain-specific data into a neutral representation that can be accessed by other domain-specific filters. To validate the feasibility of the proposed model, we computationally implement the filter mechanism and apply it to a hypothetical test case. The results confirm that the filter mechanism can let participants know ahead of time what the implications of their proposed actions will be, as seen from other participants' points of view. PMID:25309958

  16. Co-production and time use. Influence on product evaluation.

    PubMed

    Heide, Morten; Olsen, Svein Ottar

    2011-02-01

    This study analyses how time use influences consumers' evaluation of a product and their satisfaction with the co-production activity. It also includes hypotheses about how knowledge and perceived convenience are related to the evaluative constructs. The constructs are checked for reliability and validity before using structural equation modelling in Lisrel to estimate the relationships between the constructs and their measures. The results showed that time use had a negative influence on perceived convenience and a positive effect on satisfaction with co-production, but did not influence the global evaluation of the product. Satisfaction with co-production and perceived convenience had a positive influence on the global evaluation. Knowledge had a negative influence on time use. Finally, knowledge and perceived convenience had a positive relationship with satisfaction with co-production. In total, seven out of nine hypotheses are supported by the data. The study suggests that time use, perceived convenience, and satisfaction with co-production can be important variables in understanding the evaluative outcome of a co-produced product. Time use thus plays a dual role: something the consumer wants to minimize for reasons of convenience, or to extend in order to be satisfied with the co-production effort. The paper presents new insights into how co-production and time use influence product evaluation. Copyright © 2010 Elsevier Ltd. All rights reserved.

  17. Risk factors for injury among construction workers at Denver International Airport.

    PubMed

    Lowery, J T; Borgerding, J A; Zhen, B; Glazner, J E; Bondy, J; Kreiss, K

    1998-08-01

    The Denver International Airport construction project provided a rare opportunity to identify risk factors for injury on a large construction project for which 769 contractors were hired to complete 2,843 construction contracts. Workers' compensation claims and payroll data for individual contracts were recorded in an administrative database developed by the project's Owner-Controlled Insurance Program. From claims and payroll data linked with employee demographic information, we calculated injury rates per 200,000 person-hours by contract and over contract characteristics of interest. We used Poisson regression models to examine contract-specific risk factors in relation to total injuries, lost-work-time (LWT) injuries, and non-LWT injuries. We included contract-specific expected loss rates (ELRs) in the model to control for prevailing risk of work and used logistic regression methods to determine the association between LWT and non-LWT injuries on contracts. Injury rates were highest during the first year of construction, at the beginning of contracts, and among older workers. Risk for total and non-LWT injuries was elevated for building construction contracts, contracts for special trades companies (SIC 17), contracts with payrolls over $1 million, and those with overtime payrolls greater than 20%. Risk for LWT injuries only was increased for site development contracts and contracts starting in the first year of construction. Contracts experiencing one or more minor injuries were four times as likely to have at least one major injury (OR = 4.0, 95% CI (2.9, 5.5)). Enhancement of DIA's safety infrastructure during the second year of construction appears to have been effective in reducing serious (LWT) injuries. The absence of correlation between injury rates among contracts belonging to the same company suggests that targeting safety resources at the level of the contract may be an effective approach to injury prevention. Interventions focused on high-risk contracts, including those with considerable overtime work, contracts held by special trades contractors (SIC 17), and contracts belonging to small and mid-sized companies, and on high-risk workers, such as those new to a construction site or new to a contract, may reduce the injury burden on large construction sites. The joint occurrence of minor and major injuries on a contract level suggests that surveillance of minor injuries may be useful in identifying opportunities for prevention of major injuries.

  18. A memory structure adapted simulated annealing algorithm for a green vehicle routing problem.

    PubMed

    Küçükoğlu, İlker; Ene, Seval; Aksoy, Aslı; Öztürk, Nursel

    2015-03-01

    Currently, reduction of carbon dioxide (CO2) emissions and fuel consumption has become a critical environmental problem and has attracted the attention of both academia and the industrial sector. Government regulations and customer demands are making environmental responsibility an increasingly important factor in overall supply chain operations. Within these operations, transportation has the most hazardous effects on the environment, i.e., CO2 emissions, fuel consumption, noise and toxic effects on the ecosystem. This study aims to construct vehicle routes with time windows that minimize the total fuel consumption and CO2 emissions. The green vehicle routing problem with time windows (G-VRPTW) is formulated using a mixed integer linear programming model. A memory structure adapted simulated annealing (MSA-SA) meta-heuristic algorithm is constructed due to the high complexity of the proposed problem and long solution times for practical applications. The proposed models are integrated with a fuel consumption and CO2 emissions calculation algorithm that considers the vehicle technical specifications, vehicle load, and transportation distance in a green supply chain environment. The proposed models are validated using well-known instances with different numbers of customers. The computational results indicate that the MSA-SA heuristic is capable of obtaining good G-VRPTW solutions within a reasonable amount of time by providing reductions in fuel consumption and CO2 emissions.
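The simulated-annealing acceptance rule at the core of such a meta-heuristic can be shown generically. The sketch below is plain 2-opt annealing on Euclidean tour length; it omits the paper's memory structure, time windows, and fuel/CO2 objective, and every parameter value (initial temperature, cooling rate, iteration count) is an illustrative assumption.

```python
import math
import random

# Minimal simulated-annealing sketch for route improvement. Plain Euclidean
# tour length stands in for the G-VRPTW fuel/emissions objective.

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal(pts, t0=10.0, cooling=0.995, iters=5000, seed=1):
    rng = random.Random(seed)
    tour = list(range(len(pts)))
    best = list(tour)
    t = t0
    for _ in range(iters):
        i, j = sorted(rng.sample(range(len(pts)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt reversal
        delta = tour_length(cand, pts) - tour_length(tour, pts)
        # Accept improvements always; worse moves with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            tour = cand
        if tour_length(tour, pts) < tour_length(best, pts):
            best = list(tour)
        t *= cooling  # geometric cooling schedule
    return best

pts = [(0, 0), (0, 1), (1, 1), (1, 0), (2, 0), (2, 1)]
best = anneal(pts)
print(tour_length(best, pts))  # close to the optimal loop length of 6.0
```

The memory structure of the MSA-SA algorithm would additionally record visited solutions to avoid re-evaluating them; that bookkeeping is left out here.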

  19. Distributed Finite-Time Cooperative Control of Multiple High-Order Nonholonomic Mobile Robots.

    PubMed

    Du, Haibo; Wen, Guanghui; Cheng, Yingying; He, Yigang; Jia, Ruting

    2017-12-01

    The consensus problem of multiple nonholonomic mobile robots in the form of high-order chained structure is considered in this paper. Based on the model features and the finite-time control technique, a finite-time cooperative controller is explicitly constructed which guarantees that the states consensus is achieved in a finite time. As an application of the proposed results, finite-time formation control of multiple wheeled mobile robots is studied and a finite-time formation control algorithm is proposed. To show effectiveness of the proposed approach, a simulation example is given.
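The finite-time technique the abstract relies on can be illustrated with the standard first-order protocol x_i' = sum_j sig(x_j - x_i)^alpha with 0 < alpha < 1, simulated by forward Euler on a complete graph. This shows the general finite-time consensus idea, not the paper's high-order nonholonomic controller; gains, alpha, and the graph are assumptions.

```python
# First-order finite-time consensus sketch on a complete graph.
# sig(v)^alpha = sign(v) * |v|^alpha with 0 < alpha < 1 gives convergence
# in finite time, unlike linear consensus. Illustrative, not the paper's
# high-order nonholonomic controller.

def sig(v, alpha):
    return (1 if v >= 0 else -1) * abs(v) ** alpha

def simulate(x, alpha=0.5, dt=0.001, steps=20000):
    x = list(x)
    n = len(x)
    for _ in range(steps):
        u = [sum(sig(x[j] - x[i], alpha) for j in range(n)) for i in range(n)]
        x = [x[i] + dt * u[i] for i in range(n)]
    return x

x = simulate([0.0, 2.0, 5.0, -1.0])
spread = max(x) - min(x)
print(spread)  # near zero: the states have reached consensus
```

Because the protocol is antisymmetric, the state average is conserved, so the agents meet at the initial mean.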

  20. LOGAM (Logistic Analysis Model). Volume 2. Users Manual.

    DTIC Science & Technology

    1982-08-01

    as opposed to simulation models which represent a system's behavior as a function of time. These latter classes of models are often complex. They...includes the cost of ammunition and missiles consumed by the system being costed during unit training. Excluded is the cost of ammunition consumed during...data. The results obtained from sensitivity testing may be used to construct graphs which display the behavior of the maintenance concept over the range

  1. Full-wave Moment Tensor and Tomographic Inversions Based on 3D Strain Green Tensor

    DTIC Science & Technology

    2010-01-31

    propagation in three-dimensional (3D) earth, linearizes the inverse problem by iteratively updating the earth model, and provides an accurate way to...self-consistent FD-SGT databases constructed from finite-difference simulations of wave propagation in full-wave tomographic models can be used to...determine the moment tensors within minutes after a seismic event, making it possible for real time monitoring using 3D models.

  2. Observational constraints on holographic tachyonic dark energy in interaction with dark matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Micheletti, Sandro M. R., E-mail: smrm@fma.if.usp.br

    2010-05-01

    We discuss an interacting tachyonic dark energy model in the context of the holographic principle. The potential of the holographic tachyon field in interaction with dark matter is constructed. The model results are compared with the CMB shift parameter, baryonic acoustic oscillations, lookback time and the Constitution supernovae sample. The coupling constant of the model is compatible with zero, but dark energy is not given by a cosmological constant.

  3. From standard alpha-stable Lévy motions to horizontal visibility networks: dependence of multifractal and Laplacian spectrum

    NASA Astrophysics Data System (ADS)

    Zou, Hai-Long; Yu, Zu-Guo; Anh, Vo; Ma, Yuan-Lin

    2018-05-01

    In recent years, researchers have proposed several methods to transform time series (such as those of fractional Brownian motion) into complex networks. In this paper, we construct horizontal visibility networks (HVNs) based on α-stable Lévy motion. We aim to study how the multifractal and Laplacian spectra of the transformed networks depend on the parameters of the α-stable Lévy motion. First, we employ the sandbox algorithm to compute the mass exponents and multifractal spectrum to investigate the multifractality of these HVNs. Then we perform least squares fits to find possible relations of the average fractal dimension, the average information dimension and the average correlation dimension against α, using several methods of model selection. We also investigate possible dependence on α of the eigenvalues and energy, calculated from the Laplacian and normalized Laplacian operators of the constructed HVNs. All of these constructions and estimates will help us to evaluate the validity and usefulness of the mappings between time series and networks, especially between time series of α-stable Lévy motions and HVNs.
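The horizontal visibility mapping used here is a standard construction: time points i and j are linked iff every intermediate value lies strictly below min(x_i, x_j). A brute-force O(n²) sketch (the cited work applies this to α-stable Lévy motion samples; the short series below is just for illustration):

```python
# Standard horizontal visibility graph (HVG) construction: nodes i, j are
# linked iff x_k < min(x_i, x_j) for all i < k < j. Brute-force sketch.

def hvg_edges(x):
    n = len(x)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

series = [3.0, 1.0, 2.0, 4.0, 1.5]
print(sorted(hvg_edges(series)))
# [(0, 1), (0, 2), (0, 3), (1, 2), (2, 3), (3, 4)]
```

Adjacent points are always linked, so every HVG is connected; the multifractal and spectral analysis in the abstract is then performed on the resulting graph.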

  4. Warped product space-times

    NASA Astrophysics Data System (ADS)

    An, Xinliang; Wong, Willie Wai Yeung

    2018-01-01

    Many classical results in relativity theory concerning spherically symmetric space-times have easy generalizations to warped product space-times, with a two-dimensional Lorentzian base and arbitrary dimensional Riemannian fibers. We first give a systematic presentation of the main geometric constructions, with emphasis on the Kodama vector field and the Hawking energy; the construction is signature independent. This leads to proofs of general Birkhoff-type theorems for warped product manifolds; our theorems in particular apply to situations where the warped product manifold is not necessarily Einstein, and thus can be applied to solutions with matter content in general relativity. Next we specialize to the Lorentzian case and study the propagation of null expansions under the assumption of the dominant energy condition. We prove several non-existence results relating to the Yamabe class of the fibers, in the spirit of the black-hole topology theorem of Hawking–Galloway–Schoen. Finally we discuss the effect of the warped product ansatz on matter models. In particular we construct several cosmological solutions to the Einstein–Euler equations whose spatial geometry is generally not isotropic.

  5. Using SAS PROC CALIS to fit Level-1 error covariance structures of latent growth models.

    PubMed

    Ding, Cherng G; Jane, Ten-Der

    2012-09-01

    In the present article, we demonstrate the use of SAS PROC CALIS to fit various types of Level-1 error covariance structures of latent growth models (LGMs). Advantages of the SEM approach, on which PROC CALIS is based, include the capability of modeling change over time for latent constructs measured by multiple indicators; embedding an LGM into a larger latent variable model; incorporating measurement models for latent predictors; better assessment of model fit; and flexibility in specifying error covariance structures. The strength of PROC CALIS comes with technical coding work, which needs to be specifically addressed. We provide a tutorial on the SAS syntax for modeling the growth of a manifest variable and the growth of a latent construct, focusing the documentation on the specification of Level-1 error covariance structures. Illustrations are conducted with data generated from two given latent growth models. The coding provided is helpful when the growth model has been well determined and the Level-1 error covariance structure is to be identified.

  6. Dynamic modeling of potentially conflicting energy reduction strategies for residential structures in semi-arid climates.

    PubMed

    Hester, Nathan; Li, Ke; Schramski, John R; Crittenden, John

    2012-04-30

    Globally, residential energy consumption continues to rise due to a variety of trends such as increasing access to modern appliances, overall population growth, and the overall increase of electricity distribution. Currently, residential energy consumption accounts for approximately one-fifth of total U.S. energy consumption. This research analyzes the effectiveness of a range of energy-saving measures for residential houses in semi-arid climates. These energy-saving measures include: structural insulated panels (SIP) for exterior wall construction, daylight control, increased window area, efficient window glass suitable for the local weather, and several combinations of these. Our model determined that energy consumption is reduced by up to 6.1% when multiple energy savings technologies are combined. In addition, pre-construction technologies (structural insulated panels (SIPs), daylight control, and increased window area) provide roughly 4 times the energy savings when compared to post-construction technologies (window blinds and efficient window glass). The model also illuminated the importance of variations in local climate and building configuration, highlighting the site-specific nature of this type of energy consumption quantification for policy and building code considerations. Published by Elsevier Ltd.

  7. Microscale Characterization of the Viscoelastic Properties of Hydrogel Biomaterials using Dual-Mode Ultrasound Elastography

    PubMed Central

    Hong, Xiaowei; Stegemann, Jan P.; Deng, Cheri X.

    2016-01-01

    Characterization of the microscale mechanical properties of biomaterials is a key challenge in the field of mechanobiology. Dual-mode ultrasound elastography (DUE) uses high frequency focused ultrasound to induce compression in a sample, combined with interleaved ultrasound imaging to measure the resulting deformation. This technique can be used to non-invasively perform creep testing on hydrogel biomaterials to characterize their viscoelastic properties. DUE was applied to a range of hydrogel constructs consisting of either hydroxyapatite (HA)-doped agarose, HA-collagen, HA-fibrin, or preosteoblast-seeded collagen constructs. DUE provided spatial and temporal mapping of local and bulk displacements and strains at high resolution. Hydrogel materials exhibited characteristic creep behavior, and the maximum strain and residual strain were both material- and concentration-dependent. Burger’s viscoelastic model was used to extract characteristic parameters describing material behavior. Increased protein concentration resulted in greater stiffness and viscosity, but did not affect the viscoelastic time constant of acellular constructs. Collagen constructs exhibited significantly higher modulus and viscosity than fibrin constructs. Cell-seeded collagen constructs became stiffer with altered mechanical behavior as they developed over time. Importantly, DUE also provides insight into the spatial variation of viscoelastic properties at sub-millimeter resolution, allowing interrogation of the interior of constructs. DUE presents a novel technique for non-invasively characterizing hydrogel materials at the microscale, and therefore may have unique utility in the study of mechanobiology and the characterization of hydrogel biomaterials. PMID:26928595

  8. Microscale characterization of the viscoelastic properties of hydrogel biomaterials using dual-mode ultrasound elastography.

    PubMed

    Hong, Xiaowei; Stegemann, Jan P; Deng, Cheri X

    2016-05-01

    Characterization of the microscale mechanical properties of biomaterials is a key challenge in the field of mechanobiology. Dual-mode ultrasound elastography (DUE) uses high frequency focused ultrasound to induce compression in a sample, combined with interleaved ultrasound imaging to measure the resulting deformation. This technique can be used to non-invasively perform creep testing on hydrogel biomaterials to characterize their viscoelastic properties. DUE was applied to a range of hydrogel constructs consisting of either hydroxyapatite (HA)-doped agarose, HA-collagen, HA-fibrin, or preosteoblast-seeded collagen constructs. DUE provided spatial and temporal mapping of local and bulk displacements and strains at high resolution. Hydrogel materials exhibited characteristic creep behavior, and the maximum strain and residual strain were both material- and concentration-dependent. Burger's viscoelastic model was used to extract characteristic parameters describing material behavior. Increased protein concentration resulted in greater stiffness and viscosity, but did not affect the viscoelastic time constant of acellular constructs. Collagen constructs exhibited significantly higher modulus and viscosity than fibrin constructs. Cell-seeded collagen constructs became stiffer with altered mechanical behavior as they developed over time. Importantly, DUE also provides insight into the spatial variation of viscoelastic properties at sub-millimeter resolution, allowing interrogation of the interior of constructs. DUE presents a novel technique for non-invasively characterizing hydrogel materials at the microscale, and therefore may have unique utility in the study of mechanobiology and the characterization of hydrogel biomaterials. Copyright © 2016 Elsevier Ltd. All rights reserved.
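The Burger's (four-parameter) viscoelastic model used in these two records to extract creep parameters has a closed-form creep response: a Maxwell element (E1, eta1) in series with a Kelvin-Voigt element (E2, eta2) under constant stress. The sketch below uses illustrative parameter values, not fitted DUE measurements.

```python
import math

# Creep strain of the Burger's viscoelastic model under constant stress
# sigma0: instantaneous elastic response + unbounded viscous flow +
# delayed elastic response with retardation time eta2/E2.
# Parameter values are illustrative, not fitted DUE data.

def burgers_creep(t, sigma0, E1, eta1, E2, eta2):
    elastic = sigma0 / E1                       # instantaneous response
    viscous = sigma0 * t / eta1                 # unbounded viscous flow
    delayed = (sigma0 / E2) * (1.0 - math.exp(-E2 * t / eta2))
    return elastic + viscous + delayed

params = dict(sigma0=1.0, E1=10.0, eta1=200.0, E2=5.0, eta2=50.0)
print(burgers_creep(0.0, **params))   # 0.1 (purely elastic at t = 0)
for t in (1.0, 10.0, 100.0):
    print(round(burgers_creep(t, **params), 4))
```

Fitting E1, eta1, E2 and eta2 to a measured strain-versus-time creep curve is how the characteristic stiffness, viscosity, and time-constant parameters in the abstracts are extracted.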

  9. Neural integration underlying a time-compensated sun compass in the migratory monarch butterfly

    PubMed Central

    Shlizerman, Eli; Phillips-Portillo, James; Reppert, Steven M.

    2016-01-01

    Migrating Eastern North American monarch butterflies use a time-compensated sun compass to adjust their flight to the southwest direction. While the antennal genetic circadian clock and the azimuth of the sun are instrumental for proper function of the compass, it is unclear how these signals are represented on a neuronal level and how they are integrated to produce flight control. To address these questions, we constructed a receptive field model of the compound eye that encodes the solar azimuth. We then derived a neural circuit model, which integrates azimuthal and circadian signals to correct flight direction. The model demonstrates an integration mechanism, which produces robust trajectories reaching the southwest regardless of the time of day and includes a configuration for remigration. Comparison of model simulations with flight trajectories of butterflies in a flight simulator shows analogous behaviors and affirms the prediction that midday is the optimal time for migratory flight. PMID:27149852

  10. The Use of Cyclone Modeling in the Erection of Precast Segmental Aerial Construction.

    DTIC Science & Technology

    1983-06-01

activity or completion of a work cycle within an activity. Marvin E. Mundel outlines the three methods as: 1. CONTINUOUS TIMING In continuous timing...technique described by Mundel and outlined in Chapter II in which actual times are recorded and durations for activities obtained later by successive...8. 28. Thomas, p. 264. 29. Thomas and Holland, p. 520. 30. Thomas and Holland, p. 521. 31. Marvin E. Mundel, Motion and Time Study, Improving

  11. Social Models Enhance Apes' Memory for Novel Events.

    PubMed

    Howard, Lauren H; Wagner, Katherine E; Woodward, Amanda L; Ross, Stephen R; Hopper, Lydia M

    2017-01-20

Nonhuman primates are more likely to learn from the actions of a social model than from a non-social "ghost display"; however, the mechanism underlying this effect is still unknown. One possibility is that live models are more engaging, drawing increased attention to social stimuli. However, recent research with humans has suggested that live models fundamentally alter memory, not low-level attention. In the current study, we developed a novel eye-tracking paradigm to disentangle the influence of social context on attention and memory in apes. Tested in two conditions, zoo-housed apes (2 gorillas, 5 chimpanzees) were familiarized to videos of a human hand (social condition) and a mechanical claw (non-social condition) constructing a three-block tower. During the memory test, subjects viewed side-by-side pictures of the previously-constructed block tower and a novel block tower. In accordance with looking-time paradigms, increased looking time to the novel block tower was used to measure event memory. Apes evidenced memory for the event featuring a social model, though not for the non-social condition. This effect was not dependent on attention differences to the videos. These findings provide the first evidence that, like humans, social stimuli increase nonhuman primates' event memory, which may aid in information transmission via social learning.

  12. Social Models Enhance Apes’ Memory for Novel Events

    PubMed Central

    Howard, Lauren H.; Wagner, Katherine E.; Woodward, Amanda L.; Ross, Stephen R.; Hopper, Lydia M.

    2017-01-01

Nonhuman primates are more likely to learn from the actions of a social model than from a non-social “ghost display”; however, the mechanism underlying this effect is still unknown. One possibility is that live models are more engaging, drawing increased attention to social stimuli. However, recent research with humans has suggested that live models fundamentally alter memory, not low-level attention. In the current study, we developed a novel eye-tracking paradigm to disentangle the influence of social context on attention and memory in apes. Tested in two conditions, zoo-housed apes (2 gorillas, 5 chimpanzees) were familiarized to videos of a human hand (social condition) and a mechanical claw (non-social condition) constructing a three-block tower. During the memory test, subjects viewed side-by-side pictures of the previously-constructed block tower and a novel block tower. In accordance with looking-time paradigms, increased looking time to the novel block tower was used to measure event memory. Apes evidenced memory for the event featuring a social model, though not for the non-social condition. This effect was not dependent on attention differences to the videos. These findings provide the first evidence that, like humans, social stimuli increase nonhuman primates’ event memory, which may aid in information transmission via social learning. PMID:28106098

  13. Space can substitute for time in predicting climate-change effects on biodiversity.

    PubMed

    Blois, Jessica L; Williams, John W; Fitzpatrick, Matthew C; Jackson, Stephen T; Ferrier, Simon

    2013-06-04

"Space-for-time" substitution is widely used in biodiversity modeling to infer past or future trajectories of ecological systems from contemporary spatial patterns. However, the foundational assumption, that drivers of spatial gradients of species composition also drive temporal changes in diversity, is rarely tested. Here, we empirically test the space-for-time assumption by constructing orthogonal datasets of compositional turnover of plant taxa and climatic dissimilarity through time and across space from Late Quaternary pollen records in eastern North America, then modeling climate-driven compositional turnover. Predictions relying on space-for-time substitution were ∼72% as accurate as "time-for-time" predictions. However, space-for-time substitution performed poorly during the Holocene, when temporal variation in climate was small relative to spatial variation, and required subsampling to match the extent of spatial and temporal climatic gradients. Despite this caution, our results generally support the judicious use of space-for-time substitution in modeling community responses to climate change.

  14. Construction of moment-matching multinomial lattices using Vandermonde matrices and Gröbner bases

    NASA Astrophysics Data System (ADS)

Lundengård, Karl; Ogutu, Carolyne; Silvestrov, Sergei; Ni, Ying; Weke, Patrick

    2017-01-01

In order to describe and analyze the quantitative behavior of stochastic processes, such as the process followed by a financial asset, various discretization methods are used. One such class of methods is lattice models, where a time interval is divided into equal time steps and the rate of change of the process is restricted to a particular set of values in each time step. The well-known binomial and trinomial models are the most commonly used in applications, although several kinds of higher-order models have also been examined. Here we examine various ways of designing higher-order lattice schemes with different node placements in order to guarantee moment-matching with the process.
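As classical background for the moment-matching idea (the binomial special case, not the paper's multinomial construction), the Cox-Ross-Rubinstein lattice chooses its node placement and probability so that the first moment of the one-step price ratio matches the lognormal process exactly, and the second moment to O(dt^2):

```python
import math

def crr_step(r, sigma, dt):
    """One step of the Cox-Ross-Rubinstein binomial lattice: up/down
    factors and risk-neutral probability chosen so the lattice matches
    the first moment of the lognormal price ratio exactly and the
    second moment to O(dt^2)."""
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)
    return u, d, p

r, sigma, dt = 0.05, 0.2, 1.0 / 252
u, d, p = crr_step(r, sigma, dt)

m1 = p * u + (1.0 - p) * d            # lattice E[S_{t+dt}/S_t]
m2 = p * u ** 2 + (1.0 - p) * d ** 2  # lattice E[(S_{t+dt}/S_t)^2]
target1 = math.exp(r * dt)                       # exact first moment
target2 = math.exp((2.0 * r + sigma ** 2) * dt)  # exact second moment
```

Higher-order multinomial lattices of the kind studied in the paper add more branches per step so that more moments can be matched simultaneously.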

  15. Future mission studies: Forecasting solar flux directly from its chaotic time series

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.

    1991-01-01

    The mathematical structure of the programs written to construct a nonlinear predictive model to forecast solar flux directly from its time series without reference to any underlying solar physics is presented. This method and the programs are written so that one could apply the same technique to forecast other chaotic time series, such as geomagnetic data, attitude and orbit data, and even financial indexes and stock market data. Perhaps the most important application of this technique to flight dynamics is to model Goddard Trajectory Determination System (GTDS) output of residues between observed position of spacecraft and calculated position with no drag (drag flag = off). This would result in a new model of drag working directly from observed data.
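The report's actual programs are not reproduced here; a minimal sketch of the generic technique (delay-coordinate embedding plus nearest-neighbour prediction, demonstrated on the chaotic logistic map rather than solar flux) might look like:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Build delay-coordinate vectors [x(t), x(t+tau), ..., x(t+(dim-1)tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def nn_forecast(x, dim=3, tau=1, k=4):
    """Predict the next sample as the average one-step evolution of the
    k nearest neighbours of the current state in the embedded space."""
    emb = delay_embed(x, dim, tau)
    current, history = emb[-1], emb[:-1]
    targets = x[(dim - 1) * tau + 1 :]   # where each past state went next
    dists = np.linalg.norm(history - current, axis=1)
    nearest = np.argsort(dists)[:k]
    return targets[nearest].mean()

# Demo on the chaotic logistic map x_{n+1} = 4 x_n (1 - x_n).
x = np.empty(500); x[0] = 0.3
for i in range(499):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
pred = nn_forecast(x[:-1])   # forecast the held-out last sample
```

As the record notes, the same machinery applies unchanged to any scalar chaotic series (geomagnetic data, orbit residuals, financial indexes) once a suitable embedding dimension and delay are chosen.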

  16. An Assessment of Global Organic Carbon Flux Along Continental Margins

    NASA Technical Reports Server (NTRS)

    Thunell, Robert

    2004-01-01

    This project was designed to use real-time and historical SeaWiFS and AVHRR data, and real-time MODIS data in order to estimate the global vertical carbon flux along continental margins. This required construction of an empirical model relating surface ocean color and physical variables like temperature and wind to vertical settling flux at sites co-located with sediment trap observations (Santa Barbara Basin, Cariaco Basin, Gulf of California, Hawaii, and Bermuda, etc), and application of the model to imagery in order to obtain spatially-weighted estimates.

  17. Wind Energy System Time-domain (WEST) analyzers using hybrid simulation techniques

    NASA Technical Reports Server (NTRS)

    Hoffman, J. A.

    1979-01-01

    Two stand-alone analyzers constructed for real time simulation of the complex dynamic characteristics of horizontal-axis wind energy systems are described. Mathematical models for an aeroelastic rotor, including nonlinear aerodynamic and elastic loads, are implemented with high speed digital and analog circuitry. Models for elastic supports, a power train, a control system, and a rotor gimbal system are also included. Limited correlation efforts show good comparisons between results produced by the analyzers and results produced by a large digital simulation. The digital simulation results correlate well with test data.

  18. Continuous Time Random Walk and Migration-Proliferation Dichotomy of Brain Cancer

    NASA Astrophysics Data System (ADS)

    Iomin, A.

    A theory of fractional kinetics of glial cancer cells is presented. A role of the migration-proliferation dichotomy in the fractional cancer cell dynamics in the outer-invasive zone is discussed and explained in the framework of a continuous time random walk. The main suggested model is based on a construction of a 3D comb model, where the migration-proliferation dichotomy becomes naturally apparent and the outer-invasive zone of glioma cancer is considered as a fractal composite with a fractal dimension Dfr < 3.

  19. A Time-constrained Network Voronoi Construction and Accessibility Analysis in Location-based Service Technology

    NASA Astrophysics Data System (ADS)

    Yu, W.; Ai, T.

    2014-11-01

Accessibility analysis usually requires special models of spatial location analysis based on geometric constructions, such as the Voronoi diagram (abbreviated to VD). There are many achievements in classic Voronoi model research; however, they suffer from the following limitations in location-based services (LBS) applications. (1) It is difficult to objectively reflect the actual service areas of facilities by using traditional planar VDs, because human activities in LBS are usually constrained to the network portion of planar space. (2) Although some researchers have adopted network distance to construct VDs, their approaches are used in a static environment, where unrealistic measures of shortest-path distance based on assumptions of constant travel speeds through the network are often used. (3) Due to the computational complexity of calculating shortest-path distances, previous approaches tend to be very time-consuming, especially for large datasets and when multiple runs are required. To solve the above problems, a novel algorithm is developed in this paper. We apply a network-based quadrat system and 1-D sequential expansion to find the corresponding subnetwork for each focus. The idea is inspired by the natural phenomenon that water flow extends along certain linear channels until it meets others or arrives at the end of its route. In order to accommodate changes in traffic conditions, the length of a network quadrat is set according to the traffic conditions of the corresponding street. The method has an advantage over Dijkstra's algorithm in that the cost of repeated shortest-path computation is avoided and replaced with a linear-time operation.
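The paper's network-quadrat algorithm is not reproduced here; as a baseline sketch of the underlying idea of growing all service areas simultaneously over travel times, a multi-source Dijkstra expansion can be written as follows (the tiny graph and its weights are invented for illustration):

```python
import heapq
from collections import defaultdict

def network_voronoi(adj, facilities):
    """Assign every node of a weighted graph to its nearest facility by
    travel time, via a multi-source Dijkstra expansion: all facilities
    are seeded at distance 0 and grow outward simultaneously, so each
    node is claimed by the first wavefront that reaches it."""
    dist = {f: 0.0 for f in facilities}
    owner = {f: f for f in facilities}
    pq = [(0.0, f, f) for f in facilities]
    heapq.heapify(pq)
    while pq:
        d, node, fac = heapq.heappop(pq)
        if d > dist.get(node, float("inf")):
            continue   # stale queue entry
        for nbr, w in adj[node]:
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                owner[nbr] = fac
                heapq.heappush(pq, (nd, nbr, fac))
    return owner, dist

# Tiny street network: edge weights are travel times, not lengths, so a
# congested (slow) street shrinks the service area that crosses it.
edges = [("a", "b", 1.0), ("b", "c", 5.0), ("c", "d", 1.0), ("b", "d", 1.0)]
adj = defaultdict(list)
for u, v, w in edges:
    adj[u].append((v, w)); adj[v].append((u, w))
owner, dist = network_voronoi(adj, ["a", "d"])
```

Using travel time rather than length as the edge weight is what lets the partition respond to traffic conditions, the same motivation as the paper's time-constrained quadrats.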

  20. Improved representation of situational awareness within a dismounted small combat unit constructive simulation

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Colony, Mike

    2011-06-01

Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to representation of situation awareness without inclusion of the many uncertainties associated with real-world combat environments. The goal of this research is to provide an ability to model situation awareness and decision process uncertainties in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through the application of two critical techniques that allow Bayesian network technology to be applied to real-time applications: an Object-Oriented Bayesian Network methodology and an Object-Oriented Inference technique. In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including confidence of source, information completeness, and information loss. We also demonstrate that Army-PRIDE improves the realism of the current constructive simulation's decision processes through Monte Carlo simulation.

  1. Construction of pore network models for Berea and Fontainebleau sandstones using non-linear programing and optimization techniques

    NASA Astrophysics Data System (ADS)

    Sharqawy, Mostafa H.

    2016-12-01

Pore network models (PNM) of Berea and Fontainebleau sandstones were constructed using nonlinear programming (NLP) and optimization methods. The constructed PNMs are considered a digital representation of the rock samples, based on matching the macroscopic properties of the porous media, and were used to conduct fluid transport simulations including single- and two-phase flow. The PNMs consisted of cubic networks of randomly distributed pore and throat sizes with various connectivity levels. The networks were optimized such that the upper and lower bounds of the pore sizes are determined using the capillary tube bundle model and the Nelder-Mead method instead of guessing them, which reduces the optimization computational time significantly. An open-source PNM framework was employed to conduct transport and percolation simulations such as invasion percolation and Darcian flow. The PNM was subsequently used to compute the macroscopic properties: porosity, absolute permeability, specific surface area, breakthrough capillary pressure, and the primary drainage curve. The pore networks were optimized so that the simulation results for the macroscopic properties are in excellent agreement with the experimental measurements. This study demonstrates that nonlinear programming and optimization methods provide a promising approach to pore network modeling when computed tomography imaging may not be readily available.
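As a toy sketch of the calibration idea (tuning distribution parameters by optimization until macroscopic targets are matched), the snippet below fits the lognormal pore-size parameters of a capillary-tube-bundle surrogate with Nelder-Mead; the targets and tube density are illustrative assumptions, not the Berea or Fontainebleau data, and the paper's actual pore-network simulations are far richer:

```python
import math
from scipy.optimize import minimize

# Calibrate a capillary-tube-bundle stand-in for a pore network so its
# porosity and permeability hit target values (illustrative numbers).
n_tubes = 1.0e10            # tubes per unit cross-sectional area [1/m^2]
phi_t, k_t = 0.20, 1.0e-12  # target porosity [-] and permeability [m^2]

def bundle_props(mu, sigma):
    """Porosity and permeability of parallel capillary tubes with
    lognormal(mu, sigma) radii; Hagen-Poiseuille flow gives
    k = phi * E[r^4] / (8 E[r^2])."""
    Er2 = math.exp(2.0 * mu + 2.0 * sigma ** 2)
    Er4 = math.exp(4.0 * mu + 8.0 * sigma ** 2)
    phi = n_tubes * math.pi * Er2
    return phi, phi * Er4 / (8.0 * Er2)

def objective(x):
    phi, k = bundle_props(x[0], abs(x[1]))
    return ((phi - phi_t) / phi_t) ** 2 + ((k - k_t) / k_t) ** 2

res = minimize(objective, x0=[-13.0, 0.5], method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-16, "maxiter": 2000})
mu_opt, sigma_opt = res.x[0], abs(res.x[1])
phi_fit, k_fit = bundle_props(mu_opt, sigma_opt)
```

The derivative-free Nelder-Mead method is convenient here because, as in the paper, the forward model need only be evaluated, not differentiated.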

  2. The Cartridge Theory: a description of the functioning of horizontal subsurface flow constructed wetlands for wastewater treatment, based on modelling results.

    PubMed

    Samsó, Roger; García, Joan

    2014-03-01

Despite the fact that horizontal subsurface flow constructed wetlands have been in operation for several decades now, there is still no clear understanding of some of their most basic internal functioning patterns. To fill this knowledge gap, in this paper we present what we call "The Cartridge Theory". This theory was derived from simulation results obtained with the BIO_PORE model and explains the functioning of urban wastewater treatment wetlands based on the interaction between bacterial communities and the accumulated solids leading to clogging. In this paper we start by discussing some changes applied to the biokinetic model implemented in BIO_PORE (CWM1) so that the growth of bacterial communities is consistent with a well-known population dynamics model. This discussion, combined with simulation results for a pilot wetland system, led to the introduction of "The Cartridge Theory", which states that the granular media of horizontal subsurface flow wetlands can be likened to a generic cartridge which is progressively consumed (clogged) with inert solids from inlet to outlet. Simulations also revealed that bacterial communities are unevenly distributed within the system and that their location is not static but changes over time, moving towards the outlet as a consequence of the progressive clogging of the granular media. According to these findings, the life-span of constructed wetlands corresponds to the time when bacterial communities have been pushed so far towards the outlet that their biomass is no longer sufficient to remove the desired proportion of the influent pollutants. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Stochastic foundations in nonlinear density-regulation growth

    NASA Astrophysics Data System (ADS)

    Méndez, Vicenç; Assaf, Michael; Horsthemke, Werner; Campos, Daniel

    2017-08-01

    In this work we construct individual-based models that give rise to the generalized logistic model at the mean-field deterministic level and that allow us to interpret the parameters of these models in terms of individual interactions. We also study the effect of internal fluctuations on the long-time dynamics for the different models that have been widely used in the literature, such as the theta-logistic and Savageau models. In particular, we determine the conditions for population extinction and calculate the mean time to extinction. If the population does not become extinct, we obtain analytical expressions for the population abundance distribution. Our theoretical results are based on WKB theory and the probability generating function formalism and are verified by numerical simulations.
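A minimal individual-based realization of this kind of model is a logistic birth-death process simulated exactly with the Gillespie algorithm; the rates below are one simple choice whose mean-field limit is the Verhulst equation (all parameter values are illustrative, not taken from the paper):

```python
import random

def gillespie_logistic(b=2.0, d=1.0, K=50, n0=10, t_end=200.0, seed=1):
    """Exact (Gillespie) simulation of a logistic birth-death process:
    per-capita birth rate b, per-capita death rate d + (b - d) * n / K,
    whose mean-field limit is the Verhulst equation
    dn/dt = (b - d) * n * (1 - n / K)."""
    rng = random.Random(seed)
    t, n, area = 0.0, n0, 0.0      # area accumulates the integral of n dt
    while t < t_end and n > 0:
        birth = b * n
        death = (d + (b - d) * n / K) * n
        total = birth + death
        dt = min(rng.expovariate(total), t_end - t)
        area += n * dt
        t += dt
        if t >= t_end:
            break
        n += 1 if rng.random() < birth / total else -1
    return area / t_end, n         # time-averaged and final population

avg_n, final_n = gillespie_logistic()
```

The quasi-stationary average hovers near the carrying capacity K; extinction, whose mean time the paper computes via WKB theory, occurs only through a rare large fluctuation and is exponentially slow in K.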

  4. 40 CFR Table 3 to Subpart Ffff of... - Model Rule-Operating Limits for Incinerators and Wet Scrubbers

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Emission Guidelines and Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before December 9, 2004 Pt. 60, Subpt. FFFF, Table 3 Table 3 to Subpart FFFF of Part 60...

  5. 40 CFR Table 3 to Subpart Ffff of... - Model Rule-Operating Limits for Incinerators and Wet Scrubbers

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Emission Guidelines and Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before December 9, 2004 Pt. 60, Subpt. FFFF, Table 3 Table 3 to Subpart FFFF of Part 60...

  6. 40 CFR Table 3 to Subpart Ffff of... - Model Rule-Operating Limits for Incinerators and Wet Scrubbers

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Emission Guidelines and Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before December 9, 2004 Pt. 60, Subpt. FFFF, Table 3 Table 3 to Subpart FFFF of Part 60...

  7. [Construction of the Time Management Scale and examination of the influence of time management on psychological stress response].

    PubMed

    Imura, Tomoya; Takamura, Masahiro; Okazaki, Yoshihiro; Tokunaga, Satoko

    2016-10-01

    We developed a scale to measure time management and assessed its reliability and validity. We then used this scale to examine the impact of time management on psychological stress response. In Study 1-1, we developed the scale and assessed its internal consistency and criterion-related validity. Findings from a factor analysis revealed three elements of time management, “time estimation,” “time utilization,” and “taking each moment as it comes.” In Study 1-2, we assessed the scale’s test-retest reliability. In Study 1-3, we assessed the validity of the constructed scale. The results indicate that the time management scale has good reliability and validity. In Study 2, we performed a covariance structural analysis to verify our model that hypothesized that time management influences perceived control of time and psychological stress response, and perceived control of time influences psychological stress response. The results showed that time estimation increases the perceived control of time, which in turn decreases stress response. However, we also found that taking each moment as it comes reduces perceived control of time, which in turn increases stress response.

  8. Key management and encryption under the bounded storage model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draelos, Timothy John; Neumann, William Douglas; Lanzone, Andrew J.

    2005-11-01

There are several engineering obstacles that need to be solved before key management and encryption under the bounded storage model can be realized. One of the critical obstacles hindering its adoption is the construction of a scheme that achieves reliable communication in the event that timing synchronization errors occur. One of the main accomplishments of this project was the development of a new scheme that solves this problem. We show in general that there exist message encoding techniques under the bounded storage model that provide an arbitrarily small probability of transmission error. We compute the maximum capacity of this channel using the unsynchronized key-expansion as side-channel information at the decoder and provide tight lower bounds for a particular class of key-expansion functions that are pseudo-invariant to timing errors. Using our results in combination with the encryption scheme of Dziembowski et al. [11], we can construct a scheme that solves the timing synchronization error problem. In addition to this work, we conducted a detailed case study of current and future storage technologies. We analyzed the cost, capacity, and storage data rate of various technologies so that precise security parameters can be developed for bounded storage encryption schemes. This will provide an invaluable tool for developing these schemes in practice.

  9. A psychosocial analysis of parents' decisions for limiting their young child's screen time: An examination of attitudes, social norms and roles, and control perceptions.

    PubMed

    Hamilton, Kyra; Spinks, Teagan; White, Katherine M; Kavanagh, David J; Walsh, Anne M

    2016-05-01

Preschool-aged children spend substantial amounts of time engaged in screen-based activities. As parents have considerable control over their child's health behaviours during the younger years, it is important to understand those influences that guide parents' decisions about their child's screen time behaviours. A prospective design with two waves of data collection, 1 week apart, was adopted. Parents (n = 207) completed a Theory of Planned Behaviour (TPB)-based questionnaire, with the addition of parental role construction (i.e., parents' expectations and beliefs of responsibility for their child's behaviour) and past behaviour. A number of underlying beliefs identified in a prior pilot study were also assessed. The model explained 77% (with past behaviour accounting for 5%) of the variance in intention and 50% (with past behaviour accounting for 3%) of the variance in parental decisions to limit child screen time. Attitude, subjective norms, perceived behavioural control, parental role construction, and past behaviour predicted intentions, and intentions and past behaviour predicted follow-up behaviour. Underlying screen time beliefs (e.g., increased parental distress, pressure from friends, inconvenience) were also identified as guiding parents' decisions. Results support the TPB and highlight the importance of beliefs for understanding parental decisions for children's screen time behaviours, as well as the addition of parental role construction. This formative research provides necessary depth of understanding of sedentary lifestyle behaviours in young children which can be adopted in future interventions to test the efficacy of the TPB mechanisms in changing parental behaviour for their child's health. What is already known on this subject? Identifying determinants of child screen time behaviour is vital to the health of young people. Social-cognitive and parental role constructions are key influences of parental decision-making. Little is known about the processes guiding parents' decisions to limit their child's screen time. What does this study add? Parental role construction and TPB social-cognitive factors influence parental decisions. The beliefs of parents for their child's behaviour were identified. A range of beliefs guide parents' decisions for their child's screen time viewing. © 2015 The British Psychological Society.

  10. Comparison Of The Global Analytic Models Of The Main Geomagnetic Field With The Stratospheric Balloon Magnetic Data

    NASA Astrophysics Data System (ADS)

    Tsvetkov, Yu.; Filippov, S.; Frunze, A.

    2013-12-01

    Three global analytical models of the main geomagnetic field constructed from satellite data are used: the IGRF model, the Daily Mean Spherical Harmonic Models (DMSHM), and the EMM/2010 model, together with scalar measurements of the geomagnetic field and its gradients obtained in stratospheric balloon gradient magnetic surveys at altitudes of ~30 km. At these altitudes the regional magnetic field is formed by all sources in the Earth's crust, which makes it possible to obtain, along lengthy survey routes, the most complete data on regional and long-wavelength magnetic anomalies. The DMSHM model is used when extracting magnetic anomalies to eliminate the secular variation down to the significant level of 0.2 nT. Such a model can be constructed within ±1 month of the stratospheric balloon surveys, on favorable days with magnetic activity up to Kp <20, which leads to an error in representing the main field of ±5 nT. This is possible if a satellite, for example Swarm, is operating during the period of the stratospheric balloon magnetic survey. The stratospheric balloon data show that the EMM/2010 model represents the field unsatisfactorily at an altitude of 30 km. Hence, a high-quality model of the constant (main and anomalous) magnetic field cannot be constructed using satellite and ground data alone. An improved model of the constant field, constructed from satellite and stratospheric balloon magnetic surveys and expanded up to degree and order m=n=720, will provide reliable data on the regional crustal magnetic field and hence on the deep magnetic structure of the Earth's crust. The use of gradient magnetic surveys aboard stratospheric balloons makes it possible to find places, spaced roughly 3000 km apart, in which there are no magnetic anomalies. In these places it is possible to check satellite magnetic models for the altitude range of 20-40 km, timed to stratospheric balloon magnetic surveys.

  11. Reduced rank models for travel time estimation of low order mode pulses.

    PubMed

    Chandrayadula, Tarun K; Wage, Kathleen E; Worcester, Peter F; Dzieciuch, Matthew A; Mercer, James A; Andrew, Rex K; Howe, Bruce M

    2013-10-01

    Mode travel time estimation in the presence of internal waves (IWs) is a challenging problem. IWs perturb the sound speed, which results in travel time wander and mode scattering. A standard approach to travel time estimation is to pulse compress the broadband signal, pick the peak of the compressed time series, and average the peak time over multiple receptions to reduce variance. The peak-picking approach implicitly assumes there is a single strong arrival and does not perform well when there are multiple arrivals due to scattering. This article presents a statistical model for the scattered mode arrivals and uses the model to design improved travel time estimators. The model is based on an Empirical Orthogonal Function (EOF) analysis of the mode time series. Range-dependent simulations and data from the Long-range Ocean Acoustic Propagation Experiment (LOAPEX) indicate that the modes are represented by a small number of EOFs. The reduced-rank EOF model is used to construct a travel time estimator based on the Matched Subspace Detector (MSD). Analysis of simulation and experimental data show that the MSDs are more robust to IW scattering than peak picking. The simulation analysis also highlights how IWs affect the mode excitation by the source.
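A sketch of the reduced-rank idea on synthetic data (the pulse shapes and noise levels below are invented stand-ins, not LOAPEX measurements): the EOFs come from the SVD of the demeaned ensemble, and a matched-subspace-style statistic measures how much of a reception's energy the low-rank EOF basis captures:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for an ensemble of mode arrivals: a common pulse whose
# shape varies along one dominant pattern (mimicking internal-wave
# scattering), plus white noise. Shapes and levels are illustrative.
t = np.linspace(-1.0, 1.0, 128)
base = np.exp(-(t / 0.2) ** 2)
pattern = t * base
X = np.array([base + rng.normal() * pattern + rng.normal(0.0, 0.02, t.size)
              for _ in range(200)])

# EOFs of the demeaned ensemble are the right singular vectors of X - mean.
mean = X.mean(axis=0)
_, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
eofs = Vt[:2]                                  # reduced-rank (rank-2) basis
explained = float(s[0] ** 2 / (s ** 2).sum())  # variance captured by EOF 1

def subspace_energy(y):
    """Matched-subspace-style statistic: fraction of a reception's demeaned
    energy lying in the EOF subspace (high for scattered-mode structure,
    near rank/dimension for white noise)."""
    d = y - mean
    c = eofs @ d
    return float(c @ c) / float(d @ d)

sig_stat = float(np.mean([subspace_energy(row) for row in X]))
noise_stat = float(np.mean([subspace_energy(mean + rng.normal(0.0, 0.02, t.size))
                            for _ in range(200)]))
```

The large gap between the signal and noise statistics is what makes a subspace detector more robust to scattering than picking a single compressed-pulse peak.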

  12. Factor Structure and Stability of Smoking-Related Health Beliefs in the National Lung Screening Trial

    PubMed Central

    Koblitz, Amber R.; Persoskie, Alexander; Ferrer, Rebecca A.; Klein, William M. P.; Dwyer, Laura A.; Park, Elyse R.

    2016-01-01

    Introduction: Absolute and comparative risk perceptions, worry, perceived severity, perceived benefits, and self-efficacy are important theoretical determinants of tobacco use, but no measures have been validated to ensure the discriminant validity as well as test-retest reliability of these measures in the tobacco context. The purpose of the current study is to examine the reliability and factor structure of a measure assessing smoking-related health cognitions and emotions in a national sample of current and former heavy smokers in the National Lung Screening Trial. Methods: A sub-study of the National Lung Screening Trial assessed current and former smokers’ (age 55–74; N = 4379) self-reported health cognitions and emotions at trial enrollment and at 12-month follow-up. Items were derived from the Health Belief Model and Self-Regulation Model. Results: An exploratory factor analysis of baseline responses revealed a five-factor structure for former smokers (risk perceptions, worry, perceived severity, perceived benefits, and self-efficacy) and a six-factor structure for current smokers, such that absolute risk and comparative risk perceptions emerged as separate factors. A confirmatory factor analysis of 12-month follow-up responses revealed a good fit for the five latent constructs for former smokers and six latent constructs for current smokers. Longitudinal stability of these constructs was also demonstrated. Conclusions: This is the first study to examine tobacco-related health cognition and emotional constructs over time in current and former heavy smokers undergoing lung screening. This study found that the theoretical constructs were stable across time and that the factor structure differed based on smoking status (current vs. former). PMID:25964503

  13. Simulation and Modeling of Positrons and Electrons in advanced Time-of-Flight Positron Annihilation Induced Auger Electron Spectroscopy Systems

    NASA Astrophysics Data System (ADS)

    Joglekar, Prasad; Shastry, Karthik; Satyal, Suman; Weiss, Alexander

    2011-10-01

Time of Flight Positron Annihilation Induced Auger Electron Spectroscopy (T-O-F PAES) is a highly surface-selective analytical technique in which elemental identification is accomplished through a measurement of the flight-time distributions of Auger electrons resulting from the annihilation of core electrons by positrons. SIMION charged-particle optics simulation software was used to model the trajectories of both the incident positrons and the outgoing electrons in our existing T-O-F PAES system, as well as in a new system currently under construction in our laboratory. The implications of these simulations regarding instrument design and performance are discussed.

  14. Wind energy system time-domain (WEST) analyzers

    NASA Technical Reports Server (NTRS)

    Dreier, M. E.; Hoffman, J. A.

    1981-01-01

    A portable analyzer which simulates in real time the complex nonlinear dynamics of horizontal axis wind energy systems was constructed. Math models for an aeroelastic rotor featuring nonlinear aerodynamic and inertial terms were implemented with high-speed digital controllers and analog computation. These models were combined with other math models of elastic supports, control systems, a power train, and gimballed rotor kinematics. A stroboscopic display system graphically depicts distributed blade loads, motion, and other aerodynamic functions on a cathode ray tube. Limited correlation efforts showed good agreement between the results of this analyzer and other sophisticated digital simulations; the digital simulation results were in turn successfully correlated with test data.

  15. Estimating procedure times for surgeries by determining location parameters for the lognormal model.

    PubMed

    Spangler, William E; Strum, David P; Vargas, Luis G; May, Jerrold H

    2004-05-01

    We present an empirical study of methods for estimating the location parameter of the lognormal distribution. Our results identify the best order statistic to use, and indicate that using the best order statistic instead of the median may lead to less frequent incorrect rejection of the lognormal model, more accurate critical value estimates, and higher goodness-of-fit. Using simulation data, we constructed and compared two models for identifying the best order statistic, one based on conventional nonlinear regression and the other using a data mining/machine learning technique. Better surgical procedure time estimates may lead to improved surgical operations.
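    The three-parameter (shifted) lognormal model discussed above can be illustrated with a short sketch. This uses SciPy's generic maximum-likelihood fit, which estimates the location (shift) parameter directly; it stands in for, and is not, the order-statistic methods the study compares. The data and the 15-minute floor are hypothetical.

```python
# Sketch: fitting a three-parameter lognormal to synthetic procedure times.
# The location parameter models a fixed minimum duration below which no
# procedure can finish (hypothetical 15-minute floor).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
true_loc = 15.0
durations = true_loc + rng.lognormal(mean=3.0, sigma=0.5, size=500)

# scipy.stats.lognorm.fit returns (shape, loc, scale) by maximum likelihood;
# loc is the lognormal's threshold/location parameter.
shape, loc, scale = stats.lognorm.fit(durations)

# The fitted location must lie at or below the smallest observation,
# since the distribution's support is (loc, inf).
print(shape, loc, scale)
```

Threshold estimation by direct MLE is known to be numerically delicate (the likelihood degenerates as loc approaches the sample minimum), which is one motivation for the order-statistic approaches studied in the paper.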

  16. Classical integrable defects as quasi Bäcklund transformations

    NASA Astrophysics Data System (ADS)

    Doikou, Anastasia

    2016-10-01

    We consider the algebraic setting of classical defects in discrete and continuous integrable theories. We derive the "equations of motion" on the defect point via the space-like and time-like description. We then exploit the structural similarity of these equations with the discrete and continuous Bäcklund transformations; although the equations are similar, they are not identical to the Bäcklund transformations. We also consider specific examples of integrable models to demonstrate our construction, i.e. the Toda chain and the sine-Gordon model. The equations of the time (space) evolution of the defect (discontinuity) degrees of freedom for these models are explicitly derived.

  17. Reservoir monitoring and characterization using satellite geodetic data: Interferometric Synthetic Aperture Radar observations from the Krechba field, Algeria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasco, D.W.; Ferretti, Alessandro; Novali, Fabrizio

    2008-05-01

    Deformation in the material overlying an active reservoir is used to monitor pressure change at depth. A sequence of pressure field estimates, eleven in all, allow us to construct a measure of diffusive travel time throughout the reservoir. The dense distribution of travel time values means that we can construct an exactly linear inverse problem for reservoir flow properties. Application to Interferometric Synthetic Aperture Radar (InSAR) data gathered over a CO2 injection in Algeria reveals pressure propagation along two northwest trending corridors. An inversion of the travel times indicates the existence of two northwest-trending high permeability zones. The high permeability features trend in the same direction as the regional fault and fracture zones. Model parameter resolution estimates indicate that the features are well resolved.
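    The "exactly linear inverse problem" step can be sketched schematically: each diffusive travel time is treated as a path integral of a cell-wise "diffusive slowness", yielding a linear system G m = t solvable by least squares. The geometry and numbers below are made up for illustration and are not from the Krechba study.

```python
# Schematic linear travel-time inversion: rows of G give the length of each
# path inside each grid cell (hypothetical 3 paths, 2 cells); m is the
# unknown slowness per cell; t holds observed travel times.
import numpy as np

G = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.7, 0.7]])
t_obs = np.array([2.0, 0.5, 1.75])   # arbitrary units

slowness, *_ = np.linalg.lstsq(G, t_obs, rcond=None)
print(slowness)   # a low-slowness cell corresponds to a high-permeability zone
```

In practice G is large and sparse and the system is regularized, but the structure — linear in the unknown flow properties once travel times are in hand — is the point of the abstract's claim.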

  18. A simplified building airflow model for agent concentration prediction.

    PubMed

    Jacques, David R; Smith, David A

    2010-11-01

    A simplified building airflow model is presented that can be used to predict the spread of a contaminant agent from a chemical or biological attack. If the dominant means of agent transport throughout the building is an air-handling system operating at steady-state, a linear time-invariant (LTI) model can be constructed to predict the concentration in any room of the building as a result of either an internal or external release. While the model does not capture weather-driven and other temperature-driven effects, it is suitable for concentration predictions under average daily conditions. The model is easily constructed using information that should be accessible to a building manager, supplemented with assumptions based on building codes and standard air-handling system design practices. The results of the model are compared with a popular multi-zone model for a simple building and are demonstrated for building examples containing one or more air-handling systems. The model can be used for rapid concentration prediction to support low-cost placement strategies for chemical and biological detection sensors.
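    The LTI structure described above can be sketched for a toy case: a two-zone building served by one air-handling system, with dc/dt = A c + b u, where c holds zone concentrations and u is an external release rate at the intake. The mixing matrix, air-change rate, and intake fraction below are invented for illustration, not taken from the paper's model.

```python
# Minimal two-zone LTI sketch (hypothetical parameters): each zone exchanges
# air with the other through the air-handling system and receives a share of
# an external agent release u through the intake.
import numpy as np

Q = 2.0  # assumed air changes per hour per zone
A = np.array([[-Q, 0.5 * Q],     # zone 1 loses air, gains recirculated zone-2 air
              [0.5 * Q, -Q]])    # and symmetrically for zone 2
b = np.array([0.5 * Q, 0.5 * Q])  # intake splits the external release evenly

def simulate(u, t_end=3.0, dt=0.001):
    """Forward-Euler integration of dc/dt = A c + b u from zero concentration."""
    c = np.zeros(2)
    for _ in range(int(t_end / dt)):
        c = c + dt * (A @ c + b * u)
    return c

print(simulate(u=1.0))  # both zones approach the same steady-state level
```

Because the system is linear and time-invariant, responses to arbitrary release histories follow by superposition, which is what makes rapid prediction for sensor-placement studies cheap.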

  19. Space Shuttle and Launch Pad Computational Fluid Dynamics Model for Lift-off Debris Transport Analysis

    NASA Technical Reports Server (NTRS)

    Dougherty, Sam; West, Jeff; Droege, Alan; Wilson, Josh; Liever, Peter; Slaby, Matthew

    2006-01-01

    This paper discusses the Space Shuttle lift-off CFD model developed for lift-off debris transport analysis in support of return to flight. The lift-off portion of the flight is defined as the time starting with tanking of propellants until tower clear, approximately T0+6 seconds, where interactions with the launch pad cease. A CFD model containing the Space Shuttle and launch pad geometry has been constructed and executed. Simplifications required in the construction of the model are presented and discussed. A body-fitted overset grid of up to 170 million grid points was developed, which allowed positioning of the vehicle relative to the launch pad over the first six seconds of climb-out. The CFD model works in conjunction with a debris particle transport model and a debris particle impact damage tolerance model. These models have been used to assess the Space Shuttle plumes, the wind environment, their interactions with each other and with the launch pad, and their ultimate effect on potential debris during lift-off.

  20. [Impacts of forest and precipitation on runoff and sediment in Tianshui watershed and GM models].

    PubMed

    Ouyang, H

    2000-12-01

    This paper analyzed the impacts of forest stand volume and precipitation on annual erosion modulus, mean sediment, maximum sediment, mean runoff, maximum runoff, minimum runoff, mean water level, maximum water level, and minimum water level in Tianshui watershed, and also analyzed the effect of the variation of forest stand volume on monthly mean runoff, minimum runoff, and mean water level. The dynamic models of grey system GM(1, N) were constructed to simulate the changes of these hydrological elements. The dynamic GM models on the impact of stand volumes of different forest types (Chinese fir, Masson pine, and broad-leaved forests) with different age classes (young, middle-aged, mature, and over-mature) and that of precipitation on the hydrological elements were also constructed, and their changes with time were analyzed.
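    The grey-system modeling idea can be illustrated with the simpler univariate GM(1,1) (the study uses GM(1,N), which adds driving series such as precipitation). The series values below are made up; the steps are the standard GM(1,1) recipe: accumulate the series, build the background sequence, fit the whitening equation by least squares, and difference the exponential solution back.

```python
# Illustrative GM(1,1) grey model fit and one-step-ahead forecast.
import numpy as np

def gm11(x0, steps_ahead=1):
    """Fit GM(1,1) to series x0; return fitted values plus forecasts."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                    # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])         # background (mean) sequence
    # Least squares for the whitening equation x0[k] + a*z1[k] = b
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + steps_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat, prepend=0.0)   # inverse AGO recovers the series

series = [100.0, 110.0, 121.0, 133.1]     # hypothetical runoff index
print(gm11(series, steps_ahead=1))
```

GM(1,1) effectively fits an exponential trend through the accumulated series, which is why grey models suit short, smooth hydrological records with few observations.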
