Science.gov

Sample records for queuing theory based

  1. Queuing Theory and Reference Transactions.

    ERIC Educational Resources Information Center

    Terbille, Charles

    1995-01-01

Examines the implications of applying queuing theory to three different reference situations: (1) random patron arrivals; (2) random durations of transactions; and (3) use of two librarians. Tables and figures represent results from spreadsheet calculations of queues for each reference situation. (JMV)
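The two-librarian case corresponds to an M/M/2 queue, whose spreadsheet calculations follow the standard Erlang C formulas. A minimal sketch (the arrival and service rates below are illustrative assumptions, not figures from the article):

```python
import math

def mmc_metrics(lam, mu, c):
    """Steady-state measures for an M/M/c queue: lam = arrival rate,
    mu = service rate per server, c = number of servers (needs lam < c*mu)."""
    a = lam / mu                    # offered load in Erlangs
    rho = a / c                     # per-server utilization
    assert rho < 1, "queue is unstable"
    # Erlang C: probability that an arriving patron must wait
    tail = a**c / (math.factorial(c) * (1 - rho))
    p_wait = tail / (sum(a**k / math.factorial(k) for k in range(c)) + tail)
    wq = p_wait / (c * mu - lam)    # mean wait in queue
    lq = lam * wq                   # mean queue length, by Little's law
    return p_wait, wq, lq

# Illustrative desk: patrons arrive 10/hour, each librarian serves 6/hour
p_wait, wq, lq = mmc_metrics(10.0, 6.0, 2)
```

With c=1 the same formula reduces to the familiar M/M/1 results, which makes the one- versus two-librarian comparison straightforward.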

  2. An application of queuing theory to waterfowl migration

    USGS Publications Warehouse

    Sojda, Richard S.; Cornely, John E.; Fredrickson, Leigh H.

    2002-01-01

    There has always been great interest in the migration of waterfowl and other birds. We have applied queuing theory to modelling waterfowl migration, beginning with a prototype system for the Rocky Mountain Population of trumpeter swans (Cygnus buccinator) in Western North America. The queuing model can be classified as a D/BB/28 system, and we describe the input sources, service mechanism, and network configuration of queues and servers. The intrinsic nature of queuing theory is to represent the spatial and temporal characteristics of entities and how they move, are placed in queues, and are serviced. The service mechanism in our system is an algorithm representing how swans move through the flyway based on seasonal life cycle events. The system uses an observed number of swans at each of 27 areas for a breeding season as input and simulates their distribution through four seasonal steps. The result is a simulated distribution of birds for the subsequent year's breeding season. The model was built as a multiagent system with one agent handling movement algorithms, with one facilitating user interface, and with one to seven agents representing specific geographic areas for which swan management interventions can be implemented. The many parallels in queuing model servers and service mechanisms with waterfowl management areas and annual life cycle events made the transfer of the theory to practical application straightforward.

  3. Queuing theory models for computer networks

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

A set of simple queuing theory models that can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. Because the models omit fine detail about network traffic rates, traffic patterns, and the hardware used to implement the networks, the impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed quickly. A sample use of the models applied to a realistic problem is included in appendix A, and appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
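The simplest model of this kind treats a channel as an M/M/1 queue, where mean response time depends only on the arrival rate and the effective service rate (channel capacity divided by mean message length). A sketch with illustrative rates, not figures from the report:

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean time in system (queueing plus transmission) for an M/M/1 channel,
    in the same time units as the rates."""
    assert arrival_rate < service_rate, "offered load exceeds channel capacity"
    return 1.0 / (service_rate - arrival_rate)

# Illustrative: a channel that drains 125 messages/s, offered 100 messages/s
t = mm1_response_time(100.0, 125.0)   # 0.04 s mean response
```

Sweeping `arrival_rate` in a spreadsheet column reproduces the characteristic blow-up in response time as utilization approaches 1.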

  4. Queuing Theory: An Alternative Approach to Educational Research.

    ERIC Educational Resources Information Center

    Huyvaert, Sarah H.

    Queuing theory is examined in this paper in order to determine if the theory could be applied in educational settings. It is defined as a form of operations research that uses mathematical formulas and/or computer simulation to study wait and congestion in a system and, through the study of these visible phenomena, to discover malfunctions within…

  5. Application of queuing theory in production-inventory optimization

    NASA Astrophysics Data System (ADS)

    Rashid, Reza; Hoseini, Seyed Farzad; Gholamian, M. R.; Feizabadi, Mohammad

    2015-07-01

    This paper presents a mathematical model for an inventory control system in which customers' demands and suppliers' service time are considered as stochastic parameters. The proposed problem is solved through queuing theory for a single item. In this case, transitional probabilities are calculated in steady state. Afterward, the model is extended to the case of multi-item inventory systems. Then, to deal with the complexity of this problem, a new heuristic algorithm is developed. Finally, the presented bi-level inventory-queuing model is implemented as a case study in Electroestil Company.
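The steady-state transition probabilities mentioned above can, for a birth-death style inventory-queue, be obtained from the detailed balance equations. This is a generic sketch under assumed rates, not the paper's actual model:

```python
def birth_death_steady_state(birth, death):
    """Stationary distribution of a finite birth-death chain.
    birth[i] is the rate from state i to i+1; death[i] from i+1 to i."""
    weights = [1.0]
    for b, d in zip(birth, death):
        weights.append(weights[-1] * b / d)   # detailed balance: pi_{i+1} = pi_i * b/d
    total = sum(weights)
    return [w / total for w in weights]

# Illustrative: demand rate 1, replenishment rate 2, inventory states 0..2
pi = birth_death_steady_state([1.0, 1.0], [2.0, 2.0])
```

The resulting probabilities feed directly into expected holding and shortage costs for the single-item case.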

  6. Application of queuing theory in inventory systems with substitution flexibility

    NASA Astrophysics Data System (ADS)

    Seyedhoseini, S. M.; Rashid, Reza; Kamalpour, Iman; Zangeneh, Erfan

    2015-01-01

Considering the competition in today's business environment, tactical planning of a supply chain has become more complex than before. In many multi-product inventory systems, substitution flexibility can improve profits. This paper aims to develop a comprehensive substitution inventory model, considering an inventory system with two substitutable products with negligible lead time and examining the effects of simultaneous ordering. Customer demands for both products are treated as stochastic parameters, and queuing theory is used to construct a mathematical model. The model has been coded in C++ and analyzed on a real example; the results indicate the efficiency of the proposed model.

  7. Using Queuing Theory and Simulation Model to Optimize Hospital Pharmacy Performance

    PubMed Central

    Bahadori, Mohammadkarim; Mohammadnejhad, Seyed Mohsen; Ravangard, Ramin; Teymourzadeh, Ehsan

    2014-01-01

Background: Hospital pharmacy is responsible for controlling and monitoring the medication use process and ensuring timely access to safe, effective and economical use of drugs and medicines for patients and hospital staff. Objectives: This study aimed to optimize the management of the studied outpatient pharmacy by developing a suitable queuing theory and simulation technique. Patients and Methods: A descriptive-analytical study was conducted in a military hospital in Tehran, Iran, in 2013. A sample of 220 patients referred to the outpatient pharmacy of the hospital in two shifts, morning and evening, was selected to collect the data needed to determine the arrival rate, service rate, and other inputs for calculating the patient flow and queuing network performance variables. After initial analysis of the collected data using the software SPSS 18, the pharmacy queuing network performance indicators were calculated for both shifts. Then, based on the collected data and to provide appropriate solutions, the queuing system of the current situation for both shifts was modeled and simulated using the software ARENA 12, and 4 scenarios were explored. Results: The queue characteristics of the studied pharmacy during the situation analysis were very undesirable in both morning and evening shifts. The average numbers of patients in the pharmacy were 19.21 and 14.66 in the morning and evening, respectively. The average times spent in the system by clients were 39 minutes in the morning and 35 minutes in the evening. The system utilization in the morning and evening was, respectively, 25% and 21%. The simulation results showed that reducing the staff in the morning from 2 to 1 at the prescription-receiving stage did not change the queue performance indicators. Adding one staff member at the prescription-filling stage decreased the average queue length by 10 persons and the average waiting time by 18 minutes and 14 seconds. In the evening, decreasing the staff from 2 to 1 at the prescription-delivery stage changed the queue performance indicators very little, while adding one staff member at the prescription-filling stage decreased the average queue length by 5 persons and the average waiting time by 8 minutes and 44 seconds. Conclusions: The patients' waiting times and the number of patients waiting for services in both shifts could be reduced by using multitasking staff and reallocating them to the time-consuming prescription-filling stage, guided by queuing theory and simulation techniques. PMID:24829791
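The reported figures can be cross-checked with Little's law, L = λW: dividing the average number in the system by the average time in the system recovers the implied arrival rate. Using the morning-shift numbers from the abstract:

```python
# Little's law: L = lam * W, so the implied arrival rate is lam = L / W
L_morning = 19.21                      # average number of patients in the pharmacy
W_morning = 39.0                       # average time in system, minutes
lam_morning = L_morning / W_morning    # implied arrivals per minute
arrivals_per_hour = 60 * lam_morning   # roughly 30 patients per hour
```

The same check on the evening figures (14.66 patients, 35 minutes) gives a slightly lower implied arrival rate, consistent with the lighter evening load.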

  8. Allocation of computer ports within a terminal switching network: an application of queuing theory to gandalf port contenders

    SciTech Connect

    Vahle, M.O.

    1982-03-01

    Queuing theory is applied to the problem of assigning computer ports within a terminal switching network to maximize the likelihood of instant connect. A brief background of the network is included to focus on the statement of the problem.

  9. Queuing theory to guide the implementation of a heart failure inpatient registry program.

    PubMed

    Zai, Adrian H; Farr, Kit M; Grant, Richard W; Mort, Elizabeth; Ferris, Timothy G; Chueh, Henry C

    2009-01-01

    OBJECTIVE The authors previously implemented an electronic heart failure registry at a large academic hospital to identify heart failure patients and to connect these patients with appropriate discharge services. Despite significant improvements in patient identification and connection rates, time to connection remained high, with an average delay of 3.2 days from the time patients were admitted to the time connections were made. Our objective for this current study was to determine the most effective solution to minimize time to connection. DESIGN We used a queuing theory model to simulate 3 different potential solutions to decrease the delay from patient identification to connection with discharge services. MEASUREMENTS The measures included average rate at which patients were being connected to the post discharge heart failure services program, average number of patients in line, and average patient waiting time. RESULTS Using queuing theory model simulations, we were able to estimate for our current system the minimum rate at which patients need to be connected (262 patients/mo), the ideal patient arrival rate (174 patients/mo) and the maximal patient arrival rate that could be achieved by adding 1 extra nurse (348 patients/mo). CONCLUSIONS Our modeling approach was instrumental in helping us characterize key process parameters and estimate the impact of adding staff on the time between identifying patients with heart failure and connecting them with appropriate discharge services. PMID:19390108
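The staffing estimates follow the usual stability reasoning: the system keeps up only if total connection capacity strictly exceeds the arrival rate. Assuming a per-nurse capacity of 174 patients/month (an inference from the reported 174 vs. 348 figures, not a number the paper states directly):

```python
import math

def min_servers(arrival_rate, per_server_rate):
    """Smallest number of servers c such that c * mu strictly exceeds lambda
    (the stability condition for a work-conserving queue)."""
    c = math.ceil(arrival_rate / per_server_rate)
    if c * per_server_rate <= arrival_rate:
        c += 1   # capacity must be strictly greater, not merely equal
    return c

# Illustrative: 262 patients/month must be connected, 174/month per nurse
nurses_needed = min_servers(262, 174)   # 2
```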

  10. Modeling panel detection frequencies by queuing system theory: an application in gas chromatography olfactometry.

    PubMed

    Bult, Johannes H F; van Putten, Bram; Schifferstein, Hendrik N J; Roozen, Jacques P; Voragen, Alphons G J; Kroeze, Jan H A

    2004-10-01

    In continuous vigilance tasks, the number of coincident panel responses to stimuli provides an index of stimulus detectability. To determine whether this number is due to chance, panel noise levels have been approximated by the maximum coincidence level obtained in stimulus-free conditions. This study proposes an alternative method by which to assess noise levels, derived from queuing system theory (QST). Instead of critical coincidence levels, QST modeling estimates the duration of coinciding responses in the absence of stimuli. The proposed method has the advantage over previous approaches that it yields more reliable noise estimates and allows for statistical testing. The method was applied in an olfactory detection experiment using 16 panelists in stimulus-present and stimulus-free conditions. We propose that QST may be used as an alternative to signal detection theory for analyzing data from continuous vigilance tasks. PMID:15751471

  11. Spreadsheet Analysis Of Queuing In A Computer Network

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1992-01-01

Method of analyzing responses of computer network based on simple queuing-theory mathematical models via spreadsheet program. Effects of variations in traffic, capacities of channels, and message protocols assessed.

  12. Modeling ad hoc network based on 802.11 DCF by queuing network analyzer

    NASA Astrophysics Data System (ADS)

    Zhang, Wenbin; Zhang, Zhongzhao

    2007-11-01

In this paper, we present an analytic model for evaluating the average end-to-end delay and per-node throughput in an IEEE 802.11 DCF-based wireless network. Using the queuing network analyzer (QNA) and probability generating functions, we model the ad hoc network as an open network of M/G/1 queues and obtain closed-form expressions for the average end-to-end delay. Simulations based on NS2 validate the accuracy of our model.
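For each M/G/1 node in such a network, the mean queueing delay is given by the Pollaczek-Khinchine formula, Wq = λE[S²]/(2(1−ρ)). A sketch with illustrative parameters, not values from the paper:

```python
def mg1_mean_wait(lam, es, es2):
    """Pollaczek-Khinchine mean wait in queue for an M/G/1 node:
    lam = arrival rate, es = E[S], es2 = E[S^2] of the service time."""
    rho = lam * es
    assert rho < 1, "node is overloaded"
    return lam * es2 / (2.0 * (1.0 - rho))

# Exponential service with mean 1 has E[S^2] = 2, recovering the M/M/1 result
wq = mg1_mean_wait(0.5, 1.0, 2.0)   # equals rho / (mu - lam) = 1.0
```

Summing per-node delays along a route gives the kind of end-to-end delay expression the abstract describes.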

  13. Effects of diversity and procrastination in priority queuing theory: The different power law regimes

    NASA Astrophysics Data System (ADS)

    Saichev, A.; Sornette, D.

    2010-01-01

Empirical analyses show that after the update of a browser, the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law ~1/t^α with 0 < α ≤ 1 over time scales of years. We present a simple model for this persistence phenomenon, framed within priority queuing theory, of a target task which has the lowest priority compared to all other tasks that flow on the computer of an individual. We identify a “time deficit” control parameter β and a bifurcation to a regime where there is a nonzero probability for the target task to never be completed. The distribution of waiting time T until the completion of the target task has the power law tail ~1/t^(1/2), resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into 1/t^α, with α ∈ (0.5, ∞), including the well-known case 1/t. We also study the effect of “procrastination,” defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This regime provides an explanation for even slower apparent decay and longer persistence.
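The first-passage mechanism can be illustrated with a small simulation: the lowest-priority target task completes only at the first instant the backlog of ordinary tasks (a birth-death process with arrival rate λ and service rate μ) empties. This is our own illustration of the mechanism, not the authors' code:

```python
import random

def target_completion_time(lam, mu, horizon=1e6, seed=1):
    """Time until the backlog of higher-priority tasks first hits zero,
    at which point the target task finally gets served. Returns None if
    the backlog never empties within the horizon (likely when lam >= mu)."""
    rng = random.Random(seed)
    t, backlog = 0.0, 1            # one ordinary task pending at the start
    while t < horizon:
        if backlog == 0:
            return t               # server idle: the target task runs
        t += rng.expovariate(lam + mu)
        if rng.random() < lam / (lam + mu):
            backlog += 1           # another higher-priority task arrives
        else:
            backlog -= 1           # one pending task completes
    return None
```

Averaging many runs with λ close to μ exhibits the heavy first-passage tail discussed in the abstract.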

  14. Effects of diversity and procrastination in priority queuing theory: the different power law regimes.

    PubMed

    Saichev, A; Sornette, D

    2010-01-01

Empirical analyses show that after the update of a browser, or the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law approximately 1/t(alpha) with 0<alpha< or =1 over time scales of years. We present a simple model for this persistence phenomenon, framed within priority queuing theory, of a target task which has the lowest priority compared to all other tasks that flow on the computer of an individual. We identify a "time deficit" control parameter beta and a bifurcation to a regime where there is a nonzero probability for the target task to never be completed. The distribution of waiting time T until the completion of the target task has the power law tail approximately 1/t(1/2), resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into 1/t(alpha), with alpha is an element of (0.5,infinity), including the well-known case 1/t. We also study the effect of "procrastination," defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This regime provides an explanation for even slower apparent decay and longer persistence. PMID:20365433

  15. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    SciTech Connect

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S.

    2015-03-10

Cloud computing is a promising technology to manage and improve the utilization of computing center resources and to deliver various computing and IT services. To save energy, there is no need to operate many servers under light loads, so they are switched off; on the other hand, some servers should be switched on in heavy-load cases to prevent very long delays. Thus, waiting times and system operating cost can be kept at an acceptable level by dynamically adding or removing servers. Significant server setup costs and activation times should also be taken into account. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows a number of performance measures to be estimated.
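The hysteresis idea can be sketched as a control rule with two thresholds: a server is switched on only when the queue exceeds the high threshold and off only when it drops below the low one, so momentary fluctuations cause no switching. The threshold values below are illustrative, not from the paper:

```python
def scale_servers(active, queue_len, up=8, down=2, min_active=1):
    """Threshold policy with hysteresis (up > down): scale out above `up`,
    scale in below `down`, otherwise hold the current server count."""
    if queue_len >= up:
        return active + 1
    if queue_len <= down and active > min_active:
        return active - 1
    return active

# A spike to queue length 10 adds a server; a mid-range queue of 5 changes nothing,
# which is exactly the insensitivity to transient load the paper motivates.
```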

  16. Improving queuing service at McDonald's

    NASA Astrophysics Data System (ADS)

    Koh, Hock Lye; Teh, Su Yean; Wong, Chin Keat; Lim, Hooi Kie; Migin, Melissa W.

    2014-07-01

Fast food restaurants are popular among price-sensitive youths and working adults who value the conducive environment and convenient services. McDonald's chains of restaurants promote their sales during lunch hours by offering package meals which are perceived to be inexpensive. These promotional lunch meals attract a good response, resulting in occasional long queues and inconvenient waiting times. A study is conducted to monitor the distribution of waiting time, queue length, and customer arrival and departure patterns at a McDonald's restaurant located in Kuala Lumpur. A customer survey is conducted to gauge customers' satisfaction regarding waiting time and queue length. An Android app named Que is developed to perform onsite queuing analysis and report key performance indices. The queuing theory in Que is based on the Poisson distribution. In this paper, Que is utilized to perform queuing analysis at this McDonald's restaurant with the aim of improving customer service, with particular reference to reducing queuing time and shortening queue length. Some results will be presented.
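The Poisson assumption behind Que can be sketched by generating arrival epochs from exponential inter-arrival gaps; comparing such synthetic streams against the observed arrival data is one way to validate the model. The rate below is an illustrative lunch-hour figure, not one measured in the study:

```python
import random

def poisson_arrivals(rate, t_end, seed=7):
    """Arrival times of a Poisson process with `rate` arrivals per minute
    on [0, t_end), built from exponential inter-arrival gaps."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t >= t_end:
            return times
        times.append(t)

# Illustrative lunch rush: about 2 customers per minute for 60 minutes
lunch = poisson_arrivals(2.0, 60.0)
```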

  17. Using Queuing Theory and Simulation Modelling to Reduce Waiting Times in An Iranian Emergency Department

    PubMed Central

    Haghighinejad, Hourvash Akbari; Kharazmi, Erfan; Hatam, Nahid; Yousefi, Sedigheh; Hesami, Seyed Ali; Danaei, Mina; Askarian, Mehrdad

    2016-01-01

Background: Hospital emergencies have an essential role in health care systems. In the last decade, developed countries have paid great attention to the overcrowding crisis in emergency departments. Simulation analysis of complex models whose conditions change over time is much more effective than analytical solutions, and the emergency department (ED) is one of the most complex models for analysis. This study aimed to determine the number of waiting patients and the waiting time for emergency department services in an Iranian hospital ED, and to propose scenarios to reduce its queue and waiting time. Methods: This is a cross-sectional study in which simulation software (Arena, version 14) was used. The input information was extracted from the hospital database as well as through sampling. The objective was to evaluate the response variables of waiting time, number waiting, and utilization of each server, and to test three scenarios to improve them. Results: Running the models for 30 days revealed that a total of 4088 patients left the ED after being served, and 1238 patients were still waiting in the queue for admission to the ED bed area at the end of the run (these patients actually received services beyond the defined capacity). In the first scenario, the number of beds had to be increased from 81 to 179 for the number waiting at the “bed area” server to become almost zero. The second scenario, which limited hospitalization time in the ED bed area to the third quartile of the serving time distribution, decreased the number waiting to 586 patients. Conclusion: Doubling the bed capacity in the emergency department, with a corresponding increase in other resources and capacity, can solve the problem. This includes bed capacity for both critically ill and less critically ill patients. Classifying ED internal sections by severity of illness instead of medical specialty is another solution. PMID:26793727

  18. Design and Implementation of High-Speed Input-Queued Switches Based on a Fair Scheduling Algorithm

    NASA Astrophysics Data System (ADS)

    Hu, Qingsheng; Zhao, Hua-An

To increase both the capacity and the processing speed of input-queued (IQ) switches, we proposed a fair scalable scheduling architecture (FSSA). By employing FSSA, comprised of several cascaded sub-schedulers, large-scale, high-performance switches or routers can be realized without the capacity limitation of a monolithic device. In this paper, we present a fair scheduling algorithm named FSSA_DI based on an improved FSSA in which a distributed iteration scheme is employed; the scheduler performance is improved and the processing time is reduced as well. Simulation results show that FSSA_DI achieves better performance on average delay and throughput under heavy loads compared to other existing algorithms. Moreover, a practical 64 × 64 FSSA using the FSSA_DI algorithm is implemented on four Xilinx Virtex-4 FPGAs. Measurement results show that the data rate of our solution can reach 800 Mbps, and the tradeoff between performance and hardware complexity is effectively balanced.

  19. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

    PubMed

    He, Xinhua; Hu, Wenfa

    2014-01-01

This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  20. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    PubMed Central

    He, Xinhua

    2014-01-01

This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  1. Queuing Up

    SciTech Connect

    Miller, Nicholas; Zavadil, Robert; Ellis, Abraham; Muljadi, Eduard; Camm, Ernst; Kirby, Brendan J

    2007-01-01

    The knowledge base of the electric power system engineering community continues to grow with installed capacity of wind generation in North America. While this process has certainly occurred at other times in the industry with other technologies, the relatively explosive growth, the compressed time frames from project conception to commissioning, and the unconventional characteristics of wind generation make this period in the industry somewhat unique. Large wind generation facilities are necessarily evolving to look more and more like conventional generating plants in terms of their ability to interact with the transmission network in a way that does not compromise performance or system reliability. Such an evolution has only been possible through the cumulative contributions of an ever-growing number of power system engineers who have delved into the unique technologies and technical challenges presented by wind generation. The industry is still only part of the way up the learning curve, however. Numerous technical challenges remain, and as has been found, each new wind generation facility has the potential to generate some new questions. With the IEEE PES expanding its presence and activities in this increasingly significant commercial arena, the prospects for staying "ahead of the curve" are brightened.

  2. Using queuing theory to analyse the government's 4-h completion time target in accident and emergency departments.

    PubMed

    Mayhew, L; Smith, D

    2008-03-01

This paper uses a queuing model to evaluate completion times in Accident and Emergency (A&E) departments in the light of the Government target of completing and discharging 98% of patients inside 4 h. It illustrates how flows through an A&E can be accurately represented as a queuing process, how outputs can be used to visualise and interpret the 4-h Government target in a simple way, and how the model can be used to assess the practical achievability of A&E targets in the future. The paper finds that A&E targets have resulted in significant improvements in completion times and thus address a major source of complaint by users of the National Health Service in the U.K. It suggests that whilst some of this improvement is attributable to better management, some is also due to the way some patients in A&E are designated and therefore counted through the system. It finds, for example, that the current target would not have been possible without some form of patient re-designation or re-labelling taking place. Further, it finds that the current target is so demanding that the integrity of reported performance is open to question. Related incentives and demand management issues resulting from the target are also briefly discussed. PMID:18390164
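In the simplest M/M/1 reading of an A&E stream, time in the department is exponentially distributed with rate μ − λ, which makes the 98%-within-4-hours target easy to express. A hedged sketch (the paper's model is richer than this):

```python
import math

def fraction_within(lam, mu, t):
    """M/M/1 FCFS: the sojourn time is Exp(mu - lam), so the fraction of
    patients completed within t hours is 1 - exp(-(mu - lam) * t)."""
    assert lam < mu, "department is overloaded"
    return 1.0 - math.exp(-(mu - lam) * t)

# To hit 98% within 4 h, the spare capacity mu - lam must be at least
# ln(50)/4, roughly 0.98 patients per hour
slack_needed = math.log(50) / 4.0
```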

  3. Human Factors of Queuing: A Library Circulation Model.

    ERIC Educational Resources Information Center

    Mansfield, Jerry W.

    1981-01-01

    Classical queuing theories and their accompanying service facilities totally disregard the human factors in the name of efficiency. As library managers we need to be more responsive to human needs in the design of service points and make every effort to minimize queuing and queue frustration. Five references are listed. (Author/RAA)

  4. A queuing model for road traffic simulation

    SciTech Connect

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-03-10

We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.

  5. Development of Markov Chain-Based Queuing Model and Wireless Infrastructure for EV to Smart Meter Communication in V2G

    NASA Astrophysics Data System (ADS)

    Santoshkumar; Udaykumar, R. Y.

    2015-04-01

    The electrical vehicles (EVs) can be connected to the grid for power transaction. The vehicle-to-grid (V2G) supports the grid requirements and helps in maintaining the load demands. The grid control center (GCC), aggregator and EV are three key entities in V2G communication. The GCC sends the information about power requirements to the aggregator. The aggregator after receiving the information from the GCC sends the information to the EVs. Based on the information, the interested EV owners participate in power transaction with the grid. The aggregator facilitates the EVs by providing the parking and charging slot. In this paper the queuing model for EVs connected to the grid and development of wireless infrastructure for the EV to Smart Meter communication is proposed. The queuing model is developed and simulated. The path loss models for WiMAX are analyzed and compared. Also, the physical layer of WiMAX protocol is modeled and simulated for the EV to Smart Meter communication in V2G.

  6. Quality and operations of portable X-ray examination procedures in the emergency room: queuing theory at work.

    PubMed

    Abujudeh, Hani; Vuong, Bill; Baker, Stephen R

    2005-07-01

The objective of this study was to evaluate the operation of the portable X-ray machine in relation to examinations ordered by the Emergency Department at the University of Medicine and Dentistry of New Jersey, as well as to identify any bottlenecks hindering the performance of the system. To do so, the activity of the portable X-ray machine was monitored from 8 June 2004 to 24 June 2004 and from 6 July 2004 to 12 July 2004, yielding 11 days of data and 116 individual X-ray examinations. During observation, times were noted at various checkpoints in the procedure. Using the data gathered, the average input, output, and processing times and their variance were calculated. In turn, these values were used to calculate the response times for the Ordering Phase (5.502 min), traveling (2.483 min), Examination Phase (4.453 min), returning (3.855 min), Order Processing Phase (2.962 min), and the Development Phase (3.437 min). These phases combined for a total of 22.721 min from the time the examination was ordered to the time the X-ray films were uploaded to the PACS computer network. Based on these calculations, the Ordering Phase was determined to be the single largest bottleneck in the portable X-ray system, and the Examination Phase the second largest, together accounting for 44% of the total response time. PMID:16133619
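The reported phase times can be checked arithmetically: summed, they come to about 22.69 min (the abstract quotes 22.721, presumably from unrounded data), and the two bottleneck phases account for roughly 44% of the total, as stated:

```python
# Mean response times per phase, in minutes, as reported in the abstract
phases = {
    "Ordering": 5.502, "Traveling": 2.483, "Examination": 4.453,
    "Returning": 3.855, "Order processing": 2.962, "Development": 3.437,
}
total = sum(phases.values())                 # about 22.69 min
bottleneck = phases["Ordering"] + phases["Examination"]
bottleneck_share = bottleneck / total        # about 0.44
```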

  7. Ambulance deployment with the hypercube queuing model.

    PubMed

    Larson, R C

    1982-01-01

    A computer-implemented mathematical model has been developed to assist planners in the spatial deployment and dispatching of ambulances. The model incorporates uncertainties in the arrival times, locations, and service requirements of patients, building on the branch of operations research known as queuing theory. Several system-performance measures are generated by the model, including mean neighborhood-specific response times, mean utilization of each ambulance, and statistical profiles of ambulance response patterns. This model has been implemented by the Department of Health and Hospitals of the City of Boston. PMID:7132820

  8. Capacity utilization study for aviation security cargo inspection queuing system

    NASA Astrophysics Data System (ADS)

    Allgood, Glenn O.; Olama, Mohammed M.; Lake, Joe E.; Brumback, Daryl

    2010-04-01

    In this paper, we conduct a performance evaluation study for an aviation security cargo inspection queuing system for material flow and accountability. The queuing model employed in our study is based on discrete-event simulation and processes various types of cargo simultaneously. Onsite measurements are collected in an airport facility to validate the queuing model. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered such as system capacity, residual capacity, throughput, capacity utilization, subscribed capacity utilization, resources capacity utilization, subscribed resources capacity utilization, and number of cargo pieces (or pallets) in the different queues. These metrics are performance indicators of the system's ability to service current needs and response capacity to additional requests. We studied and analyzed different scenarios by changing various model parameters such as number of pieces per pallet, number of TSA inspectors and ATS personnel, number of forklifts, number of explosives trace detection (ETD) and explosives detection system (EDS) inspection machines, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures should reduce the overall cost and shipping delays associated with new inspection requirements.

  9. Capacity Utilization Study for Aviation Security Cargo Inspection Queuing System

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E; Brumback, Daryl L

    2010-01-01

    In this paper, we conduct a performance evaluation study for an aviation security cargo inspection queuing system for material flow and accountability. The queuing model employed in our study is based on discrete-event simulation and processes various types of cargo simultaneously. Onsite measurements are collected in an airport facility to validate the queuing model. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered such as system capacity, residual capacity, throughput, capacity utilization, subscribed capacity utilization, resources capacity utilization, subscribed resources capacity utilization, and number of cargo pieces (or pallets) in the different queues. These metrics are performance indicators of the system's ability to service current needs and response capacity to additional requests. We studied and analyzed different scenarios by changing various model parameters such as number of pieces per pallet, number of TSA inspectors and ATS personnel, number of forklifts, number of explosives trace detection (ETD) and explosives detection system (EDS) inspection machines, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures should reduce the overall cost and shipping delays associated with new inspection requirements.

  10. Network Queuing System, Version 2.0

    NASA Technical Reports Server (NTRS)

    Walter, Howard; Bridges, Mike; Carver, Terrie; Kingsbury, Brent

    1993-01-01

    Network Queuing System (NQS) computer program is versatile batch- and device-queuing facility for single UNIX computer or group of computers in network. User invokes NQS collection of user-space programs to move batch and device jobs freely among different computers in network. Provides facilities for remote queuing, request routing, remote status, queue-status controls, batch-request resource quota limits, and remote output return. Revision of NQS provides for creation, deletion, addition, and setting of complexes aiding in limiting number of requests handled at one time. Also has improved device-oriented queues along with some revision of displays. Written in C language.

  11. Modeling and simulation of M/M/c queuing pharmacy system with adjustable parameters

    NASA Astrophysics Data System (ADS)

    Rashida, A. R.; Fadzli, Mohammad; Ibrahim, Safwati; Goh, Siti Rohana

    2016-02-01

    This paper studies discrete event simulation (DES) as a computer-based modelling approach that imitates a real pharmacy unit. M/M/c queuing theory is used to model and analyse the characteristics of the queuing system at the pharmacy unit of Hospital Tuanku Fauziah, Kangar in Perlis, Malaysia. The input of this model is based on statistical data collected over 20 working days in June 2014. Currently, patient waiting time at the pharmacy unit is more than 15 minutes. The actual operation of the pharmacy unit is a mixed queuing server with an M/M/2 queuing model, where the pharmacists serve as the servers. A DES approach and the ProModel simulation software are used to simulate the queuing model and to propose improvements to the queuing system at this pharmacy. Waiting time for each server is analysed, and Counters 3 and 4 are found to have the highest waiting times, 16.98 and 16.73 minutes respectively. Three scenarios (M/M/3, M/M/4 and M/M/5) are simulated, and waiting times for the actual and experimental queuing models are compared. The simulation results show that adding a server (pharmacist) reduces patient waiting time appreciably: almost 50% of the average patient waiting time is eliminated when one pharmacist is added. However, it is not necessary to fully utilize all counters, because even though M/M/4 and M/M/5 produce further reductions in patient waiting time, they are ineffective since Counter 5 is rarely used.
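 
    For the M/M/c part of such an analysis, the effect of adding a pharmacist can be previewed analytically with the Erlang C formula before running any simulation. The rates below are illustrative, not the hospital's measured values:

```python
from math import factorial

def erlang_c(c, a):
    """Probability an arrival must wait in M/M/c, offered load a = lam/mu < c."""
    s = sum(a ** k / factorial(k) for k in range(c))
    tail = a ** c / factorial(c) * c / (c - a)
    return tail / (s + tail)

def mean_queue_wait(lam, mu, c):
    """Mean time spent waiting (not in service) in an M/M/c queue."""
    return erlang_c(c, lam / mu) / (c * mu - lam)

lam, mu = 1.6, 1.0                   # illustrative arrival and per-server rates
wq2 = mean_queue_wait(lam, mu, 2)    # two pharmacists
wq3 = mean_queue_wait(lam, mu, 3)    # add a third
```

    With these numbers the mean queue wait drops from about 1.78 to about 0.20 time units when a third server is added, mirroring the kind of roughly-halved waiting times the simulation reports.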

  12. Application of queuing model in Dubai's busiest megaplex

    NASA Astrophysics Data System (ADS)

    Bhagchandani, Maneesha; Bajpai, Priti

    2013-09-01

    This paper provides a study and analysis of the extremely busy booking counters at a megaplex in Dubai using a queuing model and simulation. Dubai is an emirate in the UAE with a multicultural, majority foreign-born population. Cinema is one of the major forms of entertainment: there are more than 13 megaplexes, each with 3 to 22 screens, showing movies in English, Arabic, Hindi and other languages. It has been observed that during weekends megaplexes attract large crowds, resulting in long queues at the booking counters. One of the busiest megaplexes was selected for the study, and the queuing model was validated against real-time observations. The concepts of arrival rate, service rate, utilization rate, waiting time in the system and average number of people in the queue, using Little's Theorem and the M/M/s queuing model along with simulation software, have been used to suggest an empirical solution. The aim of the paper is twofold: to assess the present situation at the megaplex and to give recommendations to optimize the use of booking counters.
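 
    Little's Theorem, which the study relies on, ties together the three quantities a counter survey actually measures. A minimal sketch with assumed (not observed) numbers:

```python
# Little's theorem, L = lam * W, with illustrative (not observed) numbers.
lam = 2.0              # customers arriving per minute at the counters (assumed)
mu = 0.8               # customers served per minute per counter (assumed)
s = 3                  # open booking counters
rho = lam / (s * mu)   # per-counter utilization; must stay below 1 for stability
W = 4.5                # average time in the system, in minutes (assumed)
L = lam * W            # average number of customers present, by Little's theorem
```

    Any one of the three quantities follows from the other two, which is why the theorem is a convenient cross-check on counter observations.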

  13. Modeling patient flows using a queuing network with blocking.

    PubMed

    Koizumi, Naoru; Kuno, Eri; Smith, Tony E

    2005-02-01

    The downsizing and closing of state mental health institutions in Philadelphia in the 1990s led to the development of a continuum care network of residential-based services. Although the diversity of care settings increased, congestion in facilities caused many patients to unnecessarily spend extra days in intensive facilities. This study applies a queuing network system with blocking to analyze such congestion processes. "Blocking" denotes situations where patients are turned away from accommodations to which they are referred, and are thus forced to remain in their present facilities until space becomes available. Both mathematical and simulation results are presented and compared. Although queuing models have been used in numerous healthcare studies, the inclusion of blocking is still rare. We found that, in Philadelphia, the shortage of a particular type of facility may have created "upstream blocking". Thus removal of such facility-specific bottlenecks may be the most efficient way to reduce congestion in the system as a whole. PMID:15782512
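 
    The blocking mechanism can be illustrated with a small discrete-event simulation of a two-stage network: a patient who finishes the intensive stage holds the upstream bed until a downstream residential bed frees. All rates and capacities below are hypothetical, not Philadelphia's data:

```python
import heapq
import random

def simulate(lam=1.0, mu1=1.2, mu2=0.7, c1=4, c2=2, horizon=10_000, seed=1):
    """Two-stage tandem with blocking (hypothetical rates and capacities):
    a patient finishing stage 1 holds the stage-1 bed until a stage-2 bed
    becomes free."""
    rng = random.Random(seed)
    events = [(rng.expovariate(lam), "arrival")]
    wait1 = 0                # queued for stage-1 admission
    busy1 = busy2 = 0        # beds currently in service
    blocked = 0              # finished stage 1, holding a bed, waiting
    blocked_area = 0.0       # time integral of `blocked`
    last = 0.0
    served = 0

    def admit_if_possible():
        nonlocal wait1, busy1
        if wait1 and busy1 + blocked < c1:
            wait1 -= 1
            busy1 += 1
            heapq.heappush(events, (t + rng.expovariate(mu1), "done1"))

    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        blocked_area += blocked * (t - last)
        last = t
        if kind == "arrival":
            heapq.heappush(events, (t + rng.expovariate(lam), "arrival"))
            if busy1 + blocked < c1:
                busy1 += 1
                heapq.heappush(events, (t + rng.expovariate(mu1), "done1"))
            else:
                wait1 += 1
        elif kind == "done1":
            busy1 -= 1
            if busy2 < c2:
                busy2 += 1
                heapq.heappush(events, (t + rng.expovariate(mu2), "done2"))
                admit_if_possible()      # stage-1 bed actually freed
            else:
                blocked += 1             # bed still held upstream
        else:                            # "done2"
            busy2 -= 1
            served += 1
            if blocked:
                blocked -= 1             # blocked patient moves downstream
                busy2 += 1
                heapq.heappush(events, (t + rng.expovariate(mu2), "done2"))
                admit_if_possible()      # their stage-1 bed is now free
    return blocked_area / last, served

mean_blocked, served = simulate()
```

    `mean_blocked` is the time-average number of upstream beds held by patients who are clinically done with stage 1, i.e. the "extra days in intensive facilities" the abstract describes.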

  14. Queuing Models of Tertiary Storage

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1996-01-01

    Large scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observation System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSL's) for near-line access. A characteristic of RSL's is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSL's can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well defined interface. The source code, accompanying documentation, and sample Java applets are available at: http://www.cis.ufl.edu/ted/

  15. Priority Queuing On A Parallel Data Bus

    NASA Technical Reports Server (NTRS)

    Wallis, D. E.

    1985-01-01

    Queuing strategy for communications along shared data bus minimizes number of data lines while always assuring user of highest priority given access to bus. New system handles up to 32 user demands on 17 data lines that previously serviced only 17 demands.

  16. Is Your Queuing System ADA-Compliant?

    ERIC Educational Resources Information Center

    Lawrence, David

    2002-01-01

    Discusses the Americans with Disabilities (ADA) and Uniform Federal Accessibility Standards (UFAS) regulations regarding public facilities' crowd control stanchions and queuing systems. The major elements are protruding objects and wheelchair accessibility. Describes how to maintain compliance with the regulations and offers a list of additional…

  17. Modeling Patient Flows Using a Queuing Network with Blocking

    PubMed Central

    KUNO, ERI; SMITH, TONY E.

    2015-01-01

    The downsizing and closing of state mental health institutions in Philadelphia in the 1990s led to the development of a continuum care network of residential-based services. Although the diversity of care settings increased, congestion in facilities caused many patients to unnecessarily spend extra days in intensive facilities. This study applies a queuing network system with blocking to analyze such congestion processes. “Blocking” denotes situations where patients are turned away from accommodations to which they are referred, and are thus forced to remain in their present facilities until space becomes available. Both mathematical and simulation results are presented and compared. Although queuing models have been used in numerous healthcare studies, the inclusion of blocking is still rare. We found that, in Philadelphia, the shortage of a particular type of facility may have created “upstream blocking”. Thus removal of such facility-specific bottlenecks may be the most efficient way to reduce congestion in the system as a whole. PMID:15782512

  18. Improving hospital bed occupancy and resource utilization through queuing modeling and evolutionary computation.

    PubMed

    Belciug, Smaranda; Gorunescu, Florin

    2015-02-01

    Scarce healthcare resources require carefully made policies ensuring optimal bed allocation, quality healthcare service, and adequate financial support. This paper proposes a complex analysis of resource allocation in a hospital department by integrating in the same framework a queuing system, a compartmental model, and an evolutionary-based optimization. The queuing system shapes the flow of patients through the hospital, the compartmental model offers a feasible structure of the hospital department in accordance with the queuing characteristics, and the evolutionary paradigm provides the means to optimize bed-occupancy management and resource utilization using a genetic algorithm approach. The paper also presents a "what-if analysis" providing a flexible tool to explore the effects on the outcomes of the queuing system and resource utilization through systematic changes in the input parameters. The methodology was illustrated using a simulation based on real data collected from a geriatric department of a hospital in London, UK. In addition, the paper explores the possibility of adapting the methodology to different medical departments (surgery, stroke, and mental illness), and it focuses on the practical use of the model from the healthcare point of view by presenting a simulated application. PMID:25433363

  19. Theory-Based Stakeholder Evaluation

    ERIC Educational Resources Information Center

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  20. Some queuing network models of computer systems

    NASA Technical Reports Server (NTRS)

    Herndon, E. S.

    1980-01-01

    Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.
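 
    The exact MVA recurrence that such calculator programs implement fits in a few lines; the device demands below are illustrative, and terminal think time is omitted:

```python
def mva(demands, N):
    """Exact MVA for a closed, single-class queuing network (no think time).
    demands[k] is the total service demand at device k; N is the job count."""
    Q = [0.0] * len(demands)
    X = 0.0
    for n in range(1, N + 1):
        R = [d * (1 + q) for d, q in zip(demands, Q)]   # residence time per device
        X = n / sum(R)                                  # system throughput
        Q = [X * r for r in R]                          # queue length per device
    return X, Q

# Illustrative demands (seconds per job at a CPU and two disks), 20 jobs.
X, Q = mva([0.10, 0.05, 0.02], N=20)
```

    Each iteration only needs the previous queue lengths, which is why the algorithm suits even a programmable calculator: storage is one vector per device, regardless of population size.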

  1. Case study: applying management policies to manage distributed queuing systems

    NASA Astrophysics Data System (ADS)

    Neumair, Bernhard; Wies, René

    1996-06-01

    The increasing deployment of workstations and high performance endsystems in addition to the operation of mainframe computers leads to a situation where many companies can no longer afford for their expensive workstations to run idle for long hours during the night or with little load during daytime. Distributed queuing systems and batch systems (DQSs) provide an efficient basis to make use of these unexploited resources and allow corporations to replace expensive supercomputers with clustered workstations running DQSs. To employ these innovative DQSs on a large scale, the management policies for scheduling jobs, configuring queues, etc must be integrated in the overall management process for the IT infrastructure. For this purpose, the concepts of application management and management policies are introduced and discussed. The definition, automatic transformation, and implementation of policies on management platforms to effectively manage DQSs will show that policy-based application management is already possible using the existing management functionality found in today's systems.

  2. Agent-Based Literacy Theory

    ERIC Educational Resources Information Center

    McEneaney, John E.

    2006-01-01

    The purpose of this theoretical essay is to explore the limits of traditional conceptualizations of reader and text and to propose a more general theory based on the concept of a literacy agent. The proposed theoretical perspective subsumes concepts from traditional theory and aims to account for literacy online. The agent-based literacy theory…

  3. Queuing network approach for building evacuation planning

    NASA Astrophysics Data System (ADS)

    Ishak, Nurhanis; Khalid, Ruzelan; Baten, Md. Azizul; Nawawi, Mohd. Kamal Mohd.

    2014-12-01

    The complex behavior of pedestrians in a limited space layout can explicitly be modeled using an M/G/C/C state-dependent queuing network. This paper implements the approach to study pedestrian flows through various corridors in a topological network. The best arrival rates and their impacts on the corridors' performance in terms of throughput, blocking probability, expected number of occupants in the system and expected travel time were first measured using the M/G/C/C analytical model. These best arrival rates were then fed into a Network Flow Programming model to find the arrival rates to source corridors and the routes that optimize the network's total throughput. The analytical results were then validated using a simulation model. The results of this study can be used to support current Standard Operating Procedures (SOP) to efficiently and safely evacuate people in emergencies.
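 
    The corridor-level M/G/C/C calculation reduces to a birth-death normalization. The sketch below assumes a simple linear walking-speed slowdown f(n), which is a placeholder rather than the calibrated pedestrian-speed model used in such studies:

```python
from math import factorial, prod

def mgcc_probs(lam, T1, C, f):
    """Occupancy distribution for an M/G/C/C state-dependent corridor.
    lam: arrival rate, T1: lone-pedestrian traversal time, C: capacity,
    f(n): relative walking speed with n occupants (f(1) == 1)."""
    w = [1.0] + [
        (lam * T1) ** n / (factorial(n) * prod(f(i) for i in range(1, n + 1)))
        for n in range(1, C + 1)
    ]
    z = sum(w)
    return [x / z for x in w]

C = 30
slowdown = lambda n: (C - n + 1) / C       # assumed linear speed decay
p = mgcc_probs(lam=1.0, T1=5.0, C=C, f=slowdown)
p_block = p[C]                             # arriving pedestrian is turned away
throughput = 1.0 * (1 - p_block)
expected_occupancy = sum(n * pn for n, pn in enumerate(p))
```

    These are the same three corridor metrics the study feeds into its network-level optimization: throughput, blocking probability, and expected occupancy.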

  4. The Queued Service Observing Project at CFHT

    NASA Astrophysics Data System (ADS)

    Martin, Pierre; Savalle, Renaud; Vermeulen, Tom; Shapiro, Joshua N.

    2002-12-01

    In order to maximize the scientific productivity of the CFH12K mosaic wide-field imager (and soon MegaCam), the Queued Service Observing (QSO) mode was implemented at the Canada-France-Hawaii Telescope at the beginning of 2001. The QSO system consists of an ensemble of software components allowing for the submission of programs, the preparation of queues, and finally the execution and evaluation of observations. The QSO project is part of a broader system known as the New Observing Process (NOP). This system includes data acquisition, data reduction and analysis through a pipeline named Elixir, and a data archiving and distribution component (DADS). In this paper, we review several technical and operational aspects of the QSO project. In particular, we present our strategy, technical architecture, program submission system, and the tools developed for the preparation and execution of the queues. Our successful experience of over 150 nights of QSO operations is also discussed along with the future plans for queue observing with MegaCam and other instruments at CFHT.

  5. NQS - NETWORK QUEUING SYSTEM, VERSION 2.0 (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Walter, H.

    1994-01-01

    The Network Queuing System, NQS, is a versatile batch and device queuing facility for a single Unix computer or a group of networked computers. With the Unix operating system as a common interface, the user can invoke the NQS collection of user-space programs to move batch and device jobs freely around the different computer hardware tied into the network. NQS provides facilities for remote queuing, request routing, remote status, queue status controls, batch request resource quota limits, and remote output return. This program was developed as part of an effort aimed at tying together diverse UNIX based machines into NASA's Numerical Aerodynamic Simulator Processing System Network. This revision of NQS allows for creating, deleting, adding and setting of complexes that aid in limiting the number of requests to be handled at one time. It also has improved device-oriented queues along with some revision of the displays. NQS was designed to meet the following goals: 1) Provide for the full support of both batch and device requests. 2) Support all of the resource quotas enforceable by the underlying UNIX kernel implementation that are relevant to any particular batch request and its corresponding batch queue. 3) Support remote queuing and routing of batch and device requests throughout the NQS network. 4) Support queue access restrictions through user and group access lists for all queues. 5) Enable networked output return of both output and error files to possibly remote machines. 6) Allow mapping of accounts across machine boundaries. 7) Provide friendly configuration and modification mechanisms for each installation. 8) Support status operations across the network, without requiring a user to log in on remote target machines. 9) Provide for file staging or copying of files for movement to the actual execution machine. To support batch and device requests, NQS v.2 implements three queue types--batch, device and pipe. 
Batch queues hold and prioritize batch requests; device queues hold and prioritize device requests; pipe queues transport both batch and device requests to other batch, device, or pipe queues at local or remote machines. Unique to batch queues are resource quota limits that restrict the amounts of different resources that a batch request can consume during execution. Unique to each device queue is a set of one or more devices, such as a line printer, to which requests can be sent for execution. Pipe queues have associated destinations to which they route and deliver requests. If the proper destination machine is down or unreachable, pipe queues are able to requeue the request and deliver it later when the destination is available. All NQS network conversations are performed using the Berkeley socket mechanism as ported into the respective vendor kernels. NQS is written in C language. The generic UNIX version (ARC-13179) has been successfully implemented on a variety of UNIX platforms, including Sun3 and Sun4 series computers, SGI IRIS computers running IRIX 3.3, DEC computers running ULTRIX 4.1, AMDAHL computers running UTS 1.3 and 2.1, platforms running BSD 4.3 UNIX. The IBM RS/6000 AIX version (COS-10042) is a vendor port. NQS 2.0 will also communicate with the Cray Research, Inc. and Convex, Inc. versions of NQS. The standard distribution medium for either machine version of NQS 2.0 is a 60Mb, QIC-24, .25 inch streaming magnetic tape cartridge in UNIX tar format. Upon request the generic UNIX version (ARC-13179) can be provided in UNIX tar format on alternate media. Please contact COSMIC to discuss the availability and cost of media to meet your specific needs. An electronic copy of the NQS 2.0 documentation is included on the program media. NQS 2.0 was released in 1991. The IBM RS/6000 port of NQS was developed in 1992. IRIX is a trademark of Silicon Graphics Inc. IRIS is a registered trademark of Silicon Graphics Inc. 
UNIX is a registered trademark of UNIX System Laboratories Inc. Sun3 and Sun4 are trademarks of Sun Microsystems Inc. DEC and ULTRIX are trademarks of Digital Equipment Corporation.

  6. A queueing theory based model for business continuity in hospitals.

    PubMed

    Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R

    2013-01-01

    Clinical activities can be seen as the result of a precisely defined succession of events, in which every phase is characterized by a waiting time that includes working duration and possible delay, and technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough; a risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate possible interventions and to protect the whole system from technology failures. This paper reports a case study on the application of the proposed integrated model, combining a risk analysis approach with a queuing theory model, for defining the proper number of devices essential to guarantee medical activity and to comply with business continuity management requirements in hospitals. PMID:24109839

  7. An application of a queuing model for sea states

    NASA Astrophysics Data System (ADS)

    Loffredo, L.; Monbaliu, J.; Anderson, C.

    2012-04-01

    Unimodal approaches in design practice have shown inconsistencies in terms of directionality and limitations for accurate sea state description. Spectral multimodality needs to be included in the description of the wave climate: it can provide information about the coexistence of different wave systems originating from different meteorological events, such as locally generated wind waves and swell systems from distant storms. A 20-year dataset (1989-2008) for a location in the North Sea (K13, 53.2°N 3.2°E) was retrieved from the ECMWF ERA-Interim re-analysis data archive, providing a consistent and homogeneous dataset. The work focuses on the joint and conditional probability distributions of wind sea and swell systems. For marine operations and design applications, critical combinations of wave systems may exist. We define a critical sea state on the basis of a set of thresholds, which are not necessarily extreme; the emphasis is on dangerous combinations of different wave systems for certain operations (e.g. small-vessel navigation, dredging). The distribution of non-operability windows is described by a point process model with random and independent events, whose occurrences and lengths can be described only probabilistically. These characteristics allow the emerging patterns to be treated as part of a queuing system. According to this theory, generally adopted for applications including traffic flows and waiting lines, the input process describes the sequence of requests for a service, and the service mechanism the length of time that these requests will occupy the facilities. For weather-driven processes at sea, an alternating renewal process appears to be a suitable model. It consists of a sequence of critical events (periods of inoperability), each of random duration, separated by calms, also of random duration. Inoperability periods and calms are assumed independent, and no more than one critical event can occur at the same time. The analysis is carried out taking into account the thresholds' selection and the seasonality.
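 
    The alternating renewal structure is easy to check by simulation: with independent calm and inoperability spells, the long-run inoperable fraction converges to E[down] / (E[up] + E[down]). The exponential spell lengths and their means below are assumptions for illustration, not fitted to the K13 data:

```python
import random

def inoperable_fraction(mean_up, mean_down, cycles, seed=7):
    """Simulate an alternating renewal process with exponential spells and
    return the long-run fraction of time the sea state is critical."""
    rng = random.Random(seed)
    up = sum(rng.expovariate(1 / mean_up) for _ in range(cycles))
    down = sum(rng.expovariate(1 / mean_down) for _ in range(cycles))
    return down / (up + down)

# Illustrative means (hours): long calms, shorter critical spells.
frac = inoperable_fraction(mean_up=100.0, mean_down=12.0, cycles=20_000)
theory = 12.0 / (100.0 + 12.0)             # E[down] / (E[up] + E[down])
```

    The same ratio holds for any spell-length distributions with finite means, which is what makes the alternating renewal model attractive when the spell distributions are only known empirically.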

  8. Time-varying priority queuing models for human dynamics

    NASA Astrophysics Data System (ADS)

    Jo, Hang-Hyun; Pan, Raj Kumar; Kaski, Kimmo

    2012-06-01

    Queuing models provide insight into the temporal inhomogeneity of human dynamics, characterized by the broad distribution of waiting times of individuals performing tasks. We theoretically study the queuing model of an agent trying to execute a task of interest, the priority of which may vary with time due to the agent's “state of mind.” However, its execution is disrupted by other tasks of random priorities. By considering the priority of the task of interest either decreasing or increasing algebraically in time, we analytically obtain and numerically confirm the bimodal and unimodal waiting time distributions with power-law decaying tails, respectively. These results are also compared to the updating time distribution of papers in arXiv.org and the processing time distribution of papers in Physical Review journals. Our analysis helps to understand human task execution in a more realistic scenario.

  9. Using multi-class queuing network to solve performance models of e-business sites.

    PubMed

    Zheng, Xiao-ying; Chen, De-ren

    2004-01-01

    Because e-business serves a variety of customers with different navigational patterns and demands, a multi-class queuing network is a natural performance model for it. The open multi-class queuing network (QN) models are based on the assumption that no service center is saturated as a result of the combined loads of all the classes. Several formulas are used to calculate performance measures, including throughput, residence time, queue length, response time and the average number of requests. The solution technique for closed multi-class QN models is an approximate mean value analysis (MVA) algorithm based on three key equations, because the exact algorithm has huge time and space requirements. Since mixed multi-class QN models include both open and closed classes, the open classes should be eliminated to create a closed multi-class QN so that the closed-model algorithm can be applied. Corresponding examples show how to apply the algorithms mentioned in this article; they indicate that a multi-class QN is a reasonably accurate model of e-business and can be solved efficiently. PMID:14663849
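 
    For the open-class case, the standard formulas can be sketched directly; the arrival rates and service demands below are made up for illustration:

```python
def open_multiclass(lam, D):
    """Open multi-class QN: lam[c] is the class-c arrival rate and D[c][k]
    the class-c service demand at device k. No center may be saturated."""
    C, K = len(D), len(D[0])
    U = [sum(lam[c] * D[c][k] for c in range(C)) for k in range(K)]
    assert all(u < 1 for u in U), "a service center is saturated"
    R = [[D[c][k] / (1 - U[k]) for k in range(K)] for c in range(C)]
    response = [sum(R[c]) for c in range(C)]                         # per class
    Q = [sum(lam[c] * R[c][k] for c in range(C)) for k in range(K)]  # Little's law
    return U, response, Q

# Two illustrative classes (say, "browsers" and "buyers") over two devices.
U, response, Q = open_multiclass([0.5, 0.2], [[0.3, 0.2], [0.4, 0.6]])
```

    The saturation assertion mirrors the assumption stated in the abstract: the open-model formulas are only valid while every center's combined utilization stays below one.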

  10. Modified weighted fair queuing for packet scheduling in mobile WiMAX networks

    NASA Astrophysics Data System (ADS)

    Satrya, Gandeva B.; Brotoharsono, Tri

    2013-03-01

    The increase in user mobility and the need for data access anytime have increased interest in broadband wireless access (BWA). IEEE 802.16e aims to assure the best available quality of experience for mobile data service users. The main problem in assuring a high QoS level is how to allocate available resources among users in order to meet QoS requirements for criteria such as delay, throughput, packet loss and fairness. The IEEE standards do not specify a scheduling mechanism, leaving it open for implementer differentiation. There are five QoS service classes defined by IEEE 802.16: Unsolicited Grant Service (UGS), Extended Real Time Polling Service (ertPS), Real Time Polling Service (rtPS), Non Real Time Polling Service (nrtPS) and Best Effort Service (BE). Each class has different QoS parameter requirements for throughput and delay/jitter constraints. This paper proposes a Modified Weighted Fair Queuing (MWFQ) scheduling scenario based on Weighted Round Robin (WRR) and Weighted Fair Queuing (WFQ). The performance of MWFQ was assessed using the QoS criteria above. The simulation shows that using the concept of total packet size calculation improves the network's performance.
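 
    The WFQ half of such a scheduler can be sketched by tagging each packet with a virtual finish time and serving tags in ascending order. This is a simplified, flow-local virtual clock (real WFQ maintains a system-wide virtual time), and the class names and packet sizes are illustrative:

```python
def wfq_order(flows):
    """flows: {name: (weight, [packet sizes in bytes])}.
    Tags each packet with a virtual finish time F = F_prev + size / weight
    and returns the per-packet service order (smallest tag first)."""
    tagged = []
    for name, (weight, sizes) in flows.items():
        finish = 0.0
        for i, size in enumerate(sizes):
            finish += size / weight          # heavier weight -> earlier tags
            tagged.append((finish, i, name))
    return [name for _, _, name in sorted(tagged)]

# Illustrative: a weight-4 real-time flow versus a weight-1 best-effort flow.
order = wfq_order({"ertPS": (4, [100, 100]), "BE": (1, [100, 100])})
```

    With a 4:1 weight ratio, both real-time packets are served before either best-effort packet, which is the bandwidth-sharing behavior the class weights are meant to enforce.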

  11. Performance evaluation for an optical hybrid switch with circuit queued reservations.

    PubMed

    Wong, Eric; Zukerman, Moshe

    2005-11-14

    We provide here a new loss model for an optical hybrid switch that can function as an optical burst switch, an optical circuit switch, or both simultaneously. We introduce the feature of circuit queued reservations: if a circuit request arrives and cannot find a free wavelength, and if there are not too many requests already queued for reservations, it may join a queue and wait until such a wavelength becomes available. We first present an analysis based on a three-dimensional state-space Markov chain that provides exact results for the blocking probabilities of bursts and circuits. We also provide results for the proportion of circuits that are delayed and the mean delay of the circuits that are delayed. Because it is difficult to exactly compute the blocking probability in realistic scenarios with a large number of wavelengths, we derive computationally scalable and accurate approximations based on reducing the three-dimensional state space to a single dimension. These scalable approximations, which can produce performance results in a fraction of a second, can readily enable switch dimensioning. PMID:19503147
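 
    Restricted to a single circuit class, the queued-reservation idea is essentially an M/M/c/K birth-death model: c wavelengths plus up to K-c queued reservation requests. The sketch below uses hypothetical rates and is a far simpler model than the paper's three-dimensional chain:

```python
from math import factorial

def mmck(lam, mu, c, K):
    """M/M/c/K birth-death: c wavelengths, up to K-c queued reservations."""
    a = lam / mu
    w = []
    for n in range(K + 1):
        if n <= c:
            w.append(a ** n / factorial(n))
        else:
            w.append(a ** n / (factorial(c) * c ** (n - c)))
    z = sum(w)
    p = [x / z for x in w]
    p_block = p[K]                  # request arrives to a full reservation queue
    p_delayed = sum(p[c:K])         # admitted, but must wait for a wavelength
    return p, p_block, p_delayed

# Hypothetical: 10 wavelengths, room for 4 queued reservations, offered load 8.
p, p_block, p_delayed = mmck(lam=8.0, mu=1.0, c=10, K=14)
```

    Setting K = c recovers the pure-loss (Erlang B) case, so the model also shows directly how much blocking the reservation queue removes.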

  12. Priority queuing models for hospital intensive care units and impacts to severe case patients.

    PubMed

    Hagen, Matthew S; Jopling, Jeffrey K; Buchman, Timothy G; Lee, Eva K

    2013-01-01

    This paper examines several different queuing models for intensive care units (ICU) and the effects on wait times, utilization, return rates, mortalities, and number of patients served. Five separate intensive care units at an urban hospital are analyzed and distributions are fitted for arrivals and service durations. A system-based simulation model is built to capture all possible cases of patient flow after ICU admission. These include mortalities and returns before and after hospital exits. Patients are grouped into 9 different classes that are categorized by severity and length of stay (LOS). Each queuing model varies by the policies that are permitted and by the order in which patients are admitted. The first set of models does not prioritize patients, but examines the advantages of smoothing the operating schedule for elective surgeries. The second set analyzes the differences between prioritizing admissions by expected LOS or patient severity. The last set permits early ICU discharges, and conservative and aggressive bumping policies are contrasted. It was found that prioritizing patients by severity considerably reduced delays for critical cases, but also increased the average waiting time for all patients. Aggressive bumping significantly raised the return and mortality rates, but more conservative methods balance quality and efficiency with lowered wait times without serious consequences. PMID:24551379
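As a toy illustration of the severity-prioritized admission the paper simulates, the sketch below runs a single queue with two severity classes. All rates, the class structure, and the bed count are hypothetical and far simpler than the paper's nine classes and five units.

```python
import heapq, random

def simulate_icu(beds, horizon, seed=0):
    """Toy severity-priority ICU admission queue: whenever a bed is free,
    the waiting patient with the highest severity (lowest class number)
    is admitted first.  Returns the mean wait per severity class."""
    rng = random.Random(seed)
    free_at = [0.0] * beds                 # min-heap of bed next-free times
    waiting = []                           # min-heap of (severity, arrival)
    waits = {1: [], 2: []}                 # 1 = severe, 2 = moderate
    t = 0.0
    while t < horizon:
        t += rng.expovariate(1.0)          # Poisson arrivals, rate 1/hour
        heapq.heappush(waiting, (rng.choice([1, 2]), t))
        while waiting and free_at[0] <= t: # admit into any bed free by now
            sev, arr = heapq.heappop(waiting)
            start = max(heapq.heappop(free_at), arr)
            waits[sev].append(start - arr)
            heapq.heappush(free_at, start + rng.expovariate(0.5))  # LOS ~ 2 h
    return {s: sum(w) / len(w) for s, w in waits.items() if w}

print(simulate_icu(beds=3, horizon=1000.0))
```

Because the waiting heap is keyed by (severity, arrival time), severe patients always jump ahead of moderate ones, mirroring the severity-first policy whose side effects the paper measures.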

  13. Using a segregation measure for the workload characterization of multi-class queuing networks

    SciTech Connect

    Dowdy, L.W.; Krantz, A.T.; Leuze, M.R.

    1989-01-01

    When a queuing network model of a computer system is constructed, the workload is characterized by several parameters known as the device demands. The demand that every customer places upon every device must be specified. A complete workload characterization of a K device network with N different customers contains N * K parameters. Substantial savings in complexity result if the number of workload parameters is decreased. If, for example, only the average demands on the K devices are used in the workload characterization, the overhead of parameter collection is reduced, and the solution of the queuing network model is simplified. With this approach, however, the multi-class system is represented by a single-class model. A loss of accuracy results. It has been recently demonstrated that the performance of a multi-class network is bounded below by its single-class counterpart model and is bounded above by a simple function based upon the single-class model. In this paper, a new workload characterization technique is proposed which requires: the K average device demands for the single-class counterpart model and a segregation measure, a value which indicates the degree to which different customers tend to utilize different parts of the network. The segregation measure specifies the point between the two bounds where the multi-class model's performance lies. This measure is quite intuitive and is simple to calculate. The technique provides an accurate estimate of the performance of a multi-class network. 6 refs., 5 figs., 3 tabs.
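The single-class counterpart model referred to above can be solved exactly by Mean Value Analysis (MVA) once the K average device demands are known. A sketch of the standard algorithm, with illustrative demands:

```python
def mva(demands, n_customers):
    """Exact Mean Value Analysis for a closed, single-class, product-form
    queuing network of queuing stations.
    demands[k] = average service demand of a customer at device k.
    Returns (throughput, per-device mean queue lengths)."""
    q = [0.0] * len(demands)              # queue lengths with n-1 customers
    for n in range(1, n_customers + 1):
        # residence time at device k: R_k = D_k * (1 + Q_k(n-1))
        r = [d * (1 + qk) for d, qk in zip(demands, q)]
        x = n / sum(r)                    # throughput via Little's law
        q = [x * rk for rk in r]          # new queue lengths Q_k(n)
    return x, q

x, q = mva([0.2, 0.1, 0.05], n_customers=10)
print(x, q)
```

The segregation measure proposed in the paper then interpolates between this single-class solution and the upper bound it induces for the multi-class network.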

  15. NAS Requirements Checklist for Job Queuing/Scheduling Software

    NASA Technical Reports Server (NTRS)

    Jones, James Patton

    1996-01-01

    The increasing reliability of parallel systems and clusters of computers has resulted in these systems becoming more attractive for true production workloads. Today, the primary obstacle to production use of clusters of computers is the lack of a functional and robust Job Management System for parallel applications. This document provides a checklist of NAS requirements for job queuing and scheduling in order to make the most efficient use of parallel systems and clusters for parallel applications. Future requirements are also identified to assist software vendors with design planning.

  16. Based on Regular Expression Matching of Evaluation of the Task Performance in WSN: A Queue Theory Approach

    PubMed Central

    Cui, Kai; Zhou, Kuanjiu; Yu, Yanshuo

    2014-01-01

    Because wireless sensor networks have limited resources, inefficient real-time communication scheduling, weak security, and related defects, a queuing performance evaluation approach based on regular expression matching is proposed. The method consists of a matching preprocessing phase, a validation phase, and a queuing-model performance evaluation phase. First, subsets of related sequences are generated in the preprocessing phase to guide distributed matching in the validation phase. Second, in the validation phase, feature subsets are clustered and the matching table is compressed, making distributed parallel matching more convenient. Finally, the dynamic task-scheduling performance of the sensor network is evaluated with the queuing model. Experiments show that the approach ensures accurate matching with computational efficiency above 70%; it not only effectively detects data packets and enforces access control, but also uses the queuing method to determine task-scheduling parameters in wireless sensor networks. The method has good applicability to medium- and large-scale distributed wireless nodes. PMID:25401151

  17. A message-queuing framework for STAR's online monitoring and metadata collection

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.; Betts, W.

    2011-12-01

    We report our experience migrating STAR's Online Services (Run Control System, Data Acquisition System, Slow Control System and Subsystem Monitoring) from direct read/write database accesses to a modern non-blocking, message-oriented infrastructure. Based on the Advanced Message Queuing Protocol (AMQP) standard, this novel approach does not prescribe the message data structure, allowing great flexibility in its use. After careful consideration, we chose Google Protocol Buffers as our primary (de)serialization format for structured data exchange. This migration allows us to reduce the overall system complexity and greatly improve the reliability of metadata collection and the performance of our online services in general. We present this new framework through an overview of its software architecture, providing details about our staged and non-disruptive migration process as well as the implementation of pluggable components that allow future improvements without compromising the stability and availability of services.

  18. Design and development of cell queuing, processing, and scheduling modules for the iPOINT input-buffered ATM testbed

    NASA Astrophysics Data System (ADS)

    Duan, Haoran

    1997-12-01

    This dissertation presents the concepts, principles, performance, and implementation of input queuing and cell-scheduling modules for the Illinois Pulsar-based Optical INTerconnect (iPOINT) input-buffered Asynchronous Transfer Mode (ATM) testbed. Input queuing (IQ) ATM switches are well suited to meet the requirements of current and future ultra-broadband ATM networks. The IQ structure imposes minimum memory bandwidth requirements for cell buffering, tolerates bursty traffic, and utilizes memory efficiently for multicast traffic. The lack of efficient cell queuing and scheduling solutions has been a major barrier to building high-performance, scalable IQ-based ATM switches. This dissertation proposes a new Three-Dimensional Queue (3DQ) and a novel Matrix Unit Cell Scheduler (MUCS) to remove this barrier. 3DQ uses a linked-list architecture based on Synchronous Random Access Memory (SRAM) to combine the individual advantages of per-virtual-circuit (per-VC) queuing, priority queuing, and N-destination queuing. It avoids Head of Line (HOL) blocking and provides per-VC Quality of Service (QoS) enforcement mechanisms. Computer simulation results verify the QoS capabilities of 3DQ. For multicast traffic, 3DQ provides efficient usage of cell-buffering memory by storing multicast cells only once. Further, the multicast mechanism of 3DQ prevents a congested destination port from blocking other less-loaded ports. The 3DQ principle has been prototyped in the Illinois Input Queue (iiQueue) module. Using Field Programmable Gate Array (FPGA) devices and SRAM modules, integrated on a Printed Circuit Board (PCB), iiQueue can process incoming traffic at 800 Mb/s. Using faster circuit technology, the same design is expected to operate at the OC-48 rate (2.5 Gb/s). MUCS resolves output contention by evaluating the weight index of each candidate and selecting the heaviest. It achieves near-optimal scheduling and has a very short response time. The algorithm originates from a heuristic strategy that leads to 'socially optimal' solutions, yielding a maximum number of contention-free cells being scheduled. A novel mixed digital-analog circuit has been designed to implement the MUCS core functionality. The MUCS circuit maps the cell-scheduling computation to capacitor charging and discharging procedures that are conducted fully in parallel. The design has a uniform circuit structure, low interconnect counts, and low chip I/O counts. Using 2 μm CMOS technology, the design operates on a 100 MHz clock and finds a near-optimal solution within a linear processing time. The circuit has been verified at the transistor level by HSPICE simulation. During this research, a five-port IQ-based optoelectronic iPOINT ATM switch has been developed and demonstrated. It has been fully functional with an aggregate throughput of 800 Mb/s. The second-generation IQ-based switch is currently under development. Equipped with iiQueue modules and the MUCS module, the new switch system will deliver multi-gigabit aggregate throughput, eliminate HOL blocking, provide per-VC QoS, and achieve near-100% link bandwidth utilization. Complete documentation of the input modules and trunk module for the existing testbed, and of 3DQ, iiQueue, and MUCS for the second-generation testbed, is given in this dissertation.

  19. Theory-Based Evaluation: Reflections Ten Years On. Theory-Based Evaluation: Past, Present, and Future

    ERIC Educational Resources Information Center

    Rogers, Patricia J.; Weiss, Carol H.

    2007-01-01

    This chapter begins with a brief introduction by Rogers, in which she highlights the continued salience of Carol Weiss's decade-old questions about theory-based evaluation. Theory-based evaluation has developed significantly since Carol Weiss's chapter was first published ten years ago. In 1997 Weiss pointed to theory-based evaluation being mostly…

  20. Non-Equilibrium Statistical Physics of Currents in Network Queuing

    NASA Astrophysics Data System (ADS)

    Chertkov, Michael; Chernyak, Vladimir; Goldberg, David; Turitsyn, Konstantin

    2010-03-01

    We present a framework for studying large deviations of currents in a queuing network viewed as a non-equilibrium system of interacting particles. The network is completely specified by its underlying graphical structure, the number of servers (type of interaction) at each node, and the Poisson transition rates between nodes/stations. We focus on analyzing the statistics of currents over the network for the class of stable (statistically steady) networks. Some of our results are general (and surprising) explicit statements and some are conjectures, validated on a network with feedback which allows an independent spectral analysis. In particular, we show that for sufficiently strong atypical currents the system experiences a dynamical transition into a "congested" regime, characterized by the saturation of certain servers in the network. We also discuss possible applications of these results for the analysis and control of traffic flows in transportation networks and of scheduling power flows in electric grids.

  1. Evaluation of Job Queuing/Scheduling Software: Phase I Report

    NASA Technical Reports Server (NTRS)

    Jones, James Patton

    1996-01-01

    The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, the Numerical Aerodynamic Simulation (NAS) supercomputer facility compiled a requirements checklist for job queuing/scheduling software. Next, NAS began an evaluation of the leading job management system (JMS) software packages against the checklist. This report describes the three-phase evaluation process, and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still insufficient, even in the leading JMSs. However, by ranking each JMS evaluated against the requirements, we provide data that will be useful to other sites in selecting a JMS.

  2. Second Evaluation of Job Queuing/Scheduling Software. Phase 1

    NASA Technical Reports Server (NTRS)

    Jones, James Patton; Brickell, Cristy; Chancellor, Marisa (Technical Monitor)

    1997-01-01

    The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, NAS compiled a requirements checklist for job queuing/scheduling software. Next, NAS evaluated the leading job management system (JMS) software packages against the checklist. A year has now elapsed since the first comparison was published, and NAS has repeated the evaluation. This report describes this second evaluation, and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still lacking; however, definite progress has been made by the vendors in correcting the deficiencies. This report is supplemented by a WWW interface to the data collected, to aid other sites in extracting the evaluation information on specific requirements of interest.

  3. Computer Simulation of a Queuing System in a Mathematical Modeling Course.

    ERIC Educational Resources Information Center

    Eyob, Ephrem

    1990-01-01

    The results of a simulation model of a queuing system are reported. Use in an introductory quantitative analysis course to enhance students' computer and quantitative modeling knowledge is described. (CW)

  4. On the design of optical buffer for optical input-queued switches with quality of service guarantees

    NASA Astrophysics Data System (ADS)

    Lin, Chun-Yuan; Chen, Chien

    2005-02-01

    This paper presents a quality-of-service (QoS) enabled optical delay line (ODL) architecture that resolves resource contention and supports multilevel priority queues in an optical packet switch. ODLs have been used in optical packet switches to resolve resource contention; however, because packets travel continuously within an ODL, random access to the packets is limited and the packet loss probability increases. Moreover, multiple ODL sets are usually needed to realize multiple priority queues in support of QoS. In this paper, a new Unicast Recirculation ODL (URODL) architecture is proposed to resolve the output contention problem in an input-queued optical packet switch. To improve the relatively poor throughput caused by Head of Line (HOL) blocking in the input-queued switch, we modify the URODL to support a more efficient window-based lookahead scheduling algorithm. Furthermore, a control strategy is designed to turn a single URODL set into multiple logical queues holding packets of different priorities. The simulation results show our URODL model reduces packet loss effectively while supporting QoS. This URODL model can be easily implemented and managed in a fast optical packet switch.

  5. Basing quantum theory on information processing

    NASA Astrophysics Data System (ADS)

    Barnum, Howard

    2008-03-01

    I consider information-based derivations of the quantum formalism, in a framework encompassing quantum and classical theory and a broad spectrum of theories serving as foils to them. The most ambitious hope for such a derivation is a role analogous to Einstein's development of the dynamics and kinetics of macroscopic bodies, and later of their gravitational interactions, on the basis of simple principles with clear operational meanings and experimental consequences. Short of this, it could still provide a principled understanding of the features of quantum mechanics that account for its greater-than-classical information-processing power, helping guide the search for new quantum algorithms and protocols. I summarize the convex operational framework for theories, and discuss information-processing in theories therein. Results include the fact that information that can be obtained without disturbance is inherently classical, generalized no-cloning and no-broadcasting theorems, exponentially secure bit commitment in all non-classical theories without entanglement, properties of theories that allow teleportation, and properties of theories that allow ``remote steering'' of ensembles using entanglement. Joint work with collaborators including Jonathan Barrett, Matthew Leifer, Alexander Wilce, Oscar Dahlsten, and Ben Toner.

  6. Spectrally queued feature selection for robotic visual odometry

    NASA Astrophysics Data System (ADS)

    Pirozzo, David M.; Frederick, Philip A.; Hunt, Shawn; Theisen, Bernard; Del Rose, Mike

    2011-01-01

    Over the last two decades, research in Unmanned Vehicles (UV) has rapidly progressed and become more influenced by the field of biological sciences. Researchers have been investigating mechanical aspects of varying species to improve UV air and ground intrinsic mobility; exploring the computational aspects of the brain for the development of pattern recognition and decision algorithms; and exploring the perception capabilities of numerous animals and insects. This paper describes a 3-month exploratory applied research effort performed at the US Army Research, Development and Engineering Command's (RDECOM) Tank Automotive Research, Development and Engineering Center (TARDEC) in the area of biologically inspired spectrally augmented feature selection for robotic visual odometry. The motivation for this applied research was to develop a feasibility analysis on multi-spectrally queued feature selection, with improved temporal stability, for the purposes of visual odometry. The intended application is future semi-autonomous Unmanned Ground Vehicle (UGV) control, as the richness of data sets required to enable human-like behavior in these systems has yet to be defined.

  7. Queuing transitions in the asymmetric simple exclusion process.

    PubMed

    Ha, Meesoon; Timonen, Jussi; den Nijs, Marcel

    2003-11-01

    Stochastic driven flow along a channel can be modeled by the asymmetric simple exclusion process. We confirm numerically the presence of a dynamic queuing phase transition at a nonzero obstruction strength, and establish its scaling properties. Below the transition, the traffic jam is macroscopic in the sense that the length of the queue scales linearly with system size. Above the transition, only a power-law shaped queue remains. Its density profile scales as δρ ~ x^(-ν) with ν = 1/3, where x is the distance from the obstacle. We construct a heuristic argument, indicating that the exponent ν = 1/3 is universal and independent of the dynamic exponent of the underlying dynamic process. Fast bonds create only power-law shaped depletion queues, with an exponent that could be equal to ν = 2/3, but the numerical results consistently yield somewhat smaller values, ν ≈ 0.63(3). The implications of these results for faceting of growing interfaces and localization of directed polymers in random media, both in the presence of a columnar defect, are pointed out as well. PMID:14682861
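A minimal Monte Carlo sketch of the setting described above: a totally asymmetric exclusion process (TASEP) on a ring with a single weak bond, behind which a queue builds up. The parameters are arbitrary and this is not the authors' code.

```python
import random

def tasep_slow_bond(length, sweeps, p_slow, seed=1):
    """Monte Carlo TASEP on a ring at half filling with one obstruction:
    hops across the bond from site length-1 to site 0 succeed only with
    probability p_slow; all other hops succeed with probability 1.
    Returns the final occupation profile (a queue builds up behind the
    slow bond when p_slow is small)."""
    rng = random.Random(seed)
    occ = [i % 2 for i in range(length)]      # half filling, N conserved
    for _ in range(sweeps * length):
        i = rng.randrange(length)             # pick a random site
        j = (i + 1) % length
        if occ[i] == 1 and occ[j] == 0:
            if i < length - 1 or rng.random() < p_slow:
                occ[i], occ[j] = 0, 1         # particle hops forward
    return occ

profile = tasep_slow_bond(length=100, sweeps=2000, p_slow=0.3)
print(sum(profile))  # particle number is conserved: 50
```

Averaging such profiles over many runs and measuring the density decay ahead of the obstruction is how one would probe the δρ ~ x^(-ν) scaling numerically.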

  8. Application of queuing models to electronic toll collection

    NASA Astrophysics Data System (ADS)

    Zarrillo, Marguerite L.; Radwan, A. E.; Al-Deek, H. M.

    1998-01-01

    Electronic Toll Collection (ETC) via Automatic Vehicle Identification (AVI) technology has significantly altered traffic operations during toll collection. In particular, the average processing rate of a lane providing both ETC and traditional service fluctuates over the rush hour between the average processing rate of the traditional service and the capacity of the ETC service. This study develops a queuing model to address the changing processing rates of the different mixed lanes. The model is applied to the westbound 9-lane portion of the Holland East Plaza in Orlando, Florida. Data is evaluated for 6 different rush hours that include 3 different configuration patterns implemented over a period of 3 years. In the first configuration, only traditional toll collection services are provided. In another configuration, all traditional lanes become mixed to include ETC except for the center lane, which becomes a lane dedicated solely to ETC service. In a final configuration, two lanes become dedicated to ETC service.
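The fluctuating mixed-lane processing rate can be illustrated with a service-time-weighted average of the two service types. The rates below are hypothetical, not the plaza's measured values.

```python
def mixed_lane_rate(share_etc, rate_manual, rate_etc):
    """Average processing rate (vehicles/hour) of a lane serving both ETC
    and manual vehicles: the harmonic, service-time-weighted mean of the
    two rates, which moves from rate_manual toward rate_etc as the ETC
    share of traffic grows."""
    mean_service_time = share_etc / rate_etc + (1 - share_etc) / rate_manual
    return 1.0 / mean_service_time

# Hypothetical rates: 400 veh/h manual, 1800 veh/h ETC, 50% ETC traffic
print(round(mixed_lane_rate(0.5, 400, 1800)))  # ~655 vehicles/hour
```

Feeding this time-varying rate into a standard queuing model is the essence of the mixed-lane analysis the abstract describes.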

  9. Jigsaw Cooperative Learning: Acid-Base Theories

    ERIC Educational Resources Information Center

    Tarhan, Leman; Sesen, Burcin Acar

    2012-01-01

    This study focused on investigating the effectiveness of jigsaw cooperative learning instruction on first-year undergraduates' understanding of acid-base theories. Undergraduates' opinions about jigsaw cooperative learning instruction were also investigated. The participants of this study were 38 first-year undergraduates in chemistry education…

  11. Evacuation time estimate for total pedestrian evacuation using a queuing network model and volunteered geographic information

    NASA Astrophysics Data System (ADS)

    Kunwar, Bharat; Simini, Filippo; Johansson, Anders

    2016-02-01

    Estimating city evacuation time is a nontrivial problem due to the interaction between thousands of individual agents, giving rise to various collective phenomena, such as bottleneck formation, intermittent flow, and stop-and-go waves. We present a mean field approach to draw relationships between road network spatial attributes, the number of evacuees, and the resultant evacuation time estimate (ETE). Using volunteered geographic information, we divide 50 United Kingdom cities into a total of 704 catchment areas (CAs) which we define as an area where all agents share the same nearest exit node. 90% of the agents are within ≈6,847 m of CA exit nodes with ≈13,778 agents/CA. We establish a characteristic flow rate from catchment area attributes (population, distance to exit node, and exit node width) and a mean flow rate in a free-flow regime by simulating total evacuations using an agent-based "queuing network" model. We use these variables to determine a relationship between catchment area attributes and resultant ETEs. This relationship could enable emergency planners to make a rapid appraisal of evacuation strategies and help support decisions in the run up to a crisis.
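A back-of-the-envelope ETE in the mean-field spirit described above combines a travel-time term with an exit-discharge term. The functional form and constants below are illustrative assumptions, not the paper's fitted model.

```python
def evacuation_time_estimate(population, mean_distance_m, exit_width_m,
                             walk_speed=1.34, flow_per_m=1.33):
    """Rough two-term ETE in seconds: travel time of the mean pedestrian
    plus the time for the whole population to pass the exit bottleneck.
    walk_speed (m/s) and flow_per_m (people/m/s) are typical pedestrian
    values, assumed here rather than taken from the paper."""
    travel = mean_distance_m / walk_speed
    discharge = population / (flow_per_m * exit_width_m)
    return max(travel, discharge)     # the slower process dominates

# Illustrative CA near the paper's averages: ~13,778 agents, ~6,847 m,
# with an assumed 5 m total exit width
print(round(evacuation_time_estimate(13778, 6847, 5.0) / 60))  # ≈ 85 minutes
```

With these numbers the walk to the exit dominates; shrinking the exit width enough would flip the bottleneck to the discharge term, which is the kind of trade-off a rapid appraisal tool would expose.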

  12. Modelling Pedestrian Travel Time and the Design of Facilities: A Queuing Approach

    PubMed Central

    Rahman, Khalidur; Abdul Ghani, Noraida; Abdulbasah Kamil, Anton; Mustafa, Adli; Kabir Chowdhury, Md. Ahmed

    2013-01-01

    Pedestrian movements are the consequence of several complex and stochastic facts. The modelling of pedestrian movements and the ability to predict the travel time are useful for evaluating the performance of a pedestrian facility. However, only a few studies can be found that incorporate the design of the facility, local pedestrian body dimensions, the delay experienced by the pedestrians, and level of service to the pedestrian movements. In this paper, a queuing based analytical model is developed as a function of relevant determinants and functional factors to predict the travel time on pedestrian facilities. The model can be used to assess the overall serving rate or performance of a facility layout and correlate it to the level of service that it is possible to provide to pedestrians. It also offers clear guidance on the design and sizing of pedestrian facilities. The model is empirically validated and is found to be a robust tool for understanding how well a particular walking facility enables comfortable and convenient pedestrian movement. A sensitivity analysis is also performed to assess the impact of some crucial parameters of the developed model on the performance of pedestrian facilities. PMID:23691055

  13. The Scope of Usage-Based Theory

    PubMed Central

    Ibbotson, Paul

    2013-01-01

    Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the “cognitive commitment” of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure vs. cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works. PMID:23658552

  14. Flocculation control study based on fractal theory

    PubMed Central

    Chang, Ying; Liu, Qian-jun; Zhang, Jin-song

    2005-01-01

    A study on flocculation control based on fractal theory was carried out. An optimization test of chemical coagulant dosage confirmed that the fractal dimension reflects the degree of flocculation and the settling characteristics of aggregates, and that it correlates well with the turbidity of the settled effluent. The fractal dimension can therefore be used as the major parameter for flocculation system control, enabling self-acting adjustment of the chemical coagulant dosage. The fractal-dimension flocculation control system was then used to study the effects of various flocculation parameters, among them the dependency among aggregate fractal dimension, chemical coagulant dosage, and settled-effluent turbidity under conditions of variable water quality and quantity. Basic experimental data were obtained for establishing a chemical coagulant dosage control model based mainly on the aggregate fractal dimension. PMID:16187420

  15. Towards a Faith-Based Program Theory: A Reconceptualization of Program Theory

    ERIC Educational Resources Information Center

    Harden, Mark G.

    2006-01-01

    A meta-program theory is proposed to overcome the limitations and improve the use of program theory as an approach to faith-based program evaluation. The essentials for understanding religious organizations, their various programs, and faith and spirituality are discussed to support a rationale for developing a faith-based program theory that…

  16. MODELING AND PERFORMANCE EVALUATION FOR AVIATION SECURITY CARGO INSPECTION QUEUING SYSTEM

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Rose, Terri A; Brumback, Daryl L

    2009-01-01

    Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we conduct a performance evaluation study of an aviation security cargo inspection queuing system for material flow and accountability. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered, such as system capacity, residual capacity, and throughput. These metrics are performance indicators of the system's ability to service current needs and its response capacity to additional requests. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures will reduce the overall cost and shipping delays associated with the new inspection requirements.
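The capacity metrics named above are straightforward to compute for a bank of parallel inspection stations; a hedged sketch with hypothetical rates (not values from the study):

```python
def inspection_metrics(arrival_rate, service_rate, stations):
    """Throughput, utilization, and residual capacity for a bank of
    parallel inspection stations (simple capacity bookkeeping; rates
    are items per hour)."""
    capacity = stations * service_rate
    throughput = min(arrival_rate, capacity)
    return {
        "throughput": throughput,
        "utilization": throughput / capacity,
        "residual_capacity": capacity - throughput,
    }

# Hypothetical: 6 stations at 20 items/hour each, 90 items/hour arriving
print(inspection_metrics(arrival_rate=90, service_rate=20, stations=6))
```

Residual capacity here is exactly the "response capacity to additional requests" the abstract refers to: the headroom left before arrivals saturate the station bank.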

  17. Discrete-time Queuing Analysis of Opportunistic Spectrum Access: Single User Case

    NASA Astrophysics Data System (ADS)

    Wang, Jin-long; Xu, Yu-hua; Gao, Zhan; Wu, Qi-hui

    2011-11-01

    This article studies the discrete-time queuing dynamics of opportunistic spectrum access (OSA) systems, in which the secondary user seeks spectrum vacancies between bursty transmissions of the primary user to communicate. Since spectrum sensing and data transmission cannot be performed simultaneously, the secondary user employs a sensing-then-transmission strategy to detect the presence of the primary user before accessing the licensed channel. Consequently, the transmission of the secondary user is periodically suspended for spectrum sensing. To capture the discontinuous transmission nature of the secondary user, we introduce a discrete-time queuing model subject to bursty preemption to describe the behavior of the secondary user. Specifically, we derive some important metrics of the secondary user, including secondary spectrum utilization ratio, buffer length, packet delay and packet dropping ratio. Finally, simulation results validate the proposed theoretical model and reveal that the theoretical results fit the simulated results well.
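A minimal discrete-time simulation of the sensing-then-transmission behavior described above, tracking buffer occupancy and packet drops. All probabilities, frame sizes, and the buffer size are hypothetical, not the paper's parameters.

```python
import random

def osa_queue_sim(frames, sense_slots, tx_slots, p_arrival, p_busy,
                  buffer_size=50, seed=7):
    """Discrete-time sketch of a secondary user: each frame spends
    sense_slots sensing (no service) then tx_slots transmitting one
    packet per slot, but only if the primary user is absent for that
    frame (probability 1 - p_busy).  Bernoulli packet arrivals fill a
    finite buffer.  Returns (mean buffer length, packet drop ratio)."""
    rng = random.Random(seed)
    buf, dropped, arrived, occupancy = 0, 0, 0, 0
    for _ in range(frames):
        primary_busy = rng.random() < p_busy
        for slot in range(sense_slots + tx_slots):
            if rng.random() < p_arrival:        # Bernoulli arrival
                arrived += 1
                if buf < buffer_size:
                    buf += 1
                else:
                    dropped += 1                # buffer full: drop
            in_tx_phase = slot >= sense_slots
            if in_tx_phase and not primary_busy and buf:
                buf -= 1                        # one packet served
            occupancy += buf
    slots = frames * (sense_slots + tx_slots)
    return occupancy / slots, dropped / max(arrived, 1)

print(osa_queue_sim(frames=10000, sense_slots=2, tx_slots=8,
                    p_arrival=0.4, p_busy=0.3))
```

Sweeping sense_slots against tx_slots in such a simulation exposes the sensing/throughput trade-off that the paper's analytical model captures in closed form.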

  18. A Multiple Constraint Queuing Model for Predicting Current and Future Terminal Area Capacities

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2004-01-01

    A new queuing model is being developed to evaluate the capacity benefits of several new concepts for terminal airspace operations. The major innovation is the ability to support a wide variety of multiple constraints for modeling the scheduling logic of several concepts. Among the constraints modeled are in-trail separation, separation between aircraft landing on parallel runways, in-trail separation at terminal area entry points, and permissible terminal area flight times.

  19. Measured effects of user and clinical engineer training using a queuing model.

    PubMed

    Cruz, A Miguel; Rodríguez Denis, E; Sánchez Villar, C; Pozo Puñales, E T; Vergara Perez, I

    2003-01-01

    This article puts forward a new proposal to calculate the count, turnaround, response, and service time of work orders in a clinical engineering (CE) department, using a queuing model as a measurement tool. The proposal was tested in a 600-bed hospital with an inventory of 1094 medical devices and 6 full-time clinical engineers. In April 1999, a simulation of the proposal (with ARENA 3.01, developed by System Modeling Corporation) was performed, applying desired values to the queuing model. At the end of 2002, real work order data were retrieved from the database. As predicted, the results showed that all the indicators of nonscheduled work orders decreased. Response and turnaround time were reduced from 27 to 0.56 hours and from 27.48 to 1.13 hours, respectively. The backlog of outstanding repair orders fell from 22 per month between April 1999 and January 2000 to 4 in December 2002. The queuing model also helped to measure the positive effects on arrival and service rates when users and clinical engineers were trained. The difference between simulated and real values was under 5%. PMID:14699735
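
    Response and turnaround times of this kind are commonly estimated with the Erlang C formula for an M/M/c queue. A sketch follows for the six-engineer case; the arrival and service rates are hypothetical placeholders, not the hospital's actual data:

```python
import math

def erlang_c(c, a):
    """Probability that an arriving work order must wait in an M/M/c queue,
    where a = arrival rate / per-server service rate is the offered load."""
    if a >= c:
        raise ValueError("unstable: offered load must be below the number of servers")
    s = sum(a**k / math.factorial(k) for k in range(c))
    tail = a**c / math.factorial(c) * c / (c - a)
    return tail / (s + tail)

def mean_wait(c, lam, mu):
    """Mean time a work order waits before service starts (same time unit as rates)."""
    return erlang_c(c, lam / mu) / (c * mu - lam)

# Hypothetical: 6 engineers, 4 work orders/hour arriving, 1 order/hour per engineer.
print(f"P(wait) = {erlang_c(6, 4.0):.3f}, mean wait = {mean_wait(6, 4.0, 1.0):.3f} h")
```

    Training that raises the per-engineer service rate mu shrinks the offered load, which is how the article's measured rate improvements translate into shorter response times.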

  20. Assessing the Queuing Process Using Data Envelopment Analysis: an Application in Health Centres.

    PubMed

    Safdar, Komal A; Emrouznejad, Ali; Dey, Prasanta K

    2016-01-01

    Queuing is one of the key criteria for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most studies of patient flow systems have aimed at improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at an outpatients' department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor and pharmacy. The DEA results indicated that waiting times and other related queuing variables need considerable minimisation at both stages. PMID:26558394

  1. An Improved Call Admission Control Mechanism with Prioritized Handoff Queuing Scheme for BWA Networks

    NASA Astrophysics Data System (ADS)

    Chowdhury, Prasun; Saha Misra, Iti

    2014-10-01

    Growing demand for Broadband Wireless Access (BWA) networks requires a guaranteed Quality of Service (QoS) to manage the seamless transmission of heterogeneous handoff calls. To this end, this paper proposes an improved Call Admission Control (CAC) mechanism with a prioritized handoff queuing scheme that aims to reduce the dropping probability of handoff calls. Handoff calls are queued when no bandwidth is available even after the allowable bandwidth degradation of ongoing calls, and they are admitted into the network, with higher priority than newly originated calls, when an ongoing call terminates. An analytical Markov model of the proposed CAC mechanism is developed to analyze various performance parameters. Analytical results show that the proposed CAC with handoff queuing prioritizes handoff calls effectively and reduces the dropping probability of the system by 78.57% for real-time traffic without increasing the number of failed new call attempts. This results in increased bandwidth utilization of the network.
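
    The core prioritization idea, queuing handoff calls while blocking new calls once all channels are busy, can be sketched as a small birth-death chain. This is a simplification of the paper's Markov model (it ignores bandwidth degradation and queued-call impatience), and all channel counts and rates below are hypothetical:

```python
def handoff_queue_blocking(C, Q, lam_new, lam_h, mu):
    """Steady state of a birth-death sketch of prioritized handoff queuing:
    C channels; handoff calls queue (up to Q) when all channels are busy,
    while new calls are simply blocked. Queued handoffs are assumed patient.
    Returns (new-call blocking probability, handoff dropping probability)."""
    # Unnormalized state probabilities p[n] for n calls in the system.
    p = [1.0]
    for n in range(C + Q):
        arrival = lam_new + lam_h if n < C else lam_h   # only handoffs may queue
        departure = min(n + 1, C) * mu                  # at most C calls in service
        p.append(p[-1] * arrival / departure)
    z = sum(p)
    p = [x / z for x in p]
    block_new = sum(p[C:])       # new calls blocked whenever all channels are busy
    drop_handoff = p[C + Q]      # handoffs dropped only when the queue is also full
    return block_new, drop_handoff
```

    Because a handoff is dropped only in the single full-queue state, its dropping probability is strictly below the new-call blocking probability, which is the prioritization effect the paper quantifies.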

  2. Feature-Based Binding and Phase Theory

    ERIC Educational Resources Information Center

    Antonenko, Andrei

    2012-01-01

    Current theories of binding cannot provide a uniform account for many facts associated with the distribution of anaphors, such as long-distance binding effects and the subject-orientation of monomorphemic anaphors. Further, traditional binding theory is incompatible with minimalist assumptions. In this dissertation I propose an analysis of…

  3. Theory Based Approaches to Learning. Implications for Adult Educators.

    ERIC Educational Resources Information Center

    Bolton, Elizabeth B.; Jones, Edward V.

    This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…

  4. Theory-Based University Admissions Testing for a New Millennium

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2004-01-01

    This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

  5. Continuing Bonds in Bereavement: An Attachment Theory Based Perspective

    ERIC Educational Resources Information Center

    Field, Nigel P.; Gao, Beryl; Paderna, Lisa

    2005-01-01

    An attachment theory based perspective on the continuing bond to the deceased (CB) is proposed. The value of attachment theory in specifying the normative course of CB expression and in identifying adaptive versus maladaptive variants of CB expression based on their deviation from this normative course is outlined. The role of individual…

  6. A Theory-Based Computer Tutorial Model.

    ERIC Educational Resources Information Center

    Dixon, Robert C.; Clapp, Elizabeth J.

    Because of the need for models to illustrate some possible answers to practical courseware development questions, a specific, three-section model incorporating the Corrective Feedback Paradigm (PCP) is advanced for applying theory to courseware. The model is reconstructed feature-by-feature against a framework of a hypothetical, one-to-one,…

  7. Recursive renormalization group theory based subgrid modeling

    NASA Technical Reports Server (NTRS)

    Zhou, YE

    1991-01-01

    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  8. Maximum entropy principle based estimation of performance distribution in queueing theory.

    PubMed

    He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

    2014-01-01

    In research on queuing systems, it is common practice, in order to determine the system state, to assume that the system is stable and that the distributions of the customer arrival ratio and service ratio are known. In this study, the queuing system is treated as a black box: no assumptions are made about the distributions of the arrival and service ratios, and only the assumption of stability is retained. By applying the principle of maximum entropy, the performance distribution of queuing systems is derived from easily accessible indexes, such as the capacity of the system, the mean number of customers in the system, and the mean utilization of the servers. Some special cases are modeled and their performance distributions are derived. Using the chi-square goodness-of-fit test, the accuracy and practical generality of the maximum entropy approach are demonstrated. PMID:25207992
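
    As a simplified illustration of the approach, using only the system capacity and the mean number in system as constraints (a subset of the indexes the paper uses), the maximum-entropy queue-length distribution can be found by solving for a single Lagrange multiplier:

```python
import math

def maxent_queue_dist(K, mean_n, iters=200):
    """Maximum-entropy distribution of the number in system on {0, ..., K},
    given only the system capacity K and the mean number of customers.
    (A simplified one-constraint version; the utilization constraint is omitted.)"""
    def mean_for(beta):
        w = [math.exp(-beta * n) for n in range(K + 1)]
        z = sum(w)
        return sum(n * wn for n, wn in enumerate(w)) / z

    lo, hi = -30.0, 30.0          # multiplier bracket, adequate for moderate K
    for _ in range(iters):        # bisection: mean_for() is decreasing in beta
        mid = (lo + hi) / 2
        if mean_for(mid) > mean_n:
            lo = mid              # mean too high -> need a larger multiplier
        else:
            hi = mid
    beta = (lo + hi) / 2
    w = [math.exp(-beta * n) for n in range(K + 1)]
    z = sum(w)
    return [wn / z for wn in w]
```

    With the mean at exactly K/2, the multiplier goes to zero and the result is uniform; any extra constraint (e.g., server utilization) tilts the distribution further, as in the paper.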

  9. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    PubMed

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing the quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 children aged 5-6 years from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms. PMID:24283669

  10. Theory of friction based on brittle fracture

    USGS Publications Warehouse

    Byerlee, J.D.

    1967-01-01

    A theory of friction is presented that may be more applicable to geologic materials than the classic Bowden and Tabor theory. In the model, surfaces touch at the peaks of asperities and sliding occurs when the asperities fail by brittle fracture. The coefficient of friction, μ, was calculated from the strength of asperities of certain ideal shapes; for cone-shaped asperities, μ is about 0.1 and for wedge-shaped asperities, μ is about 0.15. For actual situations which seem close to the ideal model, observed μ was found to be very close to 0.1, even for materials such as quartz and calcite with widely differing strengths. If surface forces are present, the theory predicts that μ should decrease with load and that it should be higher in a vacuum than in air. In the presence of a fluid film between sliding surfaces, μ should depend on the area of the surfaces in contact. Both effects are observed. The character of wear particles produced during sliding and the way in which μ depends on normal load, roughness, and environment lend further support to the model of friction presented here. © 1967 The American Institute of Physics.

  11. Aviation security cargo inspection queuing simulation model for material flow and accountability

    SciTech Connect

    Olama, Mohammed M; Allgood, Glenn O; Rose, Terri A; Brumback, Daryl L

    2009-01-01

    Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we develop an aviation security cargo inspection queuing simulation model for material flow and accountability that will allow cargo managers to conduct impact studies of current and proposed business practices as they relate to inspection procedures, material flow, and accountability.

  12. Aviation security cargo inspection queuing simulation model for material flow and accountability

    NASA Astrophysics Data System (ADS)

    Allgood, Glenn O.; Olama, Mohammed M.; Rose, Terri A.; Brumback, Daryl

    2009-05-01

    Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we develop an aviation security cargo inspection queuing simulation model for material flow and accountability that will allow cargo managers to conduct impact studies of current and proposed business practices as they relate to inspection procedures, material flow, and accountability.

  13. Task-Based Language Teaching and Expansive Learning Theory

    ERIC Educational Resources Information Center

    Robertson, Margaret

    2014-01-01

    Task-Based Language Teaching (TBLT) has become increasingly recognized as an effective pedagogy, but its location in generalized sociocultural theories of learning has led to misunderstandings and criticism. The purpose of this article is to explain the congruence between TBLT and Expansive Learning Theory and the benefits of doing so. The merit…

  14. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  16. Unifying ecology and macroevolution with individual-based theory

    PubMed Central

    Rosindell, James; Harmon, Luke J; Etienne, Rampal S

    2015-01-01

    A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology. PMID:25818618

  17. Unifying ecology and macroevolution with individual-based theory.

    PubMed

    Rosindell, James; Harmon, Luke J; Etienne, Rampal S

    2015-05-01

    A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology. PMID:25818618

  18. The bases of effective field theories

    NASA Astrophysics Data System (ADS)

    Einhorn, Martin B.; Wudka, José

    2013-11-01

    With reference to the equivalence theorem, we discuss the selection of basis operators for effective field theories in general. The equivalence relation can be used to partition operators into equivalence classes, from which inequivalent basis operators are selected. These classes can also be identified as containing Potential-Tree-Generated (PTG) operators, Loop-Generated (LG) operators, or both, independently of the specific dynamics of the underlying extended models, so long as it is perturbatively decoupling. For an equivalence class containing both, we argue that the basis operator should be chosen from among the PTG operators, because they may have the largest coefficients. We apply this classification scheme to dimension-six operators in an illustrative Yukawa model as well as in the Standard Model (SM). We show that the basis chosen by Grzadkowski et al. [5] for the SM satisfies this criterion. In this light, we also revisit and verify our earlier result [6] that the dimension-six corrections to the triple-gauge-boson couplings only arise from LG operators, so the magnitude of the coefficients should only be a few parts per thousand of the SM gauge coupling if BSM dynamics respects decoupling. The same is true of the quartic-gauge-boson couplings.

  19. Modeling the emergency cardiac in-patient flow: an application of queuing theory.

    PubMed

    de Bruin, Arnoud M; van Rossum, A C; Visser, M C; Koole, G M

    2007-06-01

    This study investigates the bottlenecks in the emergency care chain of cardiac in-patient flow. The primary goal is to determine the optimal bed allocation over the care chain given a maximum number of refused admissions. Another objective is to provide deeper insight into the relation between natural variation in arrivals and length of stay (LOS) on the one hand and occupancy rates on the other. Hospital management's strong focus on raising occupancy rates is unrealistic and counterproductive; economies of scale cannot be neglected. An important result is that refused admissions at the First Cardiac Aid (FCA) are primarily caused by unavailability of beds downstream in the care chain. Both variability in LOS and fluctuations in arrivals result in large workload variations. Techniques from operations research were successfully used to describe the complexity and dynamics of emergency in-patient flow. PMID:17608054
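
    The relation between bed allocation, occupancy, and refused admissions is classically captured by the Erlang loss (M/M/c/c) model. A sketch under hypothetical admission rates and lengths of stay (not the study's data):

```python
def erlang_b(c, a):
    """Blocking (refused-admission) probability of an M/M/c/c loss system with
    c beds and offered load a = admission rate x mean length of stay."""
    b = 1.0
    for k in range(1, c + 1):     # numerically stable recursion for Erlang B
        b = a * b / (k + a * b)
    return b

def beds_needed(a, target):
    """Smallest number of beds keeping the refused-admission fraction below target."""
    c = 1
    while erlang_b(c, a) > target:
        c += 1
    return c

# Hypothetical: 2 admissions/day with a 5-day mean stay gives offered load a = 10.
print(beds_needed(10.0, 0.05), "beds for <5% refused admissions")
```

    The model also exhibits the economies of scale the abstract mentions: one pooled unit at a given utilization refuses fewer admissions than two half-sized units at the same utilization.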

  20. Congestion at Card and Book Catalogs--A Queuing Theory Approach.

    ERIC Educational Resources Information Center

    Bookstein, Abraham

    The question of whether a library's catalog should consist of cards arranged in a single alphabetical order (the "dictionary catalog") or be segregated as a separate file is discussed. Development is extended to encompass related problems involved in the creation of a book catalog. A model to study the effects of congestion at the catalog is…

  1. Research on the Optimization Method of Maintenance Support Unit Configuration with Queuing Theory

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Xu, Ying; Dong, Yue; Hou, Na; Yu, Yongli

    Beginning with the concept of a maintenance support unit, the maintenance support flow is analyzed to establish the relation between the mean time to repair damaged equipment and the number of maintenance support units. On that basis, a maintenance support unit configuration optimization model is formulated to minimize the cost of maintenance support resources, and its solution is given. The process is illustrated with an example.
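
    A minimal version of such a configuration optimization, choosing the number of units to minimize resource cost plus expected downtime cost in an M/M/c repair queue, might look like this (all rates and costs are hypothetical placeholders, not the paper's model):

```python
import math

def mmc_mean_in_system(c, lam, mu):
    """Mean number of damaged items in an M/M/c repair queue (waiting + in repair)."""
    a = lam / mu                                   # offered repair load
    if a >= c:
        return float("inf")                        # unstable: repairs can't keep up
    s = sum(a**k / math.factorial(k) for k in range(c))
    tail = a**c / math.factorial(c) * c / (c - a)
    p_wait = tail / (s + tail)                     # Erlang C waiting probability
    return p_wait * a / (c - a) + a                # Lq + mean number in repair

def optimal_units(lam, mu, unit_cost, downtime_cost, c_max=50):
    """Number of maintenance support units minimizing unit cost plus expected
    downtime cost per unit time."""
    return min(range(1, c_max + 1),
               key=lambda c: c * unit_cost
               + downtime_cost * mmc_mean_in_system(c, lam, mu))
```

    When units are expensive relative to downtime, the optimum sits just above the stability threshold; when downtime dominates, extra units pay for themselves by shortening the repair queue.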

  2. Theory construction based on standards of care: a proposed theory of the peaceful end of life.

    PubMed

    Ruland, C M; Moore, S M

    1998-01-01

    The contribution of developing a theory from this standard of care is that it can express a new unifying idea about the phenomenon of a peaceful end of life for terminally ill patients. It allows for generating and testing hypotheses that can provide new insights into the nature of this phenomenon and can contribute to increased knowledge about nursing interventions that help patients toward a peaceful end of life. The process of theory development from standards of care as described in this article can also be applied to other phenomena. Clinical practice abounds with opportunities for theory development, yet nurses often do not use theories to guide their practice. Until now, little guidance has been provided to tap the richness of clinical knowledge for the development of middle-range theories. Although the method described in this article may be further refined, it offers a promising approach for developing theories that are applicable to practice and move beyond the scope of grand theories. Thus, deriving theories from standards of care can offer an important contribution to the development of the discipline's scientific knowledge base and to enhanced practice. PMID:9739534

  3. Complexity measurement based on information theory and kolmogorov complexity.

    PubMed

    Lui, Leong Ting; Terrazas, Germán; Zenil, Hector; Alexander, Cameron; Krasnogor, Natalio

    2015-01-01

    In the past decades many definitions of complexity have been proposed. Most of these definitions are based either on Shannon's information theory or on Kolmogorov complexity; these two are often compared, but very few studies integrate the two ideas. In this article we introduce a new measure of complexity that builds on both of these theories. As a demonstration of the concept, the technique is applied to elementary cellular automata and simulations of the self-organization of porphyrin molecules. PMID:25622014

  4. Improved virtual queuing and dynamic EPD techniques for TCP over ATM

    SciTech Connect

    Wu, Y.; Siu, K.Y.; Ren, W.

    1998-11-01

    It is known that TCP throughput can degrade significantly over UBR service in a congested ATM network, and the early packet discard (EPD) technique has been proposed to improve the performance. However, recent studies show that EPD cannot ensure fairness among competing VCs in a congested network, but the degree of fairness can be improved using various forms of fair buffer allocation techniques. The authors propose an improved scheme that utilizes only a single shared FIFO queue for all VCs and admits simple implementation for high speed ATM networks. The scheme achieves nearly perfect fairness and throughput among multiple TCP connections, comparable to the expensive per-VC queuing technique. Analytical and simulation results are presented to show the validity of this new scheme and significant improvement in performance as compared with existing fair buffer allocation techniques for TCP over ATM.

  5. [Application of a calling and queuing system in blood sampling in the clinical laboratory].

    PubMed

    Yang, Da-Gan; Guo, Xi-Chao; Xu, Gen-Yun; Chen, Yu

    2008-03-01

    This paper introduces the application of a calling and queuing system for blood sample collection in a large hospital in China. Besides the basic functions, it offers the following features. (a) A real-name system: the patient gets a queue number according to the laboratory application form, preventing bought numbers and empty numbers. (b) Two-stage waiting: the patient waits first in the main hall and then at the blood sampling window, improving work efficiency. (c) The flowchart for outpatient blood testing is: getting the number --> waiting --> blood sampling --> getting the test report. This system not only optimizes the workflow but also improves the clinical environment. It shortens the patient's waiting time and raises laboratory quality as well. PMID:18581883

  6. Modeling Air Traffic Management Technologies with a Queuing Network Model of the National Airspace System

    NASA Technical Reports Server (NTRS)

    Long, Dou; Lee, David; Johnson, Jesse; Gaier, Eric; Kostiuk, Peter

    1999-01-01

    This report describes an integrated model of air traffic management (ATM) tools under development in two National Aeronautics and Space Administration (NASA) programs: Terminal Area Productivity (TAP) and Advanced Air Transport Technologies (AATT). The model is made by adjusting parameters of LMINET, a queuing network model of the National Airspace System (NAS), which the Logistics Management Institute (LMI) developed for NASA. Operating LMINET with models of various combinations of TAP and AATT tools will give quantitative information about the effects of the tools on operations of the NAS. The costs of delays under different scenarios are calculated. An extension of the Air Carrier Investment Model (ACIM), developed by the Institute for NASA under ASAC, maps the technologies' impacts on NAS operations into cross-comparable benefit estimates for individual technologies and sets of technologies.

  7. Project-Based Language Learning: An Activity Theory Analysis

    ERIC Educational Resources Information Center

    Gibbes, Marina; Carson, Lorna

    2014-01-01

    This paper reports on an investigation of project-based language learning (PBLL) in a university language programme. Learner reflections of project work were analysed through Activity Theory, where tool-mediated activity is understood as the central unit of analysis for human interaction. Data were categorised according to the components of human…

  8. Theory-Based Considerations Influence the Interpretation of Generic Sentences

    ERIC Educational Resources Information Center

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

  10. A Natural Teaching Method Based on Learning Theory.

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    1991-01-01

    The natural teaching method is active and student-centered, based on schema and constructivist theories, and informed by research in neuroplasticity. A schema is a mental picture or understanding of something we have learned. Humans can have knowledge only to the degree to which they have constructed schemas from learning experiences and practice.…

  11. A Memory-Based Theory of Verbal Cognition

    ERIC Educational Resources Information Center

    Dennis, Simon

    2005-01-01

    The syntagmatic paradigmatic model is a distributed, memory-based account of verbal processing. Built on a Bayesian interpretation of string edit theory, it characterizes the control of verbal cognition as the retrieval of sets of syntagmatic and paradigmatic constraints from sequential and relational long-term memory and the resolution of these…

  12. A Model of Statistics Performance Based on Achievement Goal Theory.

    ERIC Educational Resources Information Center

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  13. Toward an Instructionally Oriented Theory of Example-Based Learning

    ERIC Educational Resources Information Center

    Renkl, Alexander

    2014-01-01

    Learning from examples is a very effective means of initial cognitive skill acquisition. There is an enormous body of research on the specifics of this learning method. This article presents an instructionally oriented theory of example-based learning that integrates theoretical assumptions and findings from three research areas: learning from…

  14. Nano-resonator frequency response based on strain gradient theory

    NASA Astrophysics Data System (ADS)

    Maani Miandoab, Ehsan; Yousefi-Koma, Aghil; Nejat Pishkenari, Hossein; Fathi, Mohammad

    2014-09-01

    This paper aims to explore the dynamic behaviour of a nano-resonator under ac and dc excitation using strain gradient theory. To achieve this goal, the partial differential equation of nano-beam vibration is first converted to an ordinary differential equation by the Galerkin projection method and the lumped model is derived. Lumped parameters of the nano-resonator, such as linear and nonlinear springs and damper coefficients, are compared with those of classical theory and it is demonstrated that beams with smaller thickness display greater deviation from classical parameters. Stable and unstable equilibrium points based on classic and non-classical theories are also compared. The results show that, regarding the applied dc voltage, the dynamic behaviours expected by classical and non-classical theories are significantly different, such that one theory predicts the un-deformed shape as the stable condition, while the other theory predicts that the beam will experience bi-stability. To obtain the frequency response of the nano-resonator, a general equation including cubic and quadratic nonlinearities in addition to parametric electrostatic excitation terms is derived, and the analytical solution is determined using a second-order multiple scales method. Based on frequency response analysis, the softening and hardening effects given by two theories are investigated and compared, and it is observed that neglecting the size effect can lead to two completely different predictions in the dynamic behaviour of the resonators. The findings of this article can be helpful in the design and characterization of the size-dependent dynamic behaviour of resonators on small scales.

  15. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  16. Innovating Method of Existing Mechanical Product Based on TRIZ Theory

    NASA Astrophysics Data System (ADS)

    Zhao, Cunyou; Shi, Dongyan; Wu, Han

    Adaptive design and variant design based on existing products are the main routes of product development. In this paper, a conceptual design framework and its process flow model for product innovation are put forward by combining conceptual design methods with TRIZ theory. A process system model of innovative design is constructed, covering requirement analysis, total-function analysis and decomposition, engineering problem analysis, solution finding, and preliminary design; this establishes the basis for the innovative redesign of existing products.

  17. Evidence for an expectancy-based theory of avoidance behaviour.

    PubMed

    Declercq, Mieke; De Houwer, Jan; Baeyens, Frank

    2008-01-01

    In most studies on avoidance learning, participants receive an aversive unconditioned stimulus after a warning signal is presented, unless the participant performs a particular response. Lovibond (2006) recently proposed a cognitive theory of avoidance learning, according to which avoidance behaviour is a function of both Pavlovian and instrumental conditioning. In line with this theory, we found that avoidance behaviour was based on an integration of acquired knowledge about, on the one hand, the relation between stimuli and, on the other hand, the relation between behaviour and stimuli. PMID:18609382

  18. Generalizations of Gravitational Theory Based on Group Covariance

    NASA Astrophysics Data System (ADS)

    Halpern, Leopold

    1982-10-01

    The mathematical structure, the field equations, and fundamentals of the kinematics of generalizations of general relativity based on semisimple invariance groups are presented. The structure is that of a generalized Kaluza-Klein theory with a subgroup as the gauge group. The group manifold with its Cartan-Killing metric forms the source-free solution. The gauge fields do not vanish even in this case and give rise to additional modes of free motion. The case of the de Sitter groups is presented as an example where the gauge field is tentatively assumed to mediate a spin interaction and give rise to spin motion. Generalization to the conformal group and a theory yielding features of Dirac's large-number hypothesis are discussed. The possibility of further generalizations to include fermions is pointed out. The Kaluza-Klein theory is formulated in terms of principal fibre bundles, which need not be trivial.

  19. Two-scale mechanism-based theory of nonlinear viscoelasticity

    NASA Astrophysics Data System (ADS)

    Tang, Shan; Steven Greene, M.; Liu, Wing Kam

    2012-02-01

    The paper presents a mechanism-based two-scale theory for a generalized nonlinear viscoelastic continuum. The continuum is labeled as generalized since it contains extra degrees of freedom typical of past high-order continuum theories, though a new formulation is presented here tailored to meet the needs of the physical description of the viscoelastic solid. The microstress that appears in the equations, often criticized for a lack of physical meaning, is assigned in this work to viscous free chains superimposed on a nonlinear elastic backbone composed of crosslinks and reinforcement. Mathematically, hyperelasticity is used to describe the equilibrium backbone (macroscale), and an improvement of tube models for reptation dynamics describes the free chain motion at the microscale. Inhomogeneous deformation is described by inclusion of a microstrain gradient into the formulation. Thus, the theory is nicely suited for materials with microstructure where localization of strains and inhomogeneous deformation occur in addition to viscoelastic damping mechanisms due to free chains. Besides the microstress, physical meaning of the additional boundary conditions arising in the general theory is also presented. Since the proposed material model is mechanism-based, macroscopic performances are functions of microstructural variables describing the polymer chemistry so that parametric material design concepts may be gleaned from the model. Several physical phenomena are captured through numerical simulation of the class of materials of interest: size effects, strain localization, and the fracture process. Results agree qualitatively with both experimental data and direct numerical simulation for filled elastomeric solids.

  20. Ensemble method: Community detection based on game theory

    NASA Astrophysics Data System (ADS)

    Zhang, Xia; Xia, Zhengyou; Xu, Shengwu; Wang, J. D.

    2014-08-01

    Timely and cost-effective analytics over social networks has emerged as a key ingredient for success in many business and government endeavors. Community detection is an active research area of relevance to the analysis of online social networks. The problem of selecting a particular community detection algorithm is crucial if the aim is to unveil the community structure of a network: the choice of methodology can affect the outcome of the experiments, because different algorithms have different advantages and depend on tuning specific parameters. In this paper, we propose a community division model based on game theory, which combines the advantages of previous algorithms to obtain a better community classification result. Experiments on standard datasets verify that the proposed game-theory-based community detection model is valid and performs better.

  1. Infrared small target detection based on Danger Theory

    NASA Astrophysics Data System (ADS)

    Lan, Jinhui; Yang, Xiao

    2009-11-01

    To solve the problem that traditional methods cannot detect small objects whose local SNR is less than 2 in IR images, a Danger Theory-based model for detecting infrared small targets is presented in this paper. First, by analogy with immunology, definitions are given for terms such as danger signal, antigen, APC, and antibody, and the matching rule between antigen and antibody is improved. Before the detection model is trained and targets are detected, the IR images are processed with an adaptive smoothing filter to decrease stochastic noise. During training, deletion, generation, crossover, and mutation rules are established after a large number of experiments in order to achieve rapid convergence and obtain good antibodies. The Danger Theory-based model built by this training process can detect targets whose local SNR is only 1.5.

  2. Research on Capturing of Customer Requirements Based on Innovation Theory

    NASA Astrophysics Data System (ADS)

    Junwu, Ding; Dongtao, Yang; Zhenqiang, Bao

    To capture customer requirements information exactly and effectively, a new customer-requirements capture modelling method is proposed. Based on analysis of the function requirement models of previous products and application of the technology-system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements can be evolved from existing product designs by modifying the functional requirement units and confirming the direction of evolutionary design. Finally, a case study is provided to illustrate the feasibility of the proposed approach.

  3. Symmetries and Conservation Laws in Histories-Based Theories

    NASA Astrophysics Data System (ADS)

    Dass, Tulsi; Joglekar, Yogesh N.

    2001-02-01

    Symmetries are defined in histories-based theories, paying special attention to the class of history theories admitting quasi-temporal structure (a generalization of the concept of "temporal sequences" of "events" using partial semigroups) and logic structure for "single-time histories." Symmetries are classified into orthochronous (those preserving the "temporal order" of events) and nonorthochronous. A straightforward criterion for the physical equivalence of histories is formulated in terms of orthochronous symmetries; this criterion covers various notions of physical equivalence of histories considered by Gell-Mann and Hartle (1990, in "Complexity, Entropy, and the Physics of Information" (W. Zurek, Ed.), SFI Studies in the Science of Complexity, Vol. 8, p. 425, Addison-Wesley, Reading, MA) as special cases. In familiar situations, a reciprocal relationship between traditional symmetries (Wigner symmetries in quantum mechanics and Borel-measurable transformations of phase space in classical mechanics) and symmetries defined in this work is established. In a restricted class of theories, definition of a conservation law is given in the history language which agrees with the standard ones in familiar situations; in a smaller subclass of theories, a Noether-type theorem (implying a connection between continuous symmetries of dynamics and conservation laws) is proved. The formalism evolved is applied to histories (of particles, fields, or more general objects) in general curved spacetimes. Sharpening the definition of symmetry so as to include a continuity requirement, it is shown that a symmetry in our formalism implies a conformal isometry of the spacetime metric.

  4. What Communication Theories Can Teach the Designer of Computer-Based Training.

    ERIC Educational Resources Information Center

    Larsen, Ronald E.

    1985-01-01

    Reviews characteristics of computer-based training (CBT) that make application of communication theories appropriate and presents principles from communication theory (e.g., general systems theory, symbolic interactionism, rule theories, and interpersonal communication theories) to illustrate how CBT developers can profitably apply them to…

  5. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.
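
    The computational point of the abstract is that a control-theoretic (adjoint) formulation yields gradient information at roughly the cost of one extra solve, independent of the number of design variables. This can be sketched on a toy problem: the 2x2 linear system below is a hypothetical stand-in for the discretized flow equations, with an invented cost function and right-hand side.

```python
def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    (a11, a12), (a21, a22) = A
    det = a11 * a22 - a12 * a21
    return ((b[0] * a22 - a12 * b[1]) / det,
            (a11 * b[1] - b[0] * a21) / det)

A = ((4.0, 1.0), (1.0, 3.0))   # stand-in for the discretized flow operator (symmetric)

def f(alpha):                  # design-dependent right-hand side (illustrative)
    return (alpha, alpha ** 2)

def df_dalpha(alpha):
    return (1.0, 2.0 * alpha)

def J(alpha):
    """Cost J = 0.5 * ||u||^2 subject to the 'state equation' A u = f(alpha)."""
    u = solve2(A, f(alpha))
    return 0.5 * (u[0] ** 2 + u[1] ** 2)

def grad_adjoint(alpha):
    """dJ/dalpha = lambda . df/dalpha, with A^T lambda = dJ/du (= u here).

    One state solve plus one adjoint solve, regardless of how many design
    variables alpha has -- the economy the adjoint approach provides."""
    u = solve2(A, f(alpha))    # "flow" solve (A is symmetric, so A^T = A)
    lam = solve2(A, u)         # adjoint solve
    df = df_dalpha(alpha)
    return lam[0] * df[0] + lam[1] * df[1]
```

    Comparing `grad_adjoint` with a finite-difference quotient of `J` verifies the gradient; in the paper's setting the same gradient feeds a standard numerical optimizer.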

  6. A Kendama Learning Robot Based on Bi-directional Theory.

    PubMed

    Kawato, Mitsuo; Wada, Yasuhiro; Nakano, Eri; Osu, Rieko; Koike, Yasuharu; Gomi, Hiroaki; Gandolfo, Francesca; Schaal, Stefan; Miyamoto, Hiroyuki

    1996-11-01

    A general theory of movement-pattern perception based on bi-directional theory for sensory-motor integration can be used for motion capture and learning by watching in robotics. We demonstrate our methods using the game of Kendama, executed by the SARCOS Dextrous Slave Arm, which has a very similar kinematic structure to the human arm. Three ingredients have to be integrated for the successful execution of this task. The ingredients are (1) to extract via-points from a human movement trajectory using a forward-inverse relaxation model, (2) to treat via-points as a control variable while reconstructing the desired trajectory from all the via-points, and (3) to modify the via-points for successful execution. In order to test the validity of the via-point representation, we utilized a numerical model of the SARCOS arm, and examined the behavior of the system under several conditions. Copyright 1996 Elsevier Science Ltd. PMID:12662536

  7. Quantum game theory based on the Schmidt decomposition

    NASA Astrophysics Data System (ADS)

    Ichikawa, Tsubasa; Tsutsui, Izumi; Cheon, Taksu

    2008-04-01

    We present a novel formulation of quantum game theory based on the Schmidt decomposition, which has the merit that the entanglement of quantum strategies is manifestly quantified. We apply this formulation to 2-player, 2-strategy symmetric games and obtain a complete set of quantum Nash equilibria. Apart from those available with the maximal entanglement, these quantum Nash equilibria are extensions of the Nash equilibria in classical game theory. The phase structure of the equilibria is determined for all values of entanglement, and thereby the possibility of resolving the dilemmas by entanglement in the game of Chicken, the Battle of the Sexes, the Prisoners' Dilemma, and the Stag Hunt, is examined. We find that entanglement transforms these dilemmas into one another but cannot resolve them, except in the Stag Hunt game where the dilemma can be alleviated to a certain degree.
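
    For readers unfamiliar with the decomposition underlying this formulation: for a two-qubit pure state, the Schmidt coefficients are the singular values of the 2x2 coefficient matrix, and they quantify entanglement. A minimal sketch (real amplitudes only, with the 2x2 singular values computed in closed form):

```python
import math

def schmidt_coefficients(c):
    """Schmidt coefficients of a two-qubit pure state
    |psi> = sum_ij c[i][j] |i>|j>  (real amplitudes, assumed normalised).

    They are the singular values of the 2x2 coefficient matrix, computed here
    from the eigenvalues of M^T M."""
    (a, b), (d, e) = c
    p = a * a + d * d              # (M^T M)[0][0]
    r = b * b + e * e              # (M^T M)[1][1]
    q = a * b + d * e              # off-diagonal element
    m = 0.5 * (p + r)
    h = math.sqrt((0.5 * (p - r)) ** 2 + q ** 2)
    return (math.sqrt(max(m + h, 0.0)), math.sqrt(max(m - h, 0.0)))

def entanglement_entropy(c):
    """von Neumann entropy of either reduced state, in bits:
    0 for product states, 1 for maximally entangled two-qubit states."""
    s = 0.0
    for lam in schmidt_coefficients(c):
        p = lam * lam
        if p > 1e-12:
            s -= p * math.log2(p)
    return s
```

    A Bell state such as (|00> + |11>)/sqrt(2) gives equal Schmidt coefficients and entropy 1; a product state gives a single nonzero coefficient and entropy 0.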

  8. A model of resurgence based on behavioral momentum theory.

    PubMed

    Shahan, Timothy A; Sweeney, Mary M

    2011-01-01

    Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforcement during extinction of a target response acts as both an additional source of disruption during extinction and as a source of reinforcement in the context that increases the future strength of the target response. The model does a good job accounting for existing data in the resurgence literature and makes novel and testable predictions. Thus, the model appears to provide a framework for understanding resurgence and serves to integrate the phenomenon into the existing theoretical account of persistence provided by behavioral momentum theory. In addition, we discuss some potential implications of the model for further development of behavioral momentum theory. PMID:21541118
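
    The quantitative model has the general shape of the behavioral-momentum extinction equation with an added disruption term for alternative reinforcement. The sketch below reproduces only that general shape; the parameter names, values, and exact scaling are illustrative, not taken from the paper.

```python
def proportion_of_baseline(t, r, c=1.0, d=0.001, k=0.5, b=0.5, R_alt=0.0):
    """Schematic behavioral-momentum extinction curve (illustrative form):

        B_t / B_0 = 10 ** (-t * (c + d*r + k*R_alt) / r**b)

    r       : baseline reinforcement rate for the target response
    c, d*r  : disruption from extinction and generalization decrement
    k*R_alt : added disruption from alternative reinforcement, which also
              disappears when alternative reinforcement is discontinued."""
    return 10.0 ** (-t * (c + d * r + k * R_alt) / r ** b)

def resurgence_demo(r=60.0, R_alt=20.0, phase2=5, phase3=5):
    """Phase 2: extinction plus alternative reinforcement; phase 3: both off.
    Returns the simulated proportion-of-baseline series across both phases."""
    series = [proportion_of_baseline(t, r, R_alt=R_alt)
              for t in range(1, phase2 + 1)]
    # In phase 3 the alternative source of disruption is removed, so the
    # disruption term shrinks and responding partially recovers: resurgence.
    series += [proportion_of_baseline(t, r, R_alt=0.0)
               for t in range(phase2 + 1, phase2 + phase3 + 1)]
    return series
```

    The jump in responding at the start of phase 3, despite continued extinction, is the resurgence effect the model is built to capture.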

  9. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. Principal components analysis is used to screen the index system, and a hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the totals. The forewarning level is judged by the maximum-probability rule, and local conditions are then considered when proposing management strategies intended to downgrade heavy warnings. Taking Taihu Basin as an example, the model was applied and verified for water pollution risk from 2000 to 2009 against both actual and simulated data; the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. The model is rigorous in theory yet flexible in method, and its simple structure yields reasonable results; it has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk. PMID:24194413
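
    The core posterior step, combining a prior over warning levels with likelihoods from the simulated indexes and applying the maximum-probability rule, can be sketched as follows; the levels and all numbers are illustrative.

```python
def posterior(prior, likelihood):
    """Bayes' rule over discrete forewarning levels.

    prior      : {level: P(level)}
    likelihood : {level: P(observed indicator values | level)}
    Returns the normalised posterior {level: P(level | data)}."""
    joint = {lvl: prior[lvl] * likelihood[lvl] for lvl in prior}
    z = sum(joint.values())
    return {lvl: p / z for lvl, p in joint.items()}

# Illustrative numbers only: three warning levels for a basin.
prior = {"light": 0.5, "moderate": 0.3, "severe": 0.2}
likelihood = {"light": 0.05, "moderate": 0.2, "severe": 0.7}
post = posterior(prior, likelihood)
warning = max(post, key=post.get)      # maximum-probability rule
```

    Here the data overwhelm a prior that favoured a light warning, and the maximum-probability rule selects the severe level.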

  10. A new lightning return stroke model based on antenna theory

    NASA Astrophysics Data System (ADS)

    Moini, Rouzbeh; Kordi, Behzad; Rafi, Gholamreza Z.; Rakov, Vladimir A.

    A new approach based on antenna theory is presented to describe the lightning return-stroke process. The lightning channel is approximated by a straight and vertical monopole antenna with distributed resistance (a so-called lossy antenna) above a perfectly conducting ground. The antenna is fed at its lower end by a voltage source such that the antenna input current, which represents the lightning return-stroke current at the lightning channel base, can be specified. An electric field integral equation (EFIE) in the time domain is employed to describe the electromagnetic behavior of this lossy monopole antenna. The numerical solution of EFIE by the method of moments (MOM) provides the time-space distribution of the current and line charge density along the antenna. This new antenna-theory (or electromagnetic) model with specified current at the channel base requires only two adjustable parameters: the return-stroke propagation speed for a nonresistive channel and the channel resistance per unit length, each assumed to be constant (independent of time and height). The new model is compared to four of the most commonly used "engineering" return-stroke models in terms of the temporal-spatial distribution of channel current, the line charge density distribution, and the predicted electromagnetic fields at different distances. A reasonably good agreement is found with the modified transmission line model with linear current decay with height (MTLL) and with the Diendorfer-Uman (DU) model.

  11. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to optimize their levees independently with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes expected annual damage cost and annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.
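
    The gap between the independent (Nash) outcome and the social planner's optimum can be illustrated with a toy version of the two-bank game. The cost function and the risk-transfer factor below are hypothetical stand-ins for the paper's risk-based analysis, not its actual model.

```python
import math
from itertools import product

HEIGHTS = [i * 0.25 for i in range(21)]        # candidate levee heights, 0..5 m

def owner_cost(h_own, h_other, c=1.0, D=20.0):
    """Illustrative annual cost for one bank: annualised construction cost
    plus expected damage. Raising the opposite levee transfers flood risk
    across the river (the externality), modelled by the (1 + 0.5*h_other)
    factor; all constants are invented."""
    return c * h_own + D * (1.0 + 0.5 * h_other) * math.exp(-h_own)

def best_response(h_other):
    return min(HEIGHTS, key=lambda h: owner_cost(h, h_other))

def nash_equilibrium():
    """Iterated best response between the two self-interested owners."""
    h1 = h2 = 0.0
    for _ in range(100):
        n1, n2 = best_response(h2), best_response(h1)
        if (n1, n2) == (h1, h2):
            break
        h1, h2 = n1, n2
    return h1, h2

def social_optimum():
    """Joint minimisation of total system cost by the social planner."""
    return min(product(HEIGHTS, HEIGHTS),
               key=lambda p: owner_cost(p[0], p[1]) + owner_cost(p[1], p[0]))
```

    By construction the planner's joint minimum never costs more in total than the Nash outcome; the difference is the inefficiency that cooperative game theory, with side payments, is meant to recover.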

  12. Resource based view: a promising new theory for healthcare organizations

    PubMed Central

    Ferlie, Ewan

    2014-01-01

    This commentary reviews a recent piece by Burton and Rycroft-Malone on the use of the Resource Based View (RBV) in healthcare organizations. It first outlines the core content of their piece, then discusses their attempt to extend RBV to the analysis of large-scale quality improvement efforts in healthcare, and elaborates some critique. The broader question of why RBV seems to be migrating into healthcare management research is also considered. The commentary concludes that RBV is a promising new theory for healthcare organizations. PMID:25396211

  13. Identifying influential nodes in weighted networks based on evidence theory

    NASA Astrophysics Data System (ADS)

    Wei, Daijun; Deng, Xinyang; Zhang, Xiaoge; Deng, Yong; Mahadevan, Sankaran

    2013-05-01

    The design of an effective ranking method to identify influential nodes is an important problem in the study of complex networks. In this paper, a new centrality measure is proposed based on the Dempster-Shafer evidence theory. The proposed measure trades off between the degree and strength of every node in a weighted network. The influences of both the degree and the strength of each node are represented by basic probability assignment (BPA). The proposed centrality measure is determined by the combination of these BPAs. Numerical examples are used to illustrate the effectiveness of the proposed method.
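
    The combination step can be sketched with Dempster's rule over a two-element frame of discernment: one BPA derived from a node's degree and one from its strength, fused into a single measure. The BPA numbers here are illustrative, not derived from any network.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for BPAs over a frame of discernment.
    Focal elements are frozensets; mass assigned to the empty intersection
    (conflict) is redistributed by normalisation."""
    combined, conflict = {}, 0.0
    for a, p in m1.items():
        for b, q in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + p * q
            else:
                conflict += p * q
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

H, L = frozenset({"high"}), frozenset({"low"})
HL = H | L                                 # ignorance: "high or low"
# Illustrative BPAs: evidence about one node's influence from its degree
# and from its strength.
m_degree = {H: 0.6, L: 0.1, HL: 0.3}
m_strength = {H: 0.7, L: 0.2, HL: 0.1}
m = dempster_combine(m_degree, m_strength)
```

    Both sources lean toward "high", so the combined mass on `H` exceeds either input while the mass left on ignorance shrinks.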

  14. Transportation Optimization with Fuzzy Trapezoidal Numbers Based on Possibility Theory

    PubMed Central

    He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

    2014-01-01

    In this paper, a parametric method is introduced to solve fuzzy transportation problem. Considering that parameters of transportation problem have uncertainties, this paper develops a generalized fuzzy transportation problem with fuzzy supply, demand and cost. For simplicity, these parameters are assumed to be fuzzy trapezoidal numbers. Based on possibility theory and consistent with decision-makers' subjectiveness and practical requirements, the fuzzy transportation problem is transformed to a crisp linear transportation problem by defuzzifying fuzzy constraints and objectives with application of fractile and modality approach. Finally, a numerical example is provided to exemplify the application of fuzzy transportation programming and to verify the validity of the proposed methods. PMID:25137239
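
    The transform-then-solve idea can be sketched on a balanced 2x2 instance, which is solvable in closed form. Note the hedge: the paper defuzzifies with fractile and modality approaches, whereas the sketch uses a simple average-of-vertices rule for illustration.

```python
def defuzzify(tr):
    """Average-of-vertices defuzzification of a trapezoidal fuzzy number
    (a, b, c, d). Illustrative only: the paper's fractile/modality-based
    transforms are more elaborate."""
    a, b, c, d = tr
    return (a + b + c + d) / 4.0

def solve_2x2_transport(supply, demand, cost):
    """Exact solution of a balanced 2x2 transportation problem. The flow
    x11 = t is the single free variable, the objective is linear in t, so
    the optimum lies at an endpoint of its feasible interval."""
    s1, s2 = supply
    d1, d2 = demand
    assert abs((s1 + s2) - (d1 + d2)) < 1e-9, "problem must be balanced"
    lo, hi = max(0.0, s1 - d2), min(s1, d1)

    def total(t):
        x = ((t, s1 - t), (d1 - t, d2 - (s1 - t)))
        return sum(cost[i][j] * x[i][j] for i in range(2) for j in range(2)), x

    return min((total(lo), total(hi)), key=lambda r: r[0])[1]

# Illustrative fuzzy trapezoidal unit costs for a 2-source, 2-sink problem.
fuzzy_cost = (((1, 2, 3, 4), (4, 5, 6, 7)),
              ((3, 4, 5, 6), (1, 2, 3, 4)))
cost = tuple(tuple(defuzzify(c) for c in row) for row in fuzzy_cost)
plan = solve_2x2_transport((10.0, 10.0), (10.0, 10.0), cost)
```

    With the defuzzified costs the cheap diagonal routes win, so the optimal plan ships each supply entirely to its matching sink.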

  15. Enhancement of infrared image based on the Retinex theory.

    PubMed

    Li, Ying; Hou, Changzhi; Tian, Fu; Yu, Hongli; Guo, Lei; Xu, Guizhi; Shen, Xueqin; Yan, Weili

    2007-01-01

    The infrared imaging technique can be used to image the temperature distribution of the body, and it shows promise for the diagnosis and prediction of many diseases. Image processing is necessary to enhance the original infrared images because of blurring. In this paper, image enhancement based on the Retinex theory is studied. Algorithms such as the Frankle-McCann algorithm, the McCann99 algorithm, single-scale Retinex, and multi-scale Retinex are applied to the enhancement of gray-scale infrared images, and acceptable results are obtained and compared. PMID:18002705
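
    A single-scale Retinex sketch in pure Python, assuming grayscale images represented as nested lists: the output is log(I) minus the log of a Gaussian-blurred copy, which divides out smooth illumination (or, here, smooth temperature gradients) while preserving local detail. Parameters are illustrative.

```python
import math

def gaussian_kernel(sigma, radius=None):
    radius = radius or max(1, int(3 * sigma))
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]               # normalised 1-D kernel

def blur(img, sigma):
    """Separable Gaussian blur with edge clamping (pure Python, small images)."""
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    h, w = len(img), len(img[0])
    clamp = lambda v, n: min(max(v, 0), n - 1)
    tmp = [[sum(k[j + r] * img[y][clamp(x + j, w)] for j in range(-r, r + 1))
            for x in range(w)] for y in range(h)]       # horizontal pass
    return [[sum(k[j + r] * tmp[clamp(y + j, h)][x] for j in range(-r, r + 1))
             for x in range(w)] for y in range(h)]      # vertical pass

def single_scale_retinex(img, sigma=2.0, eps=1.0):
    """R(x, y) = log(I + eps) - log(G_sigma * I + eps)."""
    smooth = blur(img, sigma)
    return [[math.log(img[y][x] + eps) - math.log(smooth[y][x] + eps)
             for x in range(len(img[0]))] for y in range(len(img))]
```

    On a uniform image the blurred copy equals the input, so the Retinex output is zero everywhere; nonzero values appear only where intensity departs from its smoothed surroundings.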

  16. Experimental Energy Consumption of Frame Slotted ALOHA and Distributed Queuing for Data Collection Scenarios

    PubMed Central

    Tuset-Peiro, Pere; Vazquez-Gallego, Francisco; Alonso-Zarate, Jesus; Alonso, Luis; Vilajosana, Xavier

    2014-01-01

    Data collection is a key scenario for the Internet of Things because it enables gathering sensor data from distributed nodes that use low-power and long-range wireless technologies to communicate in a single-hop approach. In this kind of scenario, the network is composed of one coordinator that covers a particular area and a large number of nodes, typically hundreds or thousands, that transmit data to the coordinator upon request. Considering this scenario, in this paper we experimentally validate the energy consumption of two Medium Access Control (MAC) protocols, Frame Slotted ALOHA (FSA) and Distributed Queuing (DQ). We model both protocols as state machines and conduct experiments to measure the average energy consumption in each state and the average number of times that a node has to be in each state in order to transmit a data packet to the coordinator. The results show that FSA is more energy efficient than DQ if the number of nodes is known a priori, because the number of slots per frame can be adjusted accordingly. However, in such scenarios the number of nodes cannot be easily anticipated, leading to additional packet collisions and higher energy consumption due to retransmissions. By contrast, DQ does not require the number of nodes to be known in advance because it is able to efficiently construct an ad hoc network schedule for each collection round. Such a schedule ensures that there are no packet collisions during data transmission, leading to an energy consumption reduction above 10% compared to FSA. PMID:25061839
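
    The evaluation method, measuring a per-visit energy for each MAC state and weighting it by the average number of visits needed to deliver a packet, can be sketched as follows. All energy and visit figures below are invented for illustration; they are not the paper's measurements.

```python
def packet_energy(state_energy, visits):
    """Average energy to deliver one packet: E = sum_i n_i * e_i, where e_i
    is the per-visit energy of MAC state i and n_i the average number of
    visits to that state."""
    return sum(state_energy[s] * visits.get(s, 0.0) for s in state_energy)

# Hypothetical per-visit energies (millijoules) for common MAC states.
e = {"sleep": 0.01, "listen": 0.4, "tx_data": 1.2, "rx_ack": 0.3}

# FSA with a mis-sized frame: collisions force retransmissions, so the
# transmit and acknowledgement states are visited ~1.6 times on average.
fsa = packet_energy(e, {"sleep": 10, "listen": 3.0, "tx_data": 1.6, "rx_ack": 1.6})

# DQ: collision-free data slots (one transmission), at the price of some
# extra listening for the tree-splitting contention resolution.
dq = packet_energy(e, {"sleep": 10, "listen": 3.5, "tx_data": 1.0, "rx_ack": 1.0})
```

    With these illustrative figures DQ comes out well over 10% cheaper per packet, qualitatively matching the paper's conclusion that avoiding collisions outweighs the added contention-resolution listening.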

  17. Design 2000: Theory-Based Design Models of the Future.

    ERIC Educational Resources Information Center

    Richey, Rita C.

    The influence of theory on instructional-design models of the future is explored on the basis of the theoretical developments of today. Anticipated model changes are expected to result from disparate theoretical thinking in areas such as chaos theory, constructivism, situated learning, cognitive-learning theory, and general systems theory…

  18. Risk Matrix Integrating Risk Attitudes Based on Utility Theory.

    PubMed

    Ruan, Xin; Yin, Zhiyi; Frangopol, Dan M

    2015-08-01

    Recent studies indicate that absence of the consideration of risk attitudes of decisionmakers in the risk matrix establishment process has become a major limitation. In order to evaluate risk in a more comprehensive manner, an approach to establish risk matrices that integrates risk attitudes based on utility theory is proposed. There are three main steps within this approach: (1) describing risk attitudes of decisionmakers by utility functions, (2) bridging the gap between utility functions and the risk matrix by utility indifference curves, and (3) discretizing utility indifference curves. A complete risk matrix establishment process based on practical investigations is introduced. This process utilizes decisionmakers' answers to questionnaires to formulate required boundary values for risk matrix establishment and utility functions that effectively quantify their respective risk attitudes. PMID:25958890
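
    The three steps, describing risk attitude with a utility function, drawing utility indifference curves, and discretizing them into ratings, can be sketched with an exponential utility. The functional form, thresholds, and all numbers below are illustrative assumptions, not the paper's elicited values.

```python
import math

def expected_disutility(p, c, rho):
    """Expected disutility of an event with probability p and loss c, using
    an exponential (constant-risk-aversion) utility. rho encodes the
    decision-maker's risk attitude: smaller rho means more risk-averse.
    Curves p * u(c) = const are the utility indifference curves."""
    return p * (math.exp(c / rho) - 1.0)

def risk_matrix(probs, losses, rho, thresholds):
    """Discretize the indifference curves: each (p, c) cell is rated by how
    many thresholds its expected disutility exceeds (0 .. len(thresholds))."""
    rate = lambda v: sum(v > t for t in thresholds)
    return [[rate(expected_disutility(p, c, rho)) for c in losses]
            for p in probs]

# Illustrative matrix: 3 probability bands x 3 loss bands.
m = risk_matrix([0.1, 0.5, 0.9], [1.0, 3.0, 5.0],
                rho=2.0, thresholds=[0.5, 2.0, 8.0])
```

    Ratings grow monotonically with both probability and loss, and a more risk-averse decision-maker (smaller rho) never rates a cell lower, which is the kind of attitude-dependence the approach is designed to encode.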

  19. Feature Selection with Neighborhood Entropy-Based Cooperative Game Theory

    PubMed Central

    Zeng, Kai; She, Kun; Niu, Xinzheng

    2014-01-01

    Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods ignore features that have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features using neighborhood entropy. The neighborhood entropy-based feature contribution is then proposed under the framework of cooperative games, and the evaluative criteria of features can be formalized as the product of this contribution and other classical feature measures. Finally, the proposed method is tested on several UCI datasets. The results show that the neighborhood entropy-based cooperative game theory model (NECGT) yields better performance than classical ones. PMID:25276120
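
    The cooperative-game notion of a feature's contribution is Shapley-style: average a feature's marginal contribution over all orderings of the coalition. The sketch below uses an exact Shapley computation over a toy characteristic function (subset accuracies invented for illustration) to show how two individually weak but interdependent features can outrank an individually strong one, exactly the case the abstract says individual measures miss.

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values by averaging marginal contributions over all
    player orderings (fine for the handful of features scored at a time)."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += value(coalition | {p}) - value(coalition)
            coalition = coalition | {p}
    return {p: v / len(orders) for p, v in phi.items()}

# Toy characteristic function: classification accuracy of feature subsets.
# Features a and b are individually weak but strong together; c is
# individually strong but adds little to the pair.
acc = {frozenset(): 0.50,
       frozenset("a"): 0.55, frozenset("b"): 0.55, frozenset("c"): 0.70,
       frozenset("ab"): 0.90, frozenset("ac"): 0.72, frozenset("bc"): 0.72,
       frozenset("abc"): 0.92}
phi = shapley_values("abc", lambda s: acc[frozenset(s)])
```

    The values satisfy the efficiency axiom (they sum to the accuracy gain of the full set), and the interdependent pair receives the higher credit.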

  20. Master equation based steady-state cluster perturbation theory

    NASA Astrophysics Data System (ADS)

    Nuss, Martin; Dorn, Gerhard; Dorda, Antonius; von der Linden, Wolfgang; Arrigoni, Enrico

    2015-09-01

    A simple and efficient approximation scheme to study electronic transport characteristics of strongly correlated nanodevices, molecular junctions, or heterostructures out of equilibrium is provided by steady-state cluster perturbation theory. In this work, we improve the starting point of this perturbative, nonequilibrium Green's function based method. Specifically, we employ an improved unperturbed (so-called reference) state ρS, constructed as the steady state of a quantum master equation within the Born-Markov approximation. The resulting hybrid method inherits beneficial aspects of both the quantum master equation and the nonequilibrium Green's function technique. We benchmark this scheme on two experimentally relevant systems in the single-electron transistor regime: an electron-electron interaction based quantum diode and a triple quantum dot ring junction, which both feature negative differential conductance. The results of this method improve significantly with respect to the plain quantum master equation treatment at modest additional computational cost.

  1. Theory based design and optimization of materials for spintronics applications

    NASA Astrophysics Data System (ADS)

    Xu, Tianyi

    The spintronics industry has developed rapidly in the past decade. Finding the right material is very important for spintronics applications, which requires a good understanding of the physics behind specific phenomena. In this dissertation, we focus on two types of perpendicular-transport phenomena: the current-perpendicular-to-plane giant magnetoresistance (CPP-GMR) phenomenon and tunneling in magnetic tunnel junctions. The Valet-Fert model is a very useful semi-classical approach for understanding the transport and spin-flip processes in CPP-GMR. We present a finite-element-based implementation of the Valet-Fert model which enables a practical way to calculate electron transport in real CPP-GMR spin valves. It is very important to find highly spin-polarized materials for CPP-GMR spin valves; the half-metal, due to its full spin polarization, is of particular interest, and we propose a rational way to find half-metals based on the gap theorem. We then turn to the high-MR TMR phenomenon. The tunneling theory of electron transport in mesoscopic systems is covered, and the transport properties of certain junctions are calculated with the help of Green's functions under the Landauer-Buttiker (scattering) formalism. The damping constant determines the switching rate of a device; we calculate it using a method based on extended Huckel tight-binding theory (EHTB). The symmetry-filtering effect is very helpful for finding materials for TMR junctions, and based on it we identify MnAl as a good candidate material for TMR applications.

  2. Classification of topological crystalline insulators based on representation theory

    NASA Astrophysics Data System (ADS)

    Dong, Xiao-Yu; Liu, Chao-Xing

    2016-01-01

    Topological crystalline insulators define a new class of topological insulator phases with gapless surface states protected by crystalline symmetries. In this work, we present a general theory to classify topological crystalline insulator phases based on the representation theory of space groups. Our approach is to directly identify possible nontrivial surface states in a semi-infinite system with a specific surface, of which the symmetry property can be described by the 17 two-dimensional space groups. We reproduce the existing results of topological crystalline insulators, such as mirror Chern insulators in the pm or pmm groups, Cnv topological insulators in the p4m, p31m, and p6m groups, and topological nonsymmorphic crystalline insulators in the pg and pmg groups. Aside from these existing results, we also obtain the following results: (1) there are two integer mirror Chern numbers (Z^2) in the pm group but only one (Z) in the cm or p3m1 group for both the spinless and spinful cases; (2) for the pmm (cmm) groups, there is no topological classification in the spinless case but Z_4 (Z_2) classifications in the spinful case; (3) we show how the topological crystalline insulator phase in the pg group is related to that in the pm group; (4) we identify the topological classification of p4m, p31m, and p6m for the spinful case; (5) we find topological nonsymmorphic crystalline insulators also existing in the pgg and p4g groups, which exhibit new features compared to those in the pg and pmg groups. We emphasize the importance of the irreducible representations of the states at certain high-symmetry momenta in the classification of topological crystalline phases. Our theory can serve as a guide for the search for topological crystalline insulator phases in realistic materials.

  3. Investigating the Learning-Theory Foundations of Game-Based Learning: A Meta-Analysis

    ERIC Educational Resources Information Center

    Wu, W-H.; Hsiao, H-C.; Wu, P-L.; Lin, C-H.; Huang, S-H.

    2012-01-01

    Past studies on the issue of learning-theory foundations in game-based learning stressed the importance of establishing learning-theory foundation and provided an exploratory examination of established learning theories. However, we found research seldom addressed the development of the use or failure to use learning-theory foundations and…

  5. A molecularly based theory for electron transfer reorganization energy.

    PubMed

    Zhuang, Bilin; Wang, Zhen-Gang

    2015-12-14

    Using field-theoretic techniques, we develop a molecularly based dipolar self-consistent-field theory (DSCFT) for charge solvation in pure solvents under equilibrium and nonequilibrium conditions and apply it to the reorganization energy of electron transfer reactions. The DSCFT uses a set of molecular parameters, such as the solvent molecule's permanent dipole moment and polarizability, thus avoiding approximations that are inherent in treating the solvent as a linear dielectric medium. A simple, analytical expression for the free energy is obtained in terms of the equilibrium and nonequilibrium electrostatic potential profiles and electric susceptibilities, which are obtained by solving a set of self-consistent equations. With no adjustable parameters, the DSCFT predicts activation energies and reorganization energies in good agreement with previous experiments and calculations for the electron transfer between metallic ions. Because the DSCFT is able to describe the properties of the solvent in the immediate vicinity of the charges, it is unnecessary to distinguish between the inner-sphere and outer-sphere solvent molecules in the calculation of the reorganization energy as in previous work. Furthermore, examining the nonequilibrium free energy surfaces of electron transfer, we find that the nonequilibrium free energy is well approximated by a double parabola for self-exchange reactions, but the curvature of the nonequilibrium free energy surface depends on the charges of the electron-transferring species, contrary to the prediction by the linear dielectric theory. PMID:26671385

  6. A molecularly based theory for electron transfer reorganization energy

    NASA Astrophysics Data System (ADS)

    Zhuang, Bilin; Wang, Zhen-Gang

    2015-12-01

    Using field-theoretic techniques, we develop a molecularly based dipolar self-consistent-field theory (DSCFT) for charge solvation in pure solvents under equilibrium and nonequilibrium conditions and apply it to the reorganization energy of electron transfer reactions. The DSCFT uses a set of molecular parameters, such as the solvent molecule's permanent dipole moment and polarizability, thus avoiding approximations that are inherent in treating the solvent as a linear dielectric medium. A simple, analytical expression for the free energy is obtained in terms of the equilibrium and nonequilibrium electrostatic potential profiles and electric susceptibilities, which are obtained by solving a set of self-consistent equations. With no adjustable parameters, the DSCFT predicts activation energies and reorganization energies in good agreement with previous experiments and calculations for the electron transfer between metallic ions. Because the DSCFT is able to describe the properties of the solvent in the immediate vicinity of the charges, it is unnecessary to distinguish between the inner-sphere and outer-sphere solvent molecules in the calculation of the reorganization energy as in previous work. Furthermore, examining the nonequilibrium free energy surfaces of electron transfer, we find that the nonequilibrium free energy is well approximated by a double parabola for self-exchange reactions, but the curvature of the nonequilibrium free energy surface depends on the charges of the electron-transferring species, contrary to the prediction by the linear dielectric theory.

  7. The theory and phenomenology of perturbative QCD based jet quenching

    NASA Astrophysics Data System (ADS)

    Majumder, A.; van Leeuwen, M.

    2011-01-01

    The study of the structure of strongly interacting dense matter via hard jets is reviewed. High momentum partons produced in hard collisions produce a shower of gluons prior to undergoing the non-perturbative process of hadronization. In the presence of a dense medium this shower is modified due to scattering of the various partons off the constituents in the medium. The modified pattern of the final detected hadrons is then a probe of the structure of the medium as perceived by the jet. Starting from the factorization paradigm developed for the case of particle collisions, we review the basic underlying theory of medium-induced gluon radiation based on perturbative Quantum Chromodynamics (pQCD) and current experimental results from Deep Inelastic Scattering on large nuclei and high energy heavy-ion collisions, emphasizing how these results constrain our understanding of energy loss. This review contains introductions to the theory of radiative energy loss, elastic energy loss, and the corresponding experimental observables and issues. We close with a discussion of important calculations and measurements that need to be carried out to complete the description of jet modification at high energies at future high energy colliders.

  8. Stochastic extension of cellular manufacturing systems: a queuing-based analysis

    NASA Astrophysics Data System (ADS)

    Fardis, Fatemeh; Zandi, Afagh; Ghezavati, Vahidreza

    2013-07-01

    Clustering parts and machines into part families and machine cells is a major decision in the design of cellular manufacturing systems, known as cell formation. This paper presents a non-linear mixed integer programming model to design cellular manufacturing systems which assumes that the arrival rate of parts into cells and the machine service rates are stochastic parameters described by exponential distributions. Uncertain situations may create a queue behind each machine; therefore, we consider the average waiting time of parts behind each machine in order to have an efficient system. The objective function minimizes the sum of machine idleness cost, sub-contracting cost for exceptional parts, non-utilized machine cost, and holding cost of parts in the cells. Finally, the linearized model is solved with the CPLEX solver in GAMS, and sensitivity analysis is performed to illustrate the effects of the parameters.
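
    The average waiting time behind a machine follows from standard queuing formulas once arrival and service rates are fixed. As a minimal illustration (a textbook M/M/1 sketch with invented rates, not the paper's full non-linear cell-formation model):

    ```python
    def mm1_metrics(lam, mu):
        """Steady-state metrics of an M/M/1 queue with arrival rate lam and
        service rate mu (both in parts per unit time; requires lam < mu)."""
        if lam >= mu:
            raise ValueError("unstable queue: need lam < mu")
        rho = lam / mu               # machine utilization
        Lq = rho ** 2 / (1 - rho)    # mean number of parts waiting behind the machine
        Wq = Lq / lam                # mean waiting time per part (Little's law)
        return rho, Lq, Wq

    # Illustrative rates: 4 parts/hour arriving at a machine serving 5 parts/hour
    rho, Lq, Wq = mm1_metrics(4.0, 5.0)   # -> 0.8, 3.2 parts, 0.8 hours
    ```

    Such closed-form expressions are what make the waiting-time terms tractable inside an optimization objective.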

  9. Synchronization in complex systems following a decision based queuing process: rhythmic applause as a test case

    NASA Astrophysics Data System (ADS)

    Xenides, D.; Vlachos, D. S.; Simos, T. E.

    2008-07-01

    Living communities can be considered as complex systems, and are thus fertile grounds for studies related to statistics and dynamics. In this study we revisit the case of rhythmic applause by utilizing the model proposed by Vázquez et al (2006 Phys. Rev. E 73 036127) augmented with two opposing driving forces, namely the desires for individuality and companionship. To that end, after performing computer simulations with a large number of oscillators we propose explanations for the following open questions: (a) Why does synchronization occur suddenly? (b) Why is synchronization observed when the clapping period (Tc) is 1.5 × Ts…
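
    The sudden onset of synchronization can be reproduced qualitatively with a generic all-to-all phase-oscillator (Kuramoto-type) simulation; the sketch below illustrates that mechanism only, it is not the decision-based queuing model of Vázquez et al, and all rates and couplings are invented:

    ```python
    import math
    import random

    def kuramoto_order(n=200, coupling=2.0, steps=2000, dt=0.05, seed=1):
        """Return the final order parameter r in [0, 1] of n all-to-all coupled
        phase oscillators; r near 1 means the 'audience' claps in unison."""
        rng = random.Random(seed)
        freqs = [rng.gauss(1.0, 0.1) for _ in range(n)]          # natural clapping rates
        phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
        for _ in range(steps):
            cx = sum(math.cos(p) for p in phases) / n
            sx = sum(math.sin(p) for p in phases) / n
            r, psi = math.hypot(cx, sx), math.atan2(sx, cx)      # mean-field amplitude and phase
            phases = [p + dt * (w + coupling * r * math.sin(psi - p))
                      for p, w in zip(phases, freqs)]
        cx = sum(math.cos(p) for p in phases) / n
        sx = sum(math.sin(p) for p in phases) / n
        return math.hypot(cx, sx)

    # Above a critical coupling the oscillators lock; below it they stay incoherent
    r_strong = kuramoto_order(coupling=2.0)
    r_weak = kuramoto_order(coupling=0.01)
    ```

    The sharp jump of r as the coupling crosses its critical value is the generic reason synchronization appears "suddenly" in such systems.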

  10. Optimisation of a honeybee-colony's energetics via social learning based on queuing delays

    NASA Astrophysics Data System (ADS)

    Thenius, Ronald; Schmickl, Thomas; Crailsheim, Karl

    2008-06-01

    Natural selection shaped the foraging-related processes of honeybees in such a way that a colony can react to changing environmental conditions optimally. To investigate this complex dynamic social system, we developed a multi-agent model of the nectar flow inside and outside of a honeybee colony. In a honeybee colony, a temporal caste collects nectar in the environment. These foragers bring their harvest into the colony, where they unload their nectar loads to one or more storer bees. Our model predicts that a cohort of foragers, collecting nectar from a single nectar source, is able to detect changes in quality in other food sources they have never visited, via the nectar processing system of the colony. We identified two novel pathways of forager-to-forager communication. Foragers can gain information about changes in the nectar flow in the environment via changes in their mean waiting time for unloadings and the number of experienced multiple unloadings. This way two distinct groups of foragers that forage on different nectar sources and that never communicate directly can share information via a third cohort of worker bees. We show that this noisy and loosely knotted social network allows a colony to perform collective information processing, so that a single forager has all necessary information available to be able to 'tune' its social behaviour, like dancing or dance-following. This way the net nectar gain of the colony is increased.

  11. Intelligent control based on fuzzy logic and neural net theory

    NASA Technical Reports Server (NTRS)

    Lee, Chuen-Chien

    1991-01-01

    In the conception and design of intelligent systems, one promising direction involves the use of fuzzy logic and neural network theory to enhance such systems' capability to learn from experience and adapt to changes in an environment of uncertainty and imprecision. Here, an intelligent control scheme is explored by integrating these multidisciplinary techniques. A self-learning system is proposed as an intelligent controller for dynamical processes, employing a control policy which evolves and improves automatically. One key component of the intelligent system is a fuzzy logic-based system which emulates human decision making behavior. It is shown that the system can solve a fairly difficult control learning problem. Simulation results demonstrate that improved learning performance can be achieved in relation to previously described systems employing bang-bang control. The proposed system is relatively insensitive to variations in the parameters of the system environment.

  12. Time-dependent density functional theory based Ehrenfest dynamics.

    PubMed

    Wang, Fan; Yam, Chi Yung; Hu, LiHong; Chen, GuanHua

    2011-07-28

    Time-dependent density functional theory based Ehrenfest dynamics with atom-centered basis functions is developed in the present work. The equation of motion for the electrons is formulated in terms of the first-order reduced density matrix, and an additional term arises due to the time-dependence of the basis functions through their dependence on the nuclear coordinates. This time-dependence of the basis functions, together with the imaginary part of the density matrix, leads to an additional term in the nuclear force. The effects of the two additional terms are examined by studying the dynamics of H(2) and C(2)H(4), and it is concluded that the inclusion of both terms is essential for correct electronic and nuclear dynamics. PMID:21806109

  13. Xinjiang resources efficiency based on superior technical theory

    NASA Astrophysics Data System (ADS)

    Amut, Aniwaer; Li, Zeyuan

    2005-09-01

    This study discusses a new concept of resource efficiency in Xinjiang from a policy-making perspective, based on superior technology theory. The analysis focuses on the resource advantage in development, resource pressure, resource efficiency, and technical approaches to resource efficiency. We outline an idea of industrialized development centered on resource efficiency, its control factors, and a basic technical framework for realizing resource efficiency, which includes techniques for recycling materials; water-saving techniques for efficient use of water resources; biotechnology for higher yield and better quality of farm crops; comprehensive techniques for farm-produce processing and agricultural industrialization; information technology for information support and an information-oriented society; techniques for transforming resources including oil and natural gas, mineral products and wind power; and techniques for desertification control and biological security.

  14. Theory for a gas composition sensor based on acoustic properties.

    PubMed

    Phillips, Scott; Dain, Yefim; Lueptow, Richard M

    2003-01-01

    Sound travelling through a gas propagates at different speeds and its intensity attenuates to different degrees depending upon the composition of the gas. Theoretically, a real-time gaseous composition sensor could be based on measuring the sound speed and the acoustic attenuation. To this end, the speed of sound was modelled using standard relations, and the acoustic attenuation was modelled using the theory for vibrational relaxation of gas molecules. The concept for a gas composition sensor is demonstrated theoretically for nitrogen-methane-water and hydrogen-oxygen-water mixtures. For a three-component gas mixture, the measured sound speed and acoustic attenuation each define separate lines in the composition plane of two of the gases. The intersection of the two lines defines the gas composition. It should also be possible to use the concept for mixtures of more than three components, if the nature of the gas composition is known to some extent. PMID:14552356
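
    The sound-speed half of such a sensor reduces to ideal-gas relations: the mixture's mole-fraction-averaged molar mass and heat capacity fix γ, and c = √(γRT/M). A minimal sketch (the heat-capacity and temperature values are illustrative, not taken from the paper):

    ```python
    import math

    R = 8.314  # universal gas constant, J/(mol*K)

    def sound_speed(mole_fracs, molar_masses, cps, T=300.0):
        """Ideal-gas speed of sound (m/s) for a mixture: c = sqrt(gamma*R*T/M),
        with M and Cp mole-fraction averaged. molar_masses in kg/mol,
        cps = molar heat capacities at constant pressure in J/(mol*K)."""
        M = sum(x * m for x, m in zip(mole_fracs, molar_masses))
        Cp = sum(x * c for x, c in zip(mole_fracs, cps))
        gamma = Cp / (Cp - R)        # Cv = Cp - R for an ideal gas
        return math.sqrt(gamma * R * T / M)

    # Pure nitrogen at 300 K (Cp ~ 29.1 J/(mol*K), M = 0.028 kg/mol): ~353 m/s
    c_n2 = sound_speed([1.0], [0.028], [29.1])
    ```

    Measuring c then constrains the composition to a curve in the mixture plane; the attenuation measurement, modelled separately via vibrational relaxation, supplies the second curve whose intersection fixes the composition.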

  15. Theory for a gas composition sensor based on acoustic properties

    NASA Technical Reports Server (NTRS)

    Phillips, Scott; Dain, Yefim; Lueptow, Richard M.

    2003-01-01

    Sound travelling through a gas propagates at different speeds and its intensity attenuates to different degrees depending upon the composition of the gas. Theoretically, a real-time gaseous composition sensor could be based on measuring the sound speed and the acoustic attenuation. To this end, the speed of sound was modelled using standard relations, and the acoustic attenuation was modelled using the theory for vibrational relaxation of gas molecules. The concept for a gas composition sensor is demonstrated theoretically for nitrogen-methane-water and hydrogen-oxygen-water mixtures. For a three-component gas mixture, the measured sound speed and acoustic attenuation each define separate lines in the composition plane of two of the gases. The intersection of the two lines defines the gas composition. It should also be possible to use the concept for mixtures of more than three components, if the nature of the gas composition is known to some extent.

  16. A discrete event simulation model for evaluating the performances of an m/g/c/c state dependent queuing system.

    PubMed

    Khalid, Ruzelan; Nawawi, Mohd Kamal M; Kawsar, Luthful A; Ghani, Noraida A; Kamil, Anton A; Mustafa, Adli

    2013-01-01

    M/G/C/C state dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern discrete event simulation (DES) software. We designed an approach to cater for this limitation and used it to construct the M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time, and the expected number of entities in a complex network topology. Results indicated that, for each network, there is a range of arrival rates over which the simulation results fluctuate drastically across replications, causing discrepancies between the simulation and analytical results. Detailed results showing how the simulation results tally with the analytical results, in both abstract and graphical forms, together with scientific justifications, are documented and discussed. PMID:23560037
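
    For the special case of constant (state-independent) service rates, the blocking probability of an M/G/C/C loss system has a closed form, the classical Erlang B formula, which is the kind of analytical result such simulations are checked against. A sketch with illustrative values:

    ```python
    def erlang_b(a, c):
        """Blocking probability of an M/G/c/c loss system with offered load
        a = lambda/mu Erlangs and c servers, via the stable Erlang B recursion."""
        b = 1.0
        for k in range(1, c + 1):
            b = a * b / (k + a * b)
        return b

    # Illustrative values: an offered load of 3 Erlangs into a 5-capacity cell
    p_block = erlang_b(3.0, 5)   # ~0.110
    ```

    The recursion is preferred over the factorial form because it stays numerically stable for large c.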

  17. A Discrete Event Simulation Model for Evaluating the Performances of an M/G/C/C State Dependent Queuing System

    PubMed Central

    Khalid, Ruzelan; M. Nawawi, Mohd Kamal; Kawsar, Luthful A.; Ghani, Noraida A.; Kamil, Anton A.; Mustafa, Adli

    2013-01-01

    M/G/C/C state dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern discrete event simulation (DES) software. We designed an approach to cater for this limitation and used it to construct the M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time, and the expected number of entities in a complex network topology. Results indicated that, for each network, there is a range of arrival rates over which the simulation results fluctuate drastically across replications, causing discrepancies between the simulation and analytical results. Detailed results showing how the simulation results tally with the analytical results, in both abstract and graphical forms, together with scientific justifications, are documented and discussed. PMID:23560037

  18. The elliptic wing based on the potential theory

    NASA Technical Reports Server (NTRS)

    Krienes, Klaus

    1941-01-01

    This article is intended as a contribution to the theory of the lifting surface. The aerodynamics of the elliptic wing in straight and oblique flow are explored on the basis of potential theory. The foundation of the calculation is the linearized theory of the acceleration potential in which all small quantities of higher order are disregarded.

  19. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

    ERIC Educational Resources Information Center

    Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

    2012-01-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

  1. A Classification Algorithm for Hyperspectral Data Based on Synergetics Theory

    NASA Astrophysics Data System (ADS)

    Cerra, D.; Mueller, R.; Reinartz, P.

    2012-07-01

    This paper presents a new classification methodology for hyperspectral data based on synergetics theory, which describes the spontaneous formation of patterns and structures in a system through self-organization. We introduce a representation for hyperspectral data in which a spectrum can be projected onto a space spanned by a set of user-defined prototype vectors belonging to the classes of interest. Each test vector is attracted by a final state associated with a prototype, and can thus be classified. As typical synergetics-based systems have the drawback of a rigid training step, we modify the training step to allow the selection of user-defined training areas, which are used to weight the prototype vectors through attention parameters and to produce a more accurate classification map through majority voting over independent classifications. Results are comparable to state-of-the-art classification methodologies, both general and specific to hyperspectral data, and, as each classification is based on a single training sample per class, the proposed technique would be particularly effective in tasks where only a small training dataset is available.
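
    The projection-onto-prototypes step can be caricatured as weighted nearest-prototype matching. The sketch below, with invented two-class "spectra", is a simplified stand-in for the full synergetic attractor dynamics, with the weights playing the role of the attention parameters:

    ```python
    import math

    def _unit(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    def classify(spectrum, prototypes, attention=None):
        """Assign a spectrum to the class of its best-matching prototype by
        weighted normalized correlation. prototypes: list of (label, vector);
        attention: optional per-class weights standing in for the synergetic
        attention parameters."""
        s = _unit(spectrum)
        attention = attention or [1.0] * len(prototypes)
        scores = {label: w * sum(a * b for a, b in zip(s, _unit(p)))
                  for w, (label, p) in zip(attention, prototypes)}
        return max(scores, key=scores.get)

    # Invented three-band prototype "spectra" for illustration only
    protos = [("vegetation", [0.1, 0.8, 0.6]), ("soil", [0.7, 0.5, 0.2])]
    label = classify([0.15, 0.75, 0.55], protos)   # -> "vegetation"
    ```

    In the actual synergetic scheme the test vector evolves under pattern-formation dynamics toward the attractor of one prototype; the argmax over weighted correlations above only mimics the end result of that competition.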

  2. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  3. A Theory of Conditioning: Inductive Learning within Rule-Based Default Hierarchies.

    ERIC Educational Resources Information Center

    Holyoak, Keith J.; And Others

    1989-01-01

    A theory of classical conditioning is presented, which is based on a parallel, rule-based performance system integrated with mechanisms for inductive learning. A major inferential heuristic incorporated into the theory involves "unusualness," which is focused on novel cues. The theory is implemented via computer simulation. (TJH)

  5. SARA: A Text-Based and Reader-Based Theory of Signaling

    ERIC Educational Resources Information Center

    Lemarie, Julie; Lorch, Robert F., Jr.; Eyrolle, Helene; Virbel, Jacques

    2008-01-01

    We propose a two-component theory of text signaling devices. The first component is a text-based analysis that characterizes any signaling device along four dimensions: (a) the type of information it makes available, (b) its scope, (c) how it is realized in the text, and (d) its location with respect to the content it cues. The second component is…

  6. Image integrity authentication scheme based on fixed point theory.

    PubMed

    Li, Xu; Sun, Xingming; Liu, Quansheng

    2015-02-01

    Based on fixed point theory, this paper proposes a new scheme for image integrity authentication, which is very different from digital signature and fragile watermarking. Under the new scheme, the sender transforms an original image into a fixed point image (very close to the original one) of a well-chosen transform and sends the fixed point image (instead of the original one) to the receiver; using the same transform, the receiver checks the integrity of the received image by testing whether it is a fixed point image, and locates the tampered areas if the image has been modified during transmission. A realization of the new scheme is based on the Gaussian convolution and deconvolution (GCD) transform, for which an existence theorem for fixed points is proved. The semifragility is analyzed via commutativity of transforms, and three commutativity theorems are established for the GCD transform. Three iterative algorithms are presented for finding a fixed point image in a small number of iterations, and for the whole procedure of image integrity authentication, a fragile authentication system and a semifragile one are built separately. Experiments show that both systems perform well in transparency, fragility, security, and tamper localization. In particular, the semifragile system can perfectly resist rotation by a multiple of 90°, flipping, and brightness attacks. PMID:25420259
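
    The core mechanism, iterating a transform until its output stops changing, is ordinary fixed-point iteration. A scalar illustration of the scheme (the paper iterates on whole images with the GCD transform; cos is used here only because it is a convenient contraction):

    ```python
    import math

    def find_fixed_point(transform, x0, tol=1e-10, max_iter=200):
        """Iterate x <- T(x) until the change falls below tol; for a contraction
        this converges to the unique fixed point (Banach fixed-point theorem)."""
        x = x0
        for _ in range(max_iter):
            nxt = transform(x)
            if abs(nxt - x) < tol:
                return nxt
            x = nxt
        raise RuntimeError("no convergence within max_iter iterations")

    # cos is a contraction near its fixed point x* ~ 0.7390851
    fp = find_fixed_point(math.cos, 1.0)
    ```

    Integrity checking then amounts to verifying T(x) = x: any tampering pushes the image off the fixed point, and the residual T(x) − x localizes where.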

  7. Charge carrier hopping transport based on Marcus theory and variable-range hopping theory in organic semiconductors

    NASA Astrophysics Data System (ADS)

    Lu, Nianduan; Li, Ling; Banerjee, Writam; Sun, Pengxiao; Gao, Nan; Liu, Ming

    2015-07-01

    Charge carrier hopping transport is generally described by Miller-Abrahams or Marcus transition rates. Based on the Miller-Abrahams theory and nearest-neighbour hopping theory, Apsley and Hughes developed a concise calculation method (the A-H method) to study hopping conduction in disordered systems. Here, we improve the A-H method to investigate charge carrier hopping transport by introducing the polaron effect and an electric field, based on Marcus theory and variable-range hopping theory. The improved method describes well the contributions of the polaron effect, energetic disorder, carrier density, and electric field to charge carrier transport in disordered organic semiconductors. In addition, the calculated results clearly show that the dependence of the charge carrier mobility on the polaron effect varies with the polaron activation energy, and that the mobility decreases with increasing electric field strength at large fields.
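
    The Marcus transition rate underlying such hopping calculations has a standard closed form. The sketch below evaluates it for illustrative coupling and reorganization-energy values, not parameters from the paper:

    ```python
    import math

    HBAR = 1.0546e-34   # reduced Planck constant, J*s
    KB = 1.3807e-23     # Boltzmann constant, J/K
    EV = 1.6022e-19     # joules per electronvolt

    def marcus_rate(J, lam, dG, T=300.0):
        """Non-adiabatic Marcus hopping rate (1/s) between two sites.
        J: electronic coupling, lam: reorganization energy, dG: free-energy
        change of the hop, all in joules."""
        prefactor = (2.0 * math.pi / HBAR) * J ** 2 / math.sqrt(4.0 * math.pi * lam * KB * T)
        return prefactor * math.exp(-(dG + lam) ** 2 / (4.0 * lam * KB * T))

    # Illustrative organic-semiconductor-like values: J = 10 meV, lam = 0.2 eV
    rate = marcus_rate(J=0.010 * EV, lam=0.2 * EV, dG=0.0)   # ~5e11 hops/s
    ```

    The polaron activation energy enters through lam: unlike the Miller-Abrahams rate, the Marcus rate is thermally activated even for a downhill hop, which is what couples the mobility to the polaron effect in such calculations.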

  8. Density functional theory based generalized effective fragment potential method.

    PubMed

    Nguyen, Kiet A; Pachter, Ruth; Day, Paul N

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes. PMID:24985612

  9. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis, denominated IMMAN (an acronym for Information theory-based CheMoMetrics ANalysis), are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches each. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. In addition, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as is the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing-value processing, dataset partitioning, and browsing. Single-parameter or ensemble (multi-criteria) ranking options are also provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA supervised algorithms. Graphic representation of Shannon's distribution for MD-calculating software. PMID:25620721
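
    Of the supervised criteria mentioned (information gain, gain ratio, symmetrical uncertainty), information gain is the simplest to state: the drop in Shannon entropy of the class labels after conditioning on a feature. A minimal sketch with toy data:

    ```python
    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy (bits) of a sequence of class labels."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(feature, labels):
        """Drop in label entropy after splitting on a discrete feature."""
        n = len(labels)
        groups = {}
        for f, y in zip(feature, labels):
            groups.setdefault(f, []).append(y)
        conditional = sum(len(ys) / n * entropy(ys) for ys in groups.values())
        return entropy(labels) - conditional

    # A feature that separates the classes perfectly recovers the full 1 bit;
    # an uninformative feature yields zero gain
    ig_good = information_gain([0, 0, 1, 1], ["a", "a", "b", "b"])   # -> 1.0
    ig_bad = information_gain([0, 1, 0, 1], ["a", "a", "b", "b"])    # -> 0.0
    ```

    Ranking features by such scores, as IMMAN does, requires discretizing continuous descriptors first, which is where the equal-interval discretization mentioned above comes in.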

  10. Density functional theory based generalized effective fragment potential method

    SciTech Connect

    Nguyen, Kiet A.; Pachter, Ruth; Day, Paul N.

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.

  11. The use of theory-based nursing practice in the Department of Veterans' Affairs Medical Centers.

    PubMed

    Bonamy, C; Schultz, P; Graham, K; Hampton, M

    1995-01-01

    In this study, the authors surveyed the chief nurses of 152 Veterans' Health Administration Medical Centers to determine which medical centers based their nursing practice on one or more nursing theories or models. Of the 76 medical centers responding, 24 (35%) stated theory-based practice was in use in their institutions. The greatest number (16 of the 24) reported use of Orem's Self-Care Deficit theory or a combination of Orem with other theories. Most of the 24 chief nurses agreed that theory-based practice: 1) improves patient outcomes; 2) maximizes patient health; and 3) provides a consistent approach to care. However, they were less convinced that theory-based practice reduces nursing staff turnover or improves job satisfaction. They also stated that theory-based practice is more important to nursing administrators than to staff nurses. A similar survey of staff nurses is recommended. PMID:7869135

  12. Cell division theory and individual-based modeling of microbial lag: part I. The theory of cell division.

    PubMed

    Dens, E J; Bernaerts, K; Standaert, A R; Van Impe, J F

    2005-06-15

    This series of two papers deals with the theory of cell division and its implementation in an individual-based modeling framework. In this first part, the theory of cell division is studied on an individual-based level in order to learn more about the mechanistic principles behind microbial lag phenomena. While some important literature on cell division theory dates from 30 to 40 years ago, until now it has hardly been introduced into the field of predictive microbiology. Yet, it provides a large amount of information on how cells likely respond to changing environmental conditions. On the basis of this theory, a general theory of microbial lag behavior caused by a combination of medium and/or temperature changes is developed in this paper. The proposed theory then forms the basis for a critical evaluation of existing modeling concepts for microbial lag in predictive microbiology. First, a more rigorous definition can be formulated for the lag time lambda and for the previously only vaguely defined physiological state of the cells, in terms of mechanistically defined parameters such as cell mass, RNA or protein content, specific growth rate, and the time to perform DNA replication and cell division. Second, existing predictive models are evaluated with respect to the newly developed theory. For one existing model, a certain fitting parameter can be related to physically meaningful parameters, while for the model of [Augustin, J.-C., Rosso, L., Carlier, V.A. 2000. A model describing the effect of temperature history on lag time for Listeria monocytogenes. Int. J. Food Microbiol. 57, 169-181] a new, mechanistically based model structure is proposed. A restriction of the proposed theory is that it is only valid for situations where biomass growth responds instantly to an environmental change. The authors are aware that this assumption is not generally acceptable.
Lag in biomass can be caused, for example, by a delayed synthesis of some essential growth factor (e.g., enzymes). In the second part of this series of papers [Dens, E.J., Bernaerts, K., Standaert, A.R., Kreft, J.-U., Van Impe, J.F., this issue. Cell division theory and individual-based modeling of microbial lag: part II. Modeling lag phenomena induced by temperature shifts. Int. J. Food Microbiol], the theory of cell division is implemented in an individual-based simulation program and extended to account for lags in biomass growth. In conclusion, the cell division theory applied to microbial populations in dynamic medium and/or temperature conditions provides a useful framework to analyze microbial lag behavior. PMID:15925713

  13. LSST Telescope Alignment Plan Based on Nodal Aberration Theory

    NASA Astrophysics Data System (ADS)

    Sebag, J.; Gressler, W.; Schmid, T.; Rolland, J. P.; Thompson, K. P.

    2012-04-01

    The optical alignment of the Large Synoptic Survey Telescope (LSST) is potentially challenging, due to its fast three-mirror optical design and its large 3.5° field of view (FOV). It is highly advantageous to align the three-mirror optical system prior to the integration of the complex science camera on the telescope, which corrects the FOV via three refractive elements and includes the operational wavefront sensors. A telescope alignment method based on nodal aberration theory (NAT) is presented here to address this challenge. Without the science camera installed on the telescope, the on-axis imaging performance of the telescope is diffraction-limited, but the field of view is not corrected. The nodal properties of the three-mirror telescope design have been analyzed and an alignment approach has been developed using the intrinsically linear nodal behavior, which is linked via sensitivities to the misalignment parameters. Since mirror figure errors will exist in any real application, a methodology to introduce primary-mirror figure errors into the analysis has been developed and is also presented.

  14. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.

    PubMed

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis because it efficiently combines evidence from different sensors. However, when the evidence is highly conflicting, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability but also the dynamic sensor reliability is taken into consideration. An evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion illustrates the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611
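
    The combination step described above can be made concrete. The sketch below implements Dempster's rule and a weight-modified averaging of basic probability assignments (BPAs); the plain normalized weights shown are a simplified stand-in for the paper's distance- and entropy-based sensor reliabilities, and the hypothesis names are hypothetical.

```python
def dempster_combine(m1, m2):
    """Combine two BPAs (dicts mapping frozenset hypotheses to mass)."""
    combined, conflict = {}, 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + pa * pb
            else:
                conflict += pa * pb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule undefined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

def weighted_average_bpa(bpas, weights):
    """Average the sensor BPAs using (placeholder) reliability weights."""
    total = sum(weights)
    avg = {}
    for m, w in zip(bpas, weights):
        for h, v in m.items():
            avg[h] = avg.get(h, 0.0) + v * w / total
    return avg

def fuse(bpas, weights):
    """Murphy-style fusion: Dempster-combine the average (n-1) times."""
    avg = weighted_average_bpa(bpas, weights)
    result = avg
    for _ in range(len(bpas) - 1):
        result = dempster_combine(result, avg)
    return result
```

    Averaging before combination is what tames highly conflicting reports: a single unreliable sensor is down-weighted instead of vetoing the consensus through the normalization step of Dempster's rule.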

  16. Scheduling for indoor visible light communication based on graph theory.

    PubMed

    Tao, Yuyang; Liang, Xiao; Wang, Jiaheng; Zhao, Chunming

    2015-02-01

    Visible light communication (VLC) has drawn much attention in the field of high-rate indoor wireless communication. While most existing works have focused on point-to-point VLC technologies, few studies have addressed multiuser VLC, where multiple optical access points (APs) transmit data to multiple user receivers. In such scenarios, inter-user interference is the major factor limiting system performance, so a proper scheduling scheme is needed to coordinate the interference and optimize overall system performance. In this work, we aim to maximize the sum rate of the system, while taking user fairness into account, by appropriately assigning LED lamps to multiple users. The formulated scheduling problem turns out to be a maximum weighted independent set problem. We then propose a novel and efficient resource allocation method based on graph theory to achieve high sum rates. Moreover, we introduce proportional fairness into the scheduling scheme to ensure user fairness. The proposed scheduling scheme can, with low complexity, achieve higher multiplexing gains, higher sum rates, and better fairness than existing schemes. PMID:25836136
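
    The scheduling problem named above, maximum weighted independent set (MWIS) on a conflict graph, is NP-hard in general, and a common low-complexity heuristic is greedy selection. The sketch below is a generic greedy MWIS, not the paper's specific algorithm; the vertex names and weights (achievable rates of AP-to-user assignments) are illustrative.

```python
from collections import defaultdict

def greedy_mwis(weights, edges):
    """Greedy maximum weighted independent set.

    weights: dict vertex -> weight (e.g., achievable rate of an
             AP-to-user assignment); edges: iterable of conflicting pairs.
    Repeatedly picks the heaviest remaining vertex and removes its
    neighbors, so no two chosen assignments interfere.
    """
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    remaining = set(weights)
    chosen = set()
    while remaining:
        best = max(sorted(remaining), key=lambda v: weights[v])
        chosen.add(best)
        remaining -= adj[best] | {best}
    return chosen
```

    Proportional fairness, as in the paper, would replace the raw rate weights with rates normalized by each user's long-term average throughput, so persistently starved users rise in priority.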

  17. An Approach to Theory-Based Youth Programming

    ERIC Educational Resources Information Center

    Duerden, Mat D.; Gillard, Ann

    2011-01-01

    A key but often overlooked aspect of intentional, out-of-school-time programming is the integration of a guiding theoretical framework. The incorporation of theory in programming can provide practitioners valuable insights into essential processes and principles of successful programs. While numerous theories exist that relate to youth development…

  18. Mending metacognitive illusions: a comparison of mnemonic-based and theory-based procedures.

    PubMed

    Koriat, Asher; Bjork, Robert A

    2006-09-01

    Previous research indicated that learners experience an illusion of competence during learning (termed foresight bias) because judgments of learning (JOLs) are made in the presence of information that will be absent at test. The authors examined the following 2 procedures for alleviating foresight bias: enhancing learners' sensitivity to mnemonic cues pertaining to ease of retrieval and inducing learners to resort to theory-based judgments as a basis for JOLs. Both procedures proved effective in mending metacognitive illusions-as reflected in JOLs and self-regulation of study time-but only theory-based debiasing yielded transfer to new items. The results support the notion that improved metacognition is 1 key to optimizing transfer but also that educating subjective experience does not guarantee generalization to new situations. PMID:16938051

  19. The Development of an Attribution-Based Theory of Motivation: A History of Ideas

    ERIC Educational Resources Information Center

    Weiner, Bernard

    2010-01-01

    The history of ideas guiding the development of an attribution-based theory of motivation is presented. These influences include the search for a "grand" theory of motivation (from drive and expectancy/value theory), an attempt to represent how the past may influence the present and the future (as Thorndike accomplished), and the incorporation of…

  20. Kinetic theory based modeling of Type II core collapse supernovae

    NASA Astrophysics Data System (ADS)

    Strother, Terrance; Bauer, Wolfgang

    2010-06-01

    Motivated by the success of kinetic theory in describing observables in intermediate- and high-energy heavy ion collisions, we use kinetic theory to model the dynamics of core collapse supernovae. The specific way that we employ kinetic theory to solve the relevant transport equations allows us to explicitly model the propagation of neutrinos and a full ensemble of nuclei and to treat neutrino-matter interactions in a very general way. With these abilities, our preliminary calculations reveal dynamics that may prove to be an entirely new neutrino-capture-induced supernova explosion mechanism.

  1. Seismic site-response analysis based on random vibration theory

    NASA Astrophysics Data System (ADS)

    Kang, T.; Jang, H.

    2013-12-01

    Local geology influences earthquake ground motions, which is important when specifying ground-motion levels for seismic design in practice. This effect is quantified through site response analysis, which models the propagation of seismic waves from bedrock to the free surface through soft layers. Site response analysis provides one or more sets of scale factors, given as a function of frequency, at the surface. Empirical characterization of site response requires a large data set spanning a wide range of event magnitudes and distances. In reality, especially in regions of low to moderate seismicity such as the Korean Peninsula, empirical characterization of site response is not feasible, so numerical modeling is the only viable tool in those regions. Most conventional modeling procedures, however, include a step in which synthetic waveforms are developed as the input motions for site response analysis. The waveforms are typically synthesized by matching a spectrum, such as a uniform hazard response spectrum, on basement rock obtained from the seismic hazard analysis. These synthetics are fundamentally problematic in spite of the spectral matching, because they are based on the amplitude spectrum alone, without phase information. As an alternative, an approach based on random vibration theory (RVT) is introduced that removes the need for waveform generation. In RVT, a given response spectrum is converted into a power spectral density function. The analysis is performed in the frequency domain and deals with the statistical representation of responses. It requires the transfer function for the velocity profile of a site. The transfer function is initially developed by computing receiver functions with the reflectivity method, assuming no attenuation in the profile and considering various incidence angles. The transfer function is then iteratively updated with varying attenuation until the results are compatible with the modulus and damping observed in in-situ or laboratory tests for the profile. After the final iteration on the transfer function, the maximum amplification responses can be obtained together with the extreme values of shear stress and strain in the profile. This approach thus combines observed material properties with analytical results based on reflectivity calculations for a layered structure, which makes it possible to estimate site response while reducing unphysical manipulations.
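
    As a minimal illustration of the transfer-function ingredient of this workflow, the classic closed-form amplification of a single damped soil layer over rigid bedrock (a textbook approximation, not the paper's reflectivity-based transfer function for a full layered profile) can be coded directly; the layer thickness, shear-wave velocity, and damping ratio below are arbitrary example values.

```python
import math

def layer_amplification(f, thickness, vs, damping):
    """Approximate |H(f)| for one uniform damped soil layer on rigid rock.

    |H(f)| ~ 1 / sqrt(cos^2(kH) + (damping * kH)^2), with kH = 2*pi*f*H/Vs.
    The response peaks near the fundamental frequency f0 = Vs / (4H).
    """
    kh = 2.0 * math.pi * f * thickness / vs
    return 1.0 / math.sqrt(math.cos(kh) ** 2 + (damping * kh) ** 2)

# Example profile: 30 m of soil, Vs = 300 m/s, 5% damping.
f0 = 300.0 / (4.0 * 30.0)                       # fundamental frequency, 2.5 Hz
peak = layer_amplification(f0, 30.0, 300.0, 0.05)
```

    Iterating on the damping value in such a transfer function until it matches laboratory modulus/damping curves is the simplest analogue of the updating loop described in the abstract.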

  2. The Application of Carl Rogers' Person-Centered Learning Theory to Web-Based Instruction.

    ERIC Educational Resources Information Center

    Miller, Christopher T.

    This paper provides a review of literature that relates research on Carl Rogers' person-centered learning theory to Web-based learning. Based on the review of the literature, a set of criteria is described that can be used to determine how closely a Web-based course matches the different components of Rogers' person-centered learning theory. Using…

  3. [Peplau's theory of interpersonal relations: an analysis based on Barnum].

    PubMed

    de Almeida, Vitória de Cássia Félix; Lopes, Marcos Venícios de Oliveira; Damasceno, Marta Maria Coelho

    2005-06-01

    The use of theories in nursing reflects a movement in the profession towards autonomy and the delimitation of its actions. It is, therefore, extremely relevant that theories be analyzed for their applicability to practice. The object of this study is an analytical-descriptive examination of Peplau's Theory of Interpersonal Relations in Nursing using the model of analysis proposed by Barbara Barnum. Among the structural components that may be analyzed in a theory, the element "process", a method recommended for the development of nursing actions, was chosen and submitted to Barnum's criterion of usefulness. The assessment showed that Peplau's theoretical presuppositions are operational and may serve as a basis in any situation in which nurses communicate and interact with their patients. PMID:16060308

  4. Prior individual training and self-organized queuing during group emergency escape of mice from water pool.

    PubMed

    Saloma, Caesar; Perez, Gay Jane; Gavile, Catherine Ann; Ick-Joson, Jacqueline Judith; Palmes-Saloma, Cynthia

    2015-01-01

    We study the impact of prior individual training during group emergency evacuation using mice that escape from an enclosed water pool to a dry platform via any of two possible exits. Experimenting with mice avoids serious ethical and legal issues that arise when dealing with unwitting human participants while minimizing concerns regarding the reliability of results obtained from simulated experiments using 'actors'. First, mice were trained separately and their individual escape times measured over several trials. Mice learned quickly to swim towards an exit-they achieved their fastest escape times within the first four trials. The trained mice were then placed together in the pool and allowed to escape. No two mice had been in the pool together beforehand, and only one mouse could pass through an exit opening at any given time. At first trial, groups of trained mice escaped seven and five times faster than their corresponding control groups of untrained mice at pool occupancy rates ρ of 11.9% and 4%, respectively. Faster evacuation happened because trained mice: (a) had better recognition of the available pool space and took shorter escape routes to an exit, (b) were less likely to form arches that blocked an exit opening, and (c) utilized the two exits efficiently without preference. Trained groups achieved continuous egress without an apparent leader-coordinator (self-organized queuing)-a collective behavior not experienced during individual training. Queuing was unobserved in untrained groups, where mice were prone to wall seeking, aimless swimming and/or blind copying that produced circuitous escape routes, biased exit use and clogging. The experiments also reveal that faster and less costly group training at ρ = 4% yielded an average individual escape time comparable with individualized training, whereas group training in a more crowded pool (ρ = 11.9%) produced a longer average individual escape time. PMID:25693170

  5. Toward A Brain-Based Theory of Beauty

    PubMed Central

    Ishizu, Tomohiro; Zeki, Semir

    2011-01-01

    We wanted to learn whether activity in the same area(s) of the brain correlates with the experience of beauty derived from different sources. 21 subjects took part in a brain-scanning experiment using functional magnetic resonance imaging. Prior to the experiment, they viewed pictures of paintings and listened to musical excerpts, both of which they rated on a scale of 1-9, with 9 being the most beautiful. This allowed us to select three sets of stimuli (beautiful, indifferent and ugly) which subjects viewed and heard in the scanner, and rated at the end of each presentation. The results of a conjunction analysis of brain activity showed that, of the several areas that were active with each type of stimulus, only one cortical area, located in the medial orbito-frontal cortex (mOFC), was active during the experience of musical and visual beauty, with the activity produced by the experience of beauty derived from either source overlapping almost completely within it. The strength of activation in this part of the mOFC was proportional to the declared intensity of the experience of beauty. We conclude that, as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources (musical and visual), and probably by other sources as well. This has led us to formulate a brain-based theory of beauty. PMID:21755004

  6. Reconceptualizing the Theory-Base of Educational Technology: Re-opening the Theory-Practice Debates.

    ERIC Educational Resources Information Center

    Koetting, J. Randall

    Freire's model of emancipatory education is one alternative to the behaviorist theory of education predominant in the field of educational technology. The educational context, the teaching/learning situation, is an extremely complex situation. Reducing this situation to a question of inputs and outputs oversimplifies the many facets of education.…

  7. Predicting the Number of Public Computer Terminals Needed for an On-Line Catalog: A Queuing Theory Approach.

    ERIC Educational Resources Information Center

    Knox, A. Whitney; Miller, Bruce A.

    1980-01-01

    Describes a method for estimating the number of cathode ray tube terminals needed for public use of an online library catalog. Authors claim method could also be used to estimate needed numbers of microform readers for a computer output microform (COM) catalog. Formulae are included. (Author/JD)

  8. Enhancing Student Learning in Knowledge-Based Courses: Integrating Team-Based Learning in Mass Communication Theory Classes

    ERIC Educational Resources Information Center

    Han, Gang; Newell, Jay

    2014-01-01

    This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an…

  10. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

  12. A Theory-Driven Integrative Process/Outcome Evaluation of a Concept-Based Nursing Curriculum

    ERIC Educational Resources Information Center

    Fromer, Rosemary F.

    2013-01-01

    The current trend in curriculum revision in nursing education is concept-based learning, but little research has been done on concept-based curricula in nursing education. The study used a theory-driven integrative process/outcome evaluation. Embedded in this theory-driven integrative process/outcome evaluation was a causal comparative…

  13. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  15. Receiver-Coupling Schemes Based On Optimal-Estimation Theory

    NASA Technical Reports Server (NTRS)

    Kumar, Rajendra

    1992-01-01

    Two schemes for the reception of weak radio signals conveying digital data via phase modulation provide for mutual coupling of multiple receivers and coherent combination of the receiver outputs. In both schemes, the optimal mutual-coupling weights are computed according to Kalman-filter theory; the schemes differ in the manner in which the receiver outputs are transmitted and combined.
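
    The core idea of optimal combining, that each receiver's output is weighted in proportion to its reliability, can be illustrated with the static inverse-variance special case. This sketch is a simplification of the full Kalman-filter weight computation described in the abstract, with example measurement values chosen for illustration.

```python
def inverse_variance_combine(estimates, variances):
    """Combine independent noisy estimates of one quantity.

    Weights w_i = (1/var_i) / sum_j (1/var_j) minimize the variance of
    the combined estimate; this is the static analogue of Kalman
    weighting. Returns (combined_estimate, combined_variance).
    """
    inv = [1.0 / v for v in variances]
    total = sum(inv)
    estimate = sum(x * w for x, w in zip(estimates, inv)) / total
    return estimate, 1.0 / total
```

    For example, two receivers reporting 1.0 and 2.0 with noise variances 1.0 and 4.0 get weights 0.8 and 0.2, giving a combined estimate of 1.2 with variance 0.8, lower than either input's variance.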

  16. Effective Contraceptive Use: An Exploration of Theory-Based Influences

    ERIC Educational Resources Information Center

    Peyman, N.; Oakley, D.

    2009-01-01

    The purpose of this study was to explore factors that influence oral contraceptive (OC) use among women in Iran using the Theory of Planned Behavior (TPB) and concept of self-efficacy (SE). The study sample consisted of 360 married OC users, aged 18-49 years recruited at public health centers of Mashhad, 900 km east of Tehran. SE had the strongest…

  17. Logical Thinking in Children; Research Based on Piaget's Theory.

    ERIC Educational Resources Information Center

    Sigel, Irving E., Ed.; Hooper, Frank H., Ed.

    Theoretical and empirical research derived from Piagetian theory is collected on the intellectual development of the elementary school child and his acquisition and utilization of conservation concepts. The articles present diversity of method and motive in the results of replication (validation studies of the description of cognitive growth) and…

  18. Stability Analysis for Car Following Model Based on Control Theory

    NASA Astrophysics Data System (ADS)

    Meng, Xiang-Pei; Li, Zhi-Peng; Ge, Hong-Xia

    2014-05-01

    Stability analysis is one of the key issues in car-following theory. The stability analysis with Lyapunov function for the two velocity difference car-following model (for short, TVDM) is conducted and the control method to suppress traffic congestion is introduced. Numerical simulations are given and results are consistent with the theoretical analysis.

  19. Renormalization group method based on the ionization energy theory

    SciTech Connect

    Arulsamy, Andrew Das

    2011-03-15

    Proofs are developed to explicitly show that the ionization energy theory is a renormalized theory, which mathematically exactly satisfies the renormalization group formalisms developed by Gell-Mann-Low, Shankar and Zinn-Justin. However, the cutoff parameter for the ionization energy theory relies on the energy-level spacing, instead of lattice point spacing in k-space. Subsequently, we apply the earlier proofs to prove that the mathematical structure of the ionization-energy dressed electron-electron screened Coulomb potential is exactly the same as the ionization-energy dressed electron-phonon interaction potential. The latter proof is proven by means of the second-order time-independent perturbation theory with the heavier effective mass condition, as required by the electron-electron screened Coulomb potential. The outcome of this proof is that we can derive the heat capacity and the Debye frequency as a function of ionization energy, which can be applied in strongly correlated matter and nanostructures.

  20. Videogames, Tools for Change: A Study Based on Activity Theory

    ERIC Educational Resources Information Center

    Méndez, Laura; Lacasa, Pilar

    2015-01-01

    Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…

  3. Capacity and delay estimation for roundabouts using conflict theory.

    PubMed

    Qu, Zhaowei; Duan, Yuzhou; Hu, Hongyu; Song, Xianmin

    2014-01-01

    To estimate the capacity of roundabouts more accurately, the priority rank of each stream is determined through the classification technique given in the Highway Capacity Manual 2010 (HCM2010), which is based on a macroscopic analysis of the relationship between entry flow and circulating flow. A conflict matrix is then established using the additive conflict flow method, considering the impacts of traffic characteristics and of limited priority under high volume. Correspondingly, the conflict relationships among streams are modeled using probability theory. On this basis, the entry capacity model of roundabouts is built, and a sensitivity analysis is conducted on the model parameters. Finally, the entrance delay model is derived using queuing theory, and the proposed capacity model is compared with the model proposed by Wu and with that in the HCM2010. The results show that the capacity calculated by the proposed model is lower than the others for an A-type roundabout, while it is basically consistent with the estimated values from HCM2010 for a B-type roundabout. PMID:24982982
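
    For reference, the HCM2010 single-lane roundabout entry capacity is an exponential function of the conflicting (circulating) flow, and a first-cut average delay can be taken from a simple queuing approximation. The sketch below uses the HCM2010 single-lane capacity form together with an M/M/1-style delay as a simplified illustration, not the paper's conflict-matrix model.

```python
import math

def hcm2010_entry_capacity(circulating_flow_pcph):
    """HCM2010 single-lane roundabout entry capacity (pc/h)."""
    return 1130.0 * math.exp(-1.0e-3 * circulating_flow_pcph)

def mm1_average_delay(entry_flow_pcph, capacity_pcph):
    """Rough average delay (s/veh) from an M/M/1 queuing approximation.

    W = 1 / (mu - lambda) with rates in veh/h, converted to seconds.
    Valid only for undersaturated entries (flow < capacity).
    """
    if entry_flow_pcph >= capacity_pcph:
        raise ValueError("oversaturated entry: steady-state delay unbounded")
    return 3600.0 / (capacity_pcph - entry_flow_pcph)
```

    With a circulating flow of 500 pc/h the capacity comes out near 685 pc/h, and an entry flow of 400 pc/h then yields an average delay on the order of 13 s/veh under this crude approximation.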

  5. Local control theory in trajectory-based nonadiabatic dynamics

    SciTech Connect

    Curchod, Basile F. E.; Penfold, Thomas J.; Rothlisberger, Ursula; Tavernelli, Ivano

    2011-10-15

    In this paper, we extend the implementation of nonadiabatic molecular dynamics within the framework of time-dependent density-functional theory in an external field described in Tavernelli et al.[Phys. Rev. A 81, 052508 (2010)] by calculating on-the-fly pulses to control the population transfer between electronic states using local control theory. Using Tully's fewest switches trajectory surface hopping method, we perform MD to control the photoexcitation of LiF and compare the results to quantum dynamics (QD) calculations performed within the Heidelberg multiconfiguration time-dependent Hartree package. We show that this approach is able to calculate a field that controls the population transfer between electronic states. The calculated field is in good agreement with that obtained from QD, and the differences that arise are discussed in detail.

  6. Ground Movement Analysis Based on Stochastic Medium Theory

    PubMed Central

    Fei, Meng; Li-chun, Wu; Jia-sheng, Zhang; Guo-dong, Deng; Zhi-hui, Ni

    2014-01-01

    In order to calculate the ground movement induced by displacement piles driven into horizontal layered strata, an axisymmetric model was built and the vertical and horizontal ground movement functions were deduced using stochastic medium theory. Results show that the vertical ground movement follows a normal distribution function, while the horizontal ground movement follows an exponential function. Utilizing field measured data, the parameters of these functions can be obtained by back analysis, and an example was employed to verify the model. Results show that stochastic medium theory is suitable for calculating the ground movement in pile driving, and there is no need to consider the constitutive model of the soil or the contact between pile and soil. This method is applicable in practice. PMID:24701184
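
    The back-analysis step can be sketched numerically. As a hedged illustration: taking the vertical movement profile as Gaussian in radial offset, its two parameters follow from a linear fit in log space; the profile shape and "measured" values below are synthetic, not the paper's data.

```python
import numpy as np

# Hedged sketch of the back-analysis step: taking the vertical movement
# profile as Gaussian in radial offset, w(r) = A * exp(-r^2 / (2 s^2)),
# its parameters follow from a linear fit of ln(w) against r^2. The
# "measured" profile below is synthetic and noise-free.

r = np.linspace(0.0, 10.0, 11)              # offset from the pile axis (m)
w = 12.0 * np.exp(-r**2 / (2.0 * 3.0**2))   # settlement (mm)

X = np.column_stack([np.ones_like(r), r**2])
c0, c1 = np.linalg.lstsq(X, np.log(w), rcond=None)[0]

A_hat = np.exp(c0)                  # recovered trough depth -> 12.0 mm
s_hat = np.sqrt(-1.0 / (2.0 * c1))  # recovered trough width -> 3.0 m
```

    With noisy field data the same fit yields least-squares estimates rather than exact recovery.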

  7. Fragment-Based Time-Dependent Density Functional Theory

    NASA Astrophysics Data System (ADS)

    Mosquera, Martín A.; Jensen, Daniel; Wasserman, Adam

    2013-07-01

    Using the Runge-Gross theorem that establishes the foundation of time-dependent density functional theory, we prove that for a given electronic Hamiltonian, choice of initial state, and choice of fragmentation, there is a unique single-particle potential (dubbed time-dependent partition potential) which, when added to each of the preselected fragment potentials, forces the fragment densities to evolve in such a way that their sum equals the exact molecular density at all times. This uniqueness theorem suggests new ways of computing the time-dependent properties of electronic systems via fragment-time-dependent density functional theory calculations. We derive a formally exact relationship between the partition potential and the total density, and illustrate our approach on a simple model system for binary fragmentation in a laser field.

  8. Circuit theory and model-based inference for landscape connectivity

    USGS Publications Warehouse

    Hanks, Ephraim M.; Hooten, Mevin B.

    2013-01-01

    Circuit theory has seen extensive recent use in the field of ecology, where it is often applied to study functional connectivity. The landscape is typically represented by a network of nodes and resistors, with the resistance between nodes a function of landscape characteristics. The effective distance between two locations on a landscape is represented by the resistance distance between the nodes in the network. Circuit theory has been applied to many other scientific fields for exploratory analyses, but parametric models for circuits are not common in the scientific literature. To model circuits explicitly, we demonstrate a link between Gaussian Markov random fields and contemporary circuit theory using a covariance structure that induces the necessary resistance distance. This provides a parametric model for second-order observations from such a system. In the landscape ecology setting, the proposed model provides a simple framework where inference can be obtained for effects that landscape features have on functional connectivity. We illustrate the approach through a landscape genetics study linking gene flow in alpine chamois (Rupicapra rupicapra) to the underlying landscape.
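
    The resistance-distance quantity at the core of this approach can be sketched numerically. The tiny landscape graph and unit conductances below are illustrative, not from the study; the computation uses the standard Laplacian-pseudoinverse formula.

```python
import numpy as np

# Hedged sketch: the resistance distance between two nodes, computed from
# the Moore-Penrose pseudoinverse of the graph Laplacian. The 3-node
# landscape graph and unit conductances are illustrative only.

def resistance_distance(conductance, i, j):
    """Effective resistance between nodes i and j of a resistor network."""
    L = np.diag(conductance.sum(axis=1)) - conductance  # graph Laplacian
    Lp = np.linalg.pinv(L)
    return Lp[i, i] + Lp[j, j] - 2.0 * Lp[i, j]

# triangle of unit-conductance resistors: the direct 1-ohm path in
# parallel with the 2-ohm path through the third node
C = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
r01 = resistance_distance(C, 0, 1)  # -> 2/3 ohm
```

    In the landscape setting, conductances would be functions of landscape covariates, which is what links the circuit to a parametric model.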

  9. Meaning-based group counseling for bereavement: bridging theory with emerging trends in intervention research.

    PubMed

    MacKinnon, Christopher J; Smith, Nathan Grant; Henry, Melissa; Berish, Mel; Milman, Evgenia; Körner, Annett; Copeland, Laura S; Chochinov, Harvey M; Cohen, S Robin

    2014-01-01

    A growing body of scholarship has evaluated the usefulness of meaning-based theories in the context of bereavement counseling. Although scholars have discussed the application of meaning-based theories for individual practice, there is a lack of inquiry regarding its implications when conducting bereavement support groups. The objective of this article is to bridge meaning-based theories with bereavement group practice, leading to a novel intervention and laying the foundation for future efficacy studies. Building on recommendations specified in the literature, this article outlines the theoretical paradigms and structure of a short-term meaning-based group counseling intervention for uncomplicated bereavement. PMID:24524541

  10. The viscoplasticity theory based on overstress applied to the modeling of a nickel base superalloy at 815 C

    NASA Technical Reports Server (NTRS)

    Krempl, E.; Lu, H.; Yao, D.

    1988-01-01

    Short term strain rate change, creep and relaxation tests were performed in an MTS computer controlled servohydraulic testing machine. Aging and recovery were found to be insignificant for test times not exceeding 30 hrs. The material functions and constants of the theory were identified from results of strain rate change tests. Numerical integration of the theory for relaxation and creep tests showed good predictive capabilities of the viscoplasticity theory based on overstress.

  11. Wireless network traffic modeling based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Liu, Chunfeng; Shu, Yantai; Yang, Oliver W. W.; Liu, Jiakun; Dong, Linfang

    2006-10-01

    In this paper, Extreme Value Theory (EVT) is applied to the analysis of wireless network traffic. EVT provides statistically rigorous procedures for estimating the extreme behavior of random processes. There are two primary methods for studying extremes: the Block Maximum (BM) method and the Peaks Over Threshold (POT) method. Using only the traffic data that exceed a threshold value, our experiments and analysis show that the wireless network traffic model obtained with EVT fits the empirical traffic distribution well, illustrating that EVT holds considerable promise for the analysis of wireless network traffic.
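
    The POT step described above can be illustrated with a minimal sketch. The synthetic traffic trace and the simplifying choice of an exponential tail (a generalized Pareto with zero shape) are assumptions for illustration; the paper fits full EVT models to real traces.

```python
import numpy as np

# Hedged sketch of the peaks-over-threshold (POT) step on a synthetic
# traffic trace. For brevity the tail is fit as an exponential (a
# generalized Pareto with zero shape); the paper fits full EVT models.

rng = np.random.default_rng(42)
traffic = rng.exponential(scale=100.0, size=10_000)  # synthetic load samples

threshold = np.quantile(traffic, 0.95)
excesses = traffic[traffic > threshold] - threshold  # data kept by POT

scale_hat = excesses.mean()               # MLE of the exponential tail scale
p_exceed = excesses.size / traffic.size   # empirical exceedance probability

# estimated load level exceeded about once per 10,000 samples
level = threshold + scale_hat * np.log(p_exceed * 10_000)
```

    The point of POT is visible here: only the 5% of samples above the threshold inform the tail estimate.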

  12. Game theory based band selection for hyperspectral images

    NASA Astrophysics Data System (ADS)

    Shi, Aiye; He, Zhenyu; Huang, Fengchen

    2015-12-01

    This paper proposes a new evaluation criterion for band selection in hyperspectral imagery. The criterion combines information content with class separability, while the correlation between bands serves as a constraint condition. In addition, game theory is introduced into the band selection process to reconcile the potential conflict between these two criteria when searching for the optimal band combination. Experimental results show that the proposed method is effective on AVIRIS hyperspectral data.
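
    A toy version of the combined criterion might look as follows. The variance proxy for information, the penalty weight, and the synthetic bands are all invented; the paper additionally uses class separability and a game-theoretic coordination step rather than this simple soft penalty.

```python
import numpy as np

# Hedged toy sketch of a greedy band selection that trades an information
# score against inter-band correlation. The variance proxy, penalty weight,
# and synthetic bands are invented; the paper's criterion also uses class
# separability with a game-theoretic coordination of the criteria.

rng = np.random.default_rng(0)
bands = rng.normal(size=(6, 200))          # 6 bands x 200 pixels
bands[0] *= 1.5                            # a high-information band...
bands[1] = bands[0] + 0.05 * rng.normal(size=200)  # ...and a near-duplicate

info = bands.var(axis=1)                   # information proxy per band
corr = np.abs(np.corrcoef(bands))          # inter-band correlation

selected = [int(np.argmax(info))]          # start from the richest band
while len(selected) < 3:
    penalty = corr[:, selected].max(axis=1)
    score = info - 2.0 * penalty           # correlation as a soft constraint
    score[selected] = -np.inf              # never reselect a band
    selected.append(int(np.argmax(score)))
```

    The correlation penalty keeps the near-duplicate band out of the selection even though its information score is high.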

  13. Interactive Image Segmentation Framework Based On Control Theory

    PubMed Central

    Zhu, Liangjia; Kolesov, Ivan; Karasev, Peter; Tannenbaum, Allen

    2016-01-01

    Segmentation of anatomical structures in medical imagery is a key step in a variety of clinical applications. Designing a generic, automated method that works for various structures and imaging modalities is a daunting task. Instead of proposing a new specific segmentation algorithm, in this paper, we present a general design principle on how to integrate user interactions from the perspective of control theory. In this formulation, Lyapunov stability analysis is employed to design and analyze an interactive segmentation system. The effectiveness and robustness of the proposed method are demonstrated. PMID:26900204

  14. Interactive image segmentation framework based on control theory

    NASA Astrophysics Data System (ADS)

    Zhu, Liangjia; Kolesov, Ivan; Ratner, Vadim; Karasev, Peter; Tannenbaum, Allen

    2015-03-01

    Segmentation of anatomical structures in medical imagery is a key step in a variety of clinical applications. Designing a generic, automated method that works for various structures and imaging modalities is a daunting task. Instead of proposing a new specific segmentation algorithm, in this paper, we present a general design principle on how to integrate user interactions from the perspective of control theory. In this formulation, Lyapunov stability analysis is employed to design an interactive segmentation system. The effectiveness and robustness of the proposed method are demonstrated.

  15. A precepted leadership course based on Bandura's social learning theory.

    PubMed

    Haddock, K S

    1994-01-01

    Transition from student to registered nurse (RN) has long been cited as a difficult time for new graduates entering health care. Bandura's (1977) theory of social learning guided a revision of a nursing leadership course required of baccalaureate student nurses (BSNs) in their final semester. The preceptorship allowed students to work closely with and to practice modeled behaviors of RNs and then receive feedback and reinforcement from both the preceptor and the supervising faculty member. Students were thus prepared to function better in the reality of the practice setting. Positive outcomes were experienced by students, BSN preceptors, faculty, and nurse administrators. PMID:7997295

  16. A theory-based approach to teaching young children about health: A recipe for understanding

    PubMed Central

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237

  17. Assembly models for Papovaviridae based on tiling theory

    NASA Astrophysics Data System (ADS)

    Keef, T.; Taormina, A.; Twarock, R.

    2005-09-01

    A vital constituent of a virus is its protein shell, called the viral capsid, that encapsulates and hence provides protection for the viral genome. Assembly models are developed for viral capsids built from protein building blocks that can assume different local bonding structures in the capsid. This situation occurs, for example, for viruses in the family of Papovaviridae, which are linked to cancer and are hence of particular interest for the health sector. More specifically, the viral capsids of the (pseudo-) T = 7 particles in this family consist of pentamers that exhibit two different types of bonding structures. While this scenario cannot be described mathematically in terms of Caspar-Klug theory (Caspar D L D and Klug A 1962 Cold Spring Harbor Symp. Quant. Biol. 27 1), it can be modelled via tiling theory (Twarock R 2004 J. Theor. Biol. 226 477). The latter is used to encode the local bonding environment of the building blocks in a combinatorial structure, called the assembly tree, which is a basic ingredient in the derivation of assembly models for Papovaviridae along the lines of the equilibrium approach of Zlotnick (Zlotnick A 1994 J. Mol. Biol. 241 59). A phase space formalism is introduced to characterize the changes in the assembly pathways and intermediates triggered by the variations in the association energies characterizing the bonds between the building blocks in the capsid. Furthermore, the assembly pathways and concentrations of the statistically dominant assembly intermediates are determined. The example of Simian virus 40 is discussed in detail.

  18. Cooperative Learning: Improving University Instruction by Basing Practice on Validated Theory

    ERIC Educational Resources Information Center

    Johnson, David W.; Johnson, Roger T.; Smith, Karl A.

    2014-01-01

    Cooperative learning is an example of how theory validated by research may be applied to instructional practice. The major theoretical base for cooperative learning is social interdependence theory. It provides clear definitions of cooperative, competitive, and individualistic learning. Hundreds of research studies have validated its basic…

  19. Applying Ecological Theory to Advance the Science and Practice of School-Based Prejudice Reduction Interventions

    ERIC Educational Resources Information Center

    McKown, Clark

    2005-01-01

    Several school-based racial prejudice-reduction interventions have demonstrated some benefit. Ecological theory serves as a framework within which to understand the limits and to enhance the efficacy of prejudice-reduction interventions. Using ecological theory, this article examines three prejudice-reduction approaches, including social cognitive…

  20. The TEACH Method: An Interactive Approach for Teaching the Needs-Based Theories Of Motivation

    ERIC Educational Resources Information Center

    Moorer, Cleamon, Jr.

    2014-01-01

    This paper describes an interactive approach for explaining and teaching the Needs-Based Theories of Motivation. The acronym TEACH stands for Theory, Example, Application, Collaboration, and Having Discussion. This method can help business students to better understand and distinguish the implications of Maslow's Hierarchy of Needs,…

  1. Social Learning Theory Parenting Intervention Promotes Attachment-Based Caregiving in Young Children: Randomized Clinical Trial

    ERIC Educational Resources Information Center

    O'Connor, Thomas G.; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen

    2013-01-01

    Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear if the benefits of these interventions extend beyond aspects of the parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social…

  3. A Model of Rater Behavior in Essay Grading Based on Signal Detection Theory

    ERIC Educational Resources Information Center

    DeCarlo, Lawrence T.

    2005-01-01

    An approach to essay grading based on signal detection theory (SDT) is presented. SDT offers a basis for understanding rater behavior with respect to the scoring of constructed responses, in that it provides a theory of the psychological processes underlying the raters' behavior. The approach also provides measures of the precision of the raters and the…
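
    The two basic equal-variance SDT quantities behind such rater models can be computed directly. This is a minimal sketch with invented hit and false-alarm rates; DeCarlo's model is a fuller latent-trait formulation for ordinal essay scores.

```python
from statistics import NormalDist

# Hedged sketch of the core equal-variance SDT quantities behind such
# rater models: discrimination (d') and criterion (c) from hit and
# false-alarm rates. The rates below are invented for illustration.

def sdt_measures(hit_rate, fa_rate):
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)             # rater precision
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias
    return d_prime, criterion

d, c = sdt_measures(0.84, 0.16)  # symmetric rates -> unbiased rater (c = 0)
```

    A larger d' corresponds to a more precise rater; a nonzero c indicates systematic leniency or severity.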

  5. Curriculum Design for Junior Life Sciences Based Upon the Theories of Piaget and Skinner. Final Report.

    ERIC Educational Resources Information Center

    Pearce, Ella Elizabeth

    Four seventh grade life science classes, given curriculum materials based upon Piagetian theories of intellectual development and Skinner's theories of secondary reinforcement, were compared with four control classes from the same school districts. Nine students from each class, who (at the pretest) were at the concrete operations stage of…

  7. A reactive mobile robot based on a formal theory of action

    SciTech Connect

    Baral, C.; Floriano, L.; Gabaldon, A.

    1996-12-31

    One of the agendas behind research in reasoning about actions is to develop autonomous agents (robots) that can act in a dynamic world. Early attempts to use theories of reasoning about actions and planning to formulate a robot control architecture were unsuccessful for several reasons: the early theories, based on STRIPS and its extensions, allowed only observations about the initial state. A robot control architecture using these theories was usually of the form: (i) make observations, (ii) use the action theory to construct a plan to achieve the goal, and (iii) execute the plan.

  8. Lens-based theory of the Lau effect.

    PubMed

    Kolodziejczyk, A; Jaroszewicz, Z; Henao, R

    2000-04-01

    A new theoretical model of the Lau effect is presented. The transmittance of a diffraction grating can be expressed in an equivalent form as the sum of transmittances of thin cylindrical lenses. Therefore it is possible to explain the Lau effect on the basis of the well-known imaging properties of lenses. According to the given approach, the Lau fringes are created by overlapped images of the first grating that are formed by a set of lenses corresponding to the second grating in the setup. The theory leads to an exhaustive description of the Lau-effect parameters. In particular, one can indicate the shape of the Lau fringes and localize planes of the fringes dependent on the axial distance between gratings and their periods. PMID:10757179

  9. Prediction of planing craft motion based on grey system theory

    NASA Astrophysics Data System (ADS)

    Shen, Jihong; Zhang, Changbin; Chai, Yanyou; Zou, Jin

    2011-06-01

    In order to minimize the harm caused by the instability of a planing craft, a motion prediction model is essential. This paper analyzed the feasibility of using an MGM(1, N) model from grey system theory to predict planing craft motion and carried out numerical simulation experiments. According to the characteristics of planing craft motion, a recurrence formula was proposed for the parameter matrix of the MGM(1, N) model. Using this formula, data can be updated in real time without significantly increasing computational complexity. The results of numerical simulation show that using an MGM(1, N) model to predict planing craft motion is feasible and useful. The method proposed in this study thus successfully reflects the planing craft's motion mechanism and provides rational, effective forecasting and trend analysis.
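
    The univariate GM(1, 1) relative of the MGM(1, N) model is small enough to sketch in full. The data series below is synthetic (10% growth per step); this is the textbook grey-model construction, not the paper's multivariate recurrence formula.

```python
import numpy as np

# Hedged sketch: the univariate GM(1,1) relative of the MGM(1,N) model,
# in its textbook form. The series below is synthetic (10% growth); the
# paper's multivariate recurrence formula is not reproduced here.

def gm11_predict(x0, steps=1):
    """Fit GM(1,1) to the series x0 and forecast `steps` values ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                          # accumulated series
    z = 0.5 * (x1[1:] + x1[:-1])                # background values
    B = np.column_stack([-z, np.ones_like(z)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # grey parameters
    k = np.arange(x0.size + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # whitened solution
    x0_hat = np.diff(x1_hat, prepend=0.0)       # restore original series
    return x0_hat[x0.size:]

pred = gm11_predict([100.0, 110.0, 121.0, 133.1], steps=1)[0]  # ~146.3
```

    Grey models are attractive for this task precisely because they make usable forecasts from very short series.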

  10. Determination of the Sediment Carrying Capacity Based on Perturbed Theory

    PubMed Central

    Ni, Zhi-hui; Zeng, Qiang; Li-chun, Wu

    2014-01-01

    According to the previous studies of sediment carrying capacity, a new method of sediment carrying capacity on perturbed theory was proposed. By taking into account the average water depth, average flow velocity, settling velocity, and other influencing factors and introducing the median grain size as one main influencing factor in deriving the new formula, we established a new sediment carrying capacity formula. The coefficients were determined by the principle of dimensional analysis, multiple linear regression method, and the least square method. After that, the new formula was verified through measuring data of natural rivers and flume tests and comparing the verified results calculated by Cao Formula, Zhang Formula, Li Formula, Engelung-Hansen Formula, Ackers-White Formula, and Yang Formula. According to the compared results, it can be seen that the new method is of high accuracy. It could be a useful reference for the determination of sediment carrying capacity. PMID:25136652
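
    The coefficient-determination step (dimensional grouping plus least squares) can be sketched on synthetic data. The power-law grouping and every number below are invented for illustration and are not the paper's formula.

```python
import numpy as np

# Hedged sketch of the coefficient-fitting step: a power-law capacity
# formula S = K * (U^3 / (g*h*w))^m is linear in log space, so K and m
# follow from least squares. The grouping and the synthetic
# "measurements" are invented, not the paper's formula.

rng = np.random.default_rng(1)
U = rng.uniform(0.5, 3.0, 50)        # mean velocity (m/s)
h = rng.uniform(1.0, 8.0, 50)        # mean depth (m)
w = rng.uniform(0.01, 0.05, 50)      # settling velocity (m/s)
g = 9.81

x = U**3 / (g * h * w)                            # dimensionless group
S = 0.05 * x**1.2 * rng.lognormal(0.0, 0.05, 50)  # noisy observations

A = np.column_stack([np.ones_like(x), np.log(x)])
lnK, m = np.linalg.lstsq(A, np.log(S), rcond=None)[0]
K = np.exp(lnK)   # recovers K ~ 0.05, m ~ 1.2
```
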

  11. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site and the conventional Web sites groups. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions. PMID:20978408

  12. Item Response Theory-Based Approaches for Computing Minimum Passing Scores from an Angoff-Based Standard-Setting Study

    ERIC Educational Resources Information Center

    Ferdous, Abdullah A.; Plake, Barbara S.

    2008-01-01

    Even when the scoring of an examination is based on item response theory (IRT), standard-setting methods seldom use this information directly when determining the minimum passing score (MPS) for an examination from an Angoff-based standard-setting study. Often, when IRT scoring is used, the MPS value for a test is converted to an IRT-based theta…
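
    Mapping an Angoff expected-score cut onto an IRT ability scale can be sketched with a test characteristic curve. The 2PL item parameters and the cut score below are invented; they only illustrate the inversion, not any particular standard-setting study.

```python
import math

# Hedged sketch: converting an Angoff expected-score cut into an IRT
# theta by inverting the test characteristic curve (TCC) of a 2PL model.
# Item parameters and the cut score are invented for illustration.

items = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.4), (1.0, 1.0)]  # (a, b) per item

def tcc(theta):
    """Expected raw score at ability theta under the 2PL."""
    return sum(1.0 / (1.0 + math.exp(-a * (theta - b))) for a, b in items)

def theta_for_score(target, lo=-4.0, hi=4.0, tol=1e-8):
    """Invert the monotone TCC by bisection."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if tcc(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta_cut = theta_for_score(2.5)  # theta at an expected raw score of 2.5
```

    Because the TCC is strictly increasing, bisection always finds the unique theta corresponding to a feasible expected-score cut.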

  13. Interlaminar Stresses by Refined Beam Theories and the Sinc Method Based on Interpolation of Highest Derivative

    NASA Technical Reports Server (NTRS)

    Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander

    2010-01-01

    Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analysis by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single layer theories often results in inaccuracies near the boundaries and when the lamina have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as the refined zigzag theory.

  14. Theory and practical based approach to chronic total occlusions.

    PubMed

    Sianos, Georgios; Konstantinidis, Nikolaos V; Di Mario, Carlo; Karvounis, Haralambos

    2016-01-01

    Coronary chronic total occlusions (CTOs) represent the most technically challenging lesion subset that interventional cardiologists face. CTOs are identified in up to one third of patients referred for coronary angiography and remain seriously undertreated with percutaneous techniques. The complexity of these procedures and the suboptimal success rates over a long period of time, along with the perception that CTOs are lesions with limited scope for recanalization, account for the underutilization of CTO Percutaneous Coronary Intervention (PCI). In recent years, dedicated groups of experts in Japan, Europe, and the United States fostered the development and standardization of modern CTO recanalization techniques, achieving success rates well beyond 90% while coping with lesions of increasing complexity. Numerous studies support the rationale of CTO revascularization following documentation of viability and ischemia in the territory distal to the CTO. Successful CTO PCI provides better tolerance of future acute coronary syndromes and can significantly improve angina and left ventricular function. Randomized trials are under way to further explore the prognostic benefit of CTO revascularization. The following review reports on the theory and the most recent advances in the field of CTO recanalization, in an attempt to promote a more balanced approach in patients with chronically occluded coronary arteries. PMID:26860695

  15. Improved routing strategy based on gravitational field theory

    NASA Astrophysics Data System (ADS)

    Song, Hai-Quan; Guo, Jin

    2015-10-01

    Routing and path selection are crucial for many communication and logistic applications. We study the interaction between nodes and packets and, using gravitational field theory, establish a simple model describing the attraction of a node to a packet during transmission, considering both the actual and the potential congestion of the nodes. On the basis of this model, we propose a gravitational field routing strategy that considers the attractions exerted on a packet by all of the nodes on its travel path. To illustrate the efficiency of the proposed routing algorithm, we introduce an order parameter that measures network throughput via the critical value of the phase transition from a free-flow phase to a congested phase, and we study the distribution of betweenness centrality and traffic congestion. Simulations show that, compared with the shortest-path routing strategy, the gravitational field routing strategy considerably enhances the throughput of the network and balances the traffic load, with nearly all of the nodes used efficiently. Project supported by the Technology and Development Research Project of China Railway Corporation (Grant No. 2012X007-D) and the Key Program of Technology and Development Research Foundation of China Railway Corporation (Grant No. 2012X003-A).
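
    A toy next-hop rule conveys the idea. The inverse-square force law and the congestion discount below are assumptions for illustration, not the paper's model.

```python
# Hedged toy sketch of the idea: a candidate next hop attracts a packet
# more strongly the larger its capacity, and less strongly the farther
# away and the more congested it is. The inverse-square law and the
# congestion discount are invented, not the paper's model.

def attraction(capacity, queue_len, distance):
    """Gravitation-style pull of a node on a packet."""
    effective_mass = capacity / (1.0 + queue_len)  # congestion discount
    return effective_mass / distance**2

# candidates: (name, capacity, queue length, hops to destination)
candidates = [("A", 10.0, 8, 1), ("B", 6.0, 0, 2), ("C", 10.0, 0, 3)]

next_hop = max(candidates, key=lambda n: attraction(n[1], n[2], n[3]))[0]
# the congested nearby node A loses to the lighter but free-flowing B
```
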

  16. Integrated Models of School-Based Prevention: Logic and Theory

    ERIC Educational Resources Information Center

    Domitrovich, Celene E.; Bradshaw, Catherine P.; Greenberg, Mark T.; Embry, Dennis; Poduska, Jeanne M.; Ialongo, Nicholas S.

    2010-01-01

    School-based prevention programs can positively impact a range of social, emotional, and behavioral outcomes. Yet the current climate of accountability pressures schools to restrict activities that are not perceived as part of the core curriculum. Building on models from public health and prevention science, we describe an integrated approach to…

  17. Learning Trajectory Based Instruction: Toward a Theory of Teaching

    ERIC Educational Resources Information Center

    Sztajn, Paola; Confrey, Jere; Wilson, P. Holt; Edgington, Cynthia

    2012-01-01

    In this article, we propose a theoretical connection between research on learning and research on teaching through recent research on students' learning trajectories (LTs). We define learning trajectory based instruction (LTBI) as teaching that uses students' LTs as the basis for instructional decisions. We use mathematics as the context for our…

  18. A Conceptual Framework Based on Activity Theory for Mobile CSCL

    ERIC Educational Resources Information Center

    Zurita, Gustavo; Nussbaum, Miguel

    2007-01-01

    There is a need for collaborative group activities that promote student social interaction in the classroom. Handheld computers interconnected by a wireless network allow people who work on a common task to interact face to face while maintaining the mediation afforded by a technology-based system. Wirelessly interconnected handhelds open up new…

  19. Content Based Image Retrieval and Information Theory: A General Approach.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.; Barhen, Jacob

    2001-01-01

    Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…
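
    The entropy feature itself is a short computation over the gray-level histogram. This sketch treats a single intensity channel, with two synthetic images at the entropy extremes; per-channel application to color images is the article's setting.

```python
import numpy as np

# Hedged sketch of the feature itself: Shannon entropy of an image's
# gray-level histogram, one real value per channel. The two synthetic
# images below sit at the entropy extremes.

def image_entropy(pixels, levels=256):
    """Entropy (bits) of the intensity distribution of `pixels`."""
    hist = np.bincount(pixels.ravel().astype(np.int64), minlength=levels)
    p = hist / hist.sum()
    p = p[p > 0]                            # convention: 0 log 0 = 0
    return float(-(p * np.log2(p)).sum())

flat = np.zeros((8, 8), dtype=np.uint8)                  # constant image
varied = np.arange(256, dtype=np.uint8).reshape(16, 16)  # every level once

e_flat = image_entropy(flat)      # -> 0.0 bits
e_varied = image_entropy(varied)  # -> 8.0 bits
```

    A single scalar per channel is far more compact than a full histogram, which is the trade-off the article weighs against color histograms.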

  20. Integrated Models of School-Based Prevention: Logic and Theory

    ERIC Educational Resources Information Center

    Domitrovich, Celene E.; Bradshaw, Catherine P.; Greenberg, Mark T.; Embry, Dennis; Poduska, Jeanne M.; Ialongo, Nicholas S.

    2010-01-01

    School-based prevention programs can positively impact a range of social, emotional, and behavioral outcomes. Yet the current climate of accountability pressures schools to restrict activities that are not perceived as part of the core curriculum. Building on models from public health and prevention science, we describe an integrated approach to…

  1. Content Based Image Retrieval and Information Theory: A General Approach.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.; Barhen, Jacob

    2001-01-01

    Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…
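
    The entropy descriptor described above can be illustrated with a short sketch (our own illustration, not the authors' implementation): Shannon entropy of a quantized color histogram, here with 8 levels per channel:

```python
import math
from collections import Counter

def image_entropy(pixels, bins=8):
    """Shannon entropy (bits) of a quantized color distribution.

    pixels: iterable of (r, g, b) tuples with channel values in 0..255.
    bins: quantization levels per channel (8 -> up to 512 histogram cells).
    """
    step = 256 // bins
    hist = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    n = sum(hist.values())
    return -sum((c / n) * math.log2(c / n) for c in hist.values())

# A uniform image carries zero entropy; an even two-color split carries one bit.
flat = [(10, 10, 10)] * 100
mixed = [(0, 0, 0)] * 50 + [(255, 255, 255)] * 50
print(image_entropy(mixed))  # 1.0 bit
```

    Unlike a full color histogram, this collapses the distribution to a single real value, which is what makes it cheap to index and compare.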

  2. Non-fragile H∞ synchronization of memristor-based neural networks using passivity theory.

    PubMed

    Mathiyalagan, K; Anbuvithya, R; Sakthivel, R; Park, Ju H; Prakash, P

    2016-02-01

    In this paper, we formulate and investigate mixed H∞ and passivity-based synchronization criteria for memristor-based recurrent neural networks with time-varying delays. Some sufficient conditions are obtained to guarantee the synchronization of the considered neural network based on the master-slave concept, differential inclusions theory and Lyapunov-Krasovskii stability theory. Also, the memristive neural network is considered with two different types of memductance functions and two types of gain variations. The results for non-fragile observer-based synchronization are derived in terms of linear matrix inequalities (LMIs). Finally, the effectiveness of the proposed criterion is demonstrated through numerical examples. PMID:26655373

  3. Density functional theory investigations of graphene-based heterostructures

    NASA Astrophysics Data System (ADS)

    Ebnonnasir, Abbas

    Graphene, a two-dimensional single crystal of carbon atoms arranged in a honeycomb lattice, is attractive for applications in nanoelectromechanical devices; in high-performance, low-power electronics, and as transparent electrodes. The present study employs Density Functional Theory (DFT) to identify the atomic and electronic structure of graphene (Gr) on three different types of substrates: transition metals (nickel, palladium), insulators (hBN) and semiconductors (MoS2). Our DFT calculations show that graphene layer on Ni(111) and Ni(110) becomes metallic owing to large binding energies and strong hybridization between nickel and carbon bands. Furthermore, in Gr/Gr/palladium systems, we find that the electrostatic dipoles at the Gr/palladium and Gr/Gr interfaces are oppositely oriented. This leads to a work function of bilayer graphene domains on palladium (111) higher than that of monolayer graphene; the strengths of these dipoles are sensitive to the relative orientation between the two graphene layers and between the graphene and palladium (111). Additionally, the binding energy of graphene on palladium (111) depends on its orientation. We elucidate the physical origin of the effect of growing graphene on hBN/Ni(111) on the binding of hBN to a Ni(111) substrate, and on the electronic properties of hBN. We find that hBN/Ni has two configurational minima, one chemisorbed and one physisorbed, whose properties are not altered when graphene is placed atop hBN. However, a switch from chemisorbed to physisorbed hBN on Ni can occur due to the processing conditions during graphene growth; this switch is solely responsible for changing the hBN layer from metallic to insulating, and not the interactions with graphene. Finally, we find that the relative orientation between graphene and MoS2 layers affects the value and the nature of the bandgap of MoS2, while keeping the electronic structure of graphene unaltered. 
This relative orientation does not affect the binding energy or the distance between graphene and MoS2 layers. However, it changes the registry between the two layers, which strongly influences the value and type of the bandgap in MoS 2.

  4. A Proxy Signature Scheme Based on Coding Theory

    NASA Astrophysics Data System (ADS)

    Jannati, Hoda; Falahati, Abolfazl

    Proxy signature enables a proxy signer to sign messages on behalf of an original signer, and is used when the original signer is not available to sign a specific document. In this paper, we introduce a new proxy signature scheme based on Stern's identification scheme, whose security depends on the syndrome decoding problem. The proposed scheme is the first code-based proxy signature and remains usable in the presence of quantum computers. In this scheme, the operations to perform are linear and very simple; thus the signature is computed quickly and can be implemented on a smart card quite efficiently. The proposed scheme also satisfies the unforgeability, undeniability, non-transferability and distinguishability properties, which are the security requirements for a proxy signature.
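
    As a rough illustration of the syndrome decoding primitive that Stern-type schemes rest on (toy parameters; the parity-check matrix and error vector below are hypothetical, whereas real code-based schemes use large random matrices and fixed low-weight errors):

```python
def syndrome(H, e):
    """Syndrome s = H * e over GF(2); H is a list of rows (bit lists)."""
    return [sum(h_ij & e_j for h_ij, e_j in zip(row, e)) % 2 for row in H]

# Toy 3 x 6 parity-check matrix and a weight-2 error vector.
H = [
    [1, 0, 1, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 0, 1],
]
e = [1, 0, 0, 0, 1, 0]
# A Stern-style prover convinces a verifier it knows a low-weight e
# matching a public syndrome, without revealing e.
print(syndrome(H, e))  # [1, 1, 1]
```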

  5. Scale-invariant entropy-based theory for dynamic ordering

    SciTech Connect

    Mahulikar, Shripad P. E-mail: spm@aero.iitb.ac.in; Kumari, Priti

    2014-09-01

    Dynamically Ordered self-organized dissipative structure exists in various forms and at different scales. This investigation first introduces the concept of an isolated embedding system, which embeds an open system, e.g., dissipative structure and its mass and/or energy exchange with its surroundings. Thereafter, scale-invariant theoretical analysis is presented using thermodynamic principles for Order creation, existence, and destruction. The sustainability criterion for Order existence based on its structured mass and/or energy interactions with the surroundings is mathematically defined. This criterion forms the basis for the interrelationship of physical parameters during sustained existence of dynamic Order. It is shown that the sufficient condition for dynamic Order existence is approached if its sustainability criterion is met, i.e., its destruction path is blocked. This scale-invariant approach has the potential to unify the physical understanding of universal dynamic ordering based on entropy considerations.

  6. Transient response of lattice structures based on exact member theory

    NASA Technical Reports Server (NTRS)

    Anderson, Melvin S.

    1989-01-01

    The computer program BUNVIS-RG, which treats vibration and buckling of lattice structures using exact member stiffness matrices, has been extended to calculate the exact modal mass and stiffness quantities that can be used in a conventional transient response analysis based on modes. The exact nature of the development allows inclusion of local member response without introduction of any interior member nodes. Results are given for several problems in which significant interaction between local and global response occurs.

  7. Research on e-learning services based on ontology theory

    NASA Astrophysics Data System (ADS)

    Liu, Rui

    2013-07-01

    E-learning services can realize network learning resource sharing and interoperability, but they cannot realize automatic discovery, implementation and integration of services. This paper proposes a framework for e-learning services based on ontology; ontology technology is applied to the publication and discovery of e-learning services in order to realize accurate and efficient retrieval and utilization of e-learning services.

  8. Sensor-Based Collision Avoidance: Theory and Experiments

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun; Steele, Robert; Ivlev, Robert

    1996-01-01

    A new on-line control strategy for sensor-based collision avoidance of manipulators and supporting experimental results are presented in this article. This control strategy is based on nullification of virtual forces applied to the end-effector by a hypothetical spring-plus-damper attached to the object's surface. In the proposed approach, the real-time arm control software continuously monitors the object distance measured by the arm-mounted proximity sensors. When this distance is less than a preset threshold, the collision avoidance control action is initiated to inhibit motion toward the object and thus prevent collision. This is accomplished by employing an outer feedback loop to perturb the end-effector nominal motion trajectory in real-time based on the sensory data. The perturbation is generated by a proportional-plus-integral (PI) collision avoidance controller acting on the difference between the sensed distance and the preset threshold. This approach is computationally very fast, requires minimal modification to the existing manipulator positioning system, and provides the manipulator with an on-line collision avoidance capability to react autonomously and intelligently. A dexterous RRC robotic arm is instrumented with infrared proximity sensors and is operated under the proposed collision avoidance strategy. Experimental results are presented to demonstrate end-effector collision avoidance both with an approaching object and while reaching inside a constricted opening.
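
    A minimal sketch of the PI perturbation loop the abstract describes, with hypothetical gains, threshold, and sample time (the paper's actual controller parameters are not given here):

```python
def make_pi_avoidance(kp, ki, threshold, dt):
    """Return a controller mapping sensed distance -> trajectory perturbation.

    When the sensed distance drops below the threshold, a PI term on the
    error (threshold - distance) perturbs the nominal trajectory away from
    the object; otherwise the output is zero and the integrator is reset.
    """
    state = {"integral": 0.0}

    def step(distance):
        error = threshold - distance
        if error <= 0.0:          # object farther than threshold: no action
            state["integral"] = 0.0
            return 0.0
        state["integral"] += error * dt
        return kp * error + ki * state["integral"]

    return step

ctrl = make_pi_avoidance(kp=2.0, ki=0.5, threshold=0.10, dt=0.01)
for d in (0.20, 0.12, 0.08, 0.05):   # object approaching the end-effector
    print(f"distance={d:.2f} m -> perturbation={ctrl(d):.4f}")
```

    The integral term is what lets the perturbation keep growing while the object stays inside the threshold, mimicking the spring-plus-damper force nullification idea.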

  9. Models, strategies, and tools. Theory in implementing evidence-based findings into health care practice.

    PubMed

    Sales, Anne; Smith, Jeffrey; Curran, Geoffrey; Kochevar, Laura

    2006-02-01

    This paper presents a case for careful consideration of theory in planning to implement evidence-based practices into clinical care. As described, theory should be tightly linked to strategic planning through careful choice or creation of an implementation framework. Strategies should be linked to specific interventions and/or intervention components to be implemented, and the choice of tools should match the interventions and overall strategy, linking back to the original theory and framework. The thesis advanced is that in most studies where there is an attempt to implement planned change in clinical processes, theory is used loosely. An example of linking theory to intervention design is presented from a Mental Health Quality Enhancement Research Initiative effort to increase appropriate use of antipsychotic medication among patients with schizophrenia in the Veterans Health Administration. PMID:16637960

  10. An open-shell restricted Hartree-Fock perturbation theory based on symmetric spin orbitals

    NASA Technical Reports Server (NTRS)

    Lee, Timothy J.; Jayatilaka, Dylan

    1993-01-01

    A new open-shell perturbation theory is formulated in terms of symmetric spin orbitals. Only one set of spatial orbitals is required, thereby reducing the number of independent coefficients in the perturbed wavefunctions. For second order, the computational cost is shown to be similar to a closed-shell calculation. This formalism is therefore more efficient than the recently developed RMP, ROMP or RMP-MBPT theories. The perturbation theory described herein was designed to have a close correspondence with our recently proposed coupled-cluster theory based on symmetric spin orbitals. The first-order wavefunction contains contributions from only doubly excited determinants. Equilibrium structures and vibrational frequencies determined from second-order perturbation theory are presented for OH, NH, CH, 02, NH2 and CH2.

  11. Theory of zwitterionic molecular-based organic magnets

    NASA Astrophysics Data System (ADS)

    Shelton, William A.; Aprà, Edoardo; Sumpter, Bobby G.; Saraiva-Souza, Aldilene; Souza Filho, Antonio G.; Nero, Jordan Del; Meunier, Vincent

    2011-08-01

    We describe a class of organic molecular magnets based on zwitterionic molecules (betaine derivatives) possessing donor, π bridge, and acceptor groups. Using extensive electronic structure calculations we show the electronic ground state in these systems is magnetic. In addition, we show that the large energy differences computed for the various magnetic states indicate a high Néel temperature. The quantum mechanical nature of the magnetic properties originates from the conjugated π bridge (only p electrons) in cooperation with the molecular donor-acceptor character. The exchange interactions between electron spins are strong, local, and independent of the length of the π bridge.

  12. Venture Capital Investment Based on Grey Relational Theory

    NASA Astrophysics Data System (ADS)

    Zhang, Xubo

    This paper builds a venture capital investment project selection evaluation model based on risk-weighted investment return, using grey relational analysis. The risks and returns in the venture capital investment project selection process are analyzed; they are mainly concentrated in management ability, operation ability, market ability, exit and investment cost. The 18 sub-indicators are the impact factors contributing to these five evaluation aspects. Grey relational analysis is used to evaluate venture capital investment selection and to obtain the optimal solution of the risk-weighted, double-objective investment selection evaluation model. An example is used to demonstrate the model.
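
    The grade computation at the heart of grey relational analysis can be sketched as follows; the project scores and the conventional distinguishing coefficient ρ = 0.5 are illustrative, not the paper's data:

```python
def grey_relational_grades(reference, alternatives, rho=0.5):
    """Grey relational grade of each alternative vs. a reference sequence.

    reference: ideal indicator values (already normalized to [0, 1]).
    alternatives: dict name -> list of indicator values, same length.
    rho: distinguishing coefficient, conventionally 0.5.
    """
    deltas = {
        name: [abs(r - x) for r, x in zip(reference, xs)]
        for name, xs in alternatives.items()
    }
    all_d = [d for ds in deltas.values() for d in ds]
    d_min, d_max = min(all_d), max(all_d)
    grades = {}
    for name, ds in deltas.items():
        # Grey relational coefficient per indicator, then average to a grade.
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in ds]
        grades[name] = sum(coeffs) / len(coeffs)
    return grades

# Hypothetical normalized scores on five aspects: management, operation,
# market, exit, cost (1.0 = ideal on each aspect).
projects = {
    "A": [0.9, 0.7, 0.8, 0.6, 0.7],
    "B": [0.5, 0.9, 0.6, 0.8, 0.6],
}
print(grey_relational_grades([1.0] * 5, projects))
```

    The alternative with the highest grade is the one closest, in the grey relational sense, to the ideal reference project.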

  13. Safety models incorporating graph theory based transit indicators.

    PubMed

    Quintero, Liliana; Sayed, Tarek; Wahba, Mohamed M

    2013-01-01

    There is a considerable need for tools to enable the evaluation of the safety of transit networks at the planning stage. One interesting approach for the planning of public transportation systems is the study of networks. Network techniques involve the analysis of systems by viewing them as a graph composed of a set of vertices (nodes) and edges (links). Once the transport system is visualized as a graph, various network properties can be evaluated based on the relationships between the network elements. Several indicators can be calculated including connectivity, coverage, directness and complexity, among others. The main objective of this study is to investigate the relationship between network-based transit indicators and safety. The study develops macro-level collision prediction models that explicitly incorporate transit physical and operational elements and transit network indicators as explanatory variables. Several macro-level (zonal) collision prediction models were developed using a generalized linear regression technique, assuming a negative binomial error structure. The models were grouped into four main themes: transit infrastructure, transit network topology, transit route design, and transit performance and operations. The safety models showed that collisions were significantly associated with transit network properties such as: connectivity, coverage, overlapping degree and the Local Index of Transit Availability. As well, the models showed a significant relationship between collisions and some transit physical and operational attributes such as the number of routes, frequency of routes, bus density, length of bus and 3+ priority lanes. PMID:22831497

  14. [Training -- competency-based education -- learning theory and practice].

    PubMed

    Breuer, Georg

    2013-11-01

    A lifelong learning process is necessarily the basis for specialization and expertise in the field of anesthesiology. Competency as a physician is thus a complex, multidimensional construction of knowledge, skills and attitudes needed to solve and persist through complex daily work challenges in a flexible and responsible way. Experts therefore show flexible and intuitive capabilities in pursuing their profession. Accordingly, modern competency-based learning objectives are very helpful. The DGAI Commission for “Further Education” already thought ahead in defining a competency-based curriculum for specialization in the field of anesthesiology, which could be integrated into the frameworks of the German Medical Association. In addition to the curricular framework, elements of assessment are necessary. A single oral exam is consequently not representative of different levels of competency. Besides the responsibility of the learners for their own learning process, there is also a high obligation of the clinical teachers to attend to the learning process and to ensure a positive learning atmosphere with scope for feedback. Some competencies could potentially be better learned in a “sheltered” room based on simulation outside the OR, for example to train for rare incidents or emergency procedures. In general, there should be ongoing effort to enhance the process of expertise development, also in the context of patient safety and quality management. PMID:24343144

  15. Tire grip identification based on strain information: Theory and simulations

    NASA Astrophysics Data System (ADS)

    Carcaterra, A.; Roveri, N.

    2013-12-01

    A novel technique for the identification of tire-road grip conditions is presented. It is based on the use of strain information inside the tire, from which relevant characteristics of the tire-road contact can be extracted through a factor named the area slip ratio. This process forms the basis of a technology for grip identification that requires a new model of the tire dynamics. The model permits determination of closed-form analytical relationships between the measured strain and the area slip ratio. On this basis, a procedure that can extract the contact kinematic parameter from the time history of the internal strain of the rolling tire is presented. Numerical simulations offer the chance to validate the identification algorithm.

  16. Game Theory Based Trust Model for Cloud Environment.

    PubMed

    Gokulnath, K; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at bootload level in a cloud computing environment. This work proposes a game-theoretic approach for achieving trust at bootload level from the perception of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the cold start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of a cloud user's application to a cloud service provider for segregating trust levels is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics like execution time, accuracy, error identification, and undecidability of the resources were considered. PMID:26380365
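
    To illustrate the pure-strategy Nash equilibrium computation that underlies this kind of trust evaluation (the payoff matrices below are hypothetical, not taken from the paper):

```python
import itertools

def pure_nash(payoff_a, payoff_b):
    """Pure-strategy Nash equilibria of a two-player bimatrix game.

    payoff_a[i][j] / payoff_b[i][j]: payoffs to players A and B when A
    plays row i and B plays column j. A cell is an equilibrium when
    neither player can gain by unilaterally deviating.
    """
    rows, cols = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i, j in itertools.product(range(rows), range(cols)):
        a_best = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
        b_best = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(cols))
        if a_best and b_best:
            equilibria.append((i, j))
    return equilibria

# Hypothetical trust game: rows = provider {honor SLA, violate},
# columns = user {comply, violate}. Mutual honesty is self-enforcing here.
provider = [[3, 1], [2, 0]]
user     = [[3, 2], [1, 0]]
print(pure_nash(provider, user))  # [(0, 0)]
```

    With these payoffs, (honor, comply) is the only cell where neither side benefits from deviating, which is the sense in which NE discourages SLA violations.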

  17. Game Theory Based Trust Model for Cloud Environment

    PubMed Central

    Gokulnath, K.; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at bootload level in a cloud computing environment. This work proposes a game-theoretic approach for achieving trust at bootload level from the perception of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the cold start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of a cloud user's application to a cloud service provider for segregating trust levels is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics like execution time, accuracy, error identification, and undecidability of the resources were considered. PMID:26380365

  18. Energy-Efficiency Analysis of a Distributed Queuing Medium Access Control Protocol for Biomedical Wireless Sensor Networks in Saturation Conditions

    PubMed Central

    Otal, Begonya; Alonso, Luis; Verikoukis, Christos

    2011-01-01

    The aging population and the high quality of life expectations in our society lead to the need of more efficient and affordable healthcare solutions. For this reason, this paper aims for the optimization of Medium Access Control (MAC) protocols for biomedical wireless sensor networks or wireless Body Sensor Networks (BSNs). The hereby presented schemes always have in mind the efficient management of channel resources and the overall minimization of sensors’ energy consumption in order to prolong sensors’ battery life. The fact that the IEEE 802.15.4 MAC does not fully satisfy BSN requirements highlights the need for the design of new scalable MAC solutions, which guarantee low-power consumption to the maximum number of body sensors in high density areas (i.e., in saturation conditions). In order to emphasize IEEE 802.15.4 MAC limitations, this article presents a detailed overview of this de facto standard for Wireless Sensor Networks (WSNs), which serves as a link for the introduction and initial description of our here proposed Distributed Queuing (DQ) MAC protocol for BSN scenarios. Within this framework, an extensive DQ MAC energy-consumption analysis in saturation conditions is presented to be able to evaluate its performance in relation to IEEE 802.15.4 MAC in highly dense BSNs. The obtained results show that the proposed scheme outperforms IEEE 802.15.4 MAC in average energy consumption per information bit, thus providing a better overall performance that scales appropriately to BSNs under high traffic conditions. These benefits are obtained by eliminating back-off periods and collisions in data packet transmissions, while minimizing the control overhead. PMID:22319351

  19. The Effect Of The Materials Based On Multiple Intelligence Theory Upon The Intelligence Groups' Learning Process

    NASA Astrophysics Data System (ADS)

    Oral, I.; Dogan, O.

    2007-04-01

    The aim of this study is to find out the effect of course materials based on Multiple Intelligence Theory upon the intelligence groups' learning processes. In conclusion, the results proved that materials prepared according to Multiple Intelligence Theory have a considerable effect on the students' learning process. This effect was particularly seen in the student groups with musical-rhythmic, verbal-linguistic, interpersonal-social and naturalist intelligences.

  20. Method for PE Pipes Fusion Jointing Based on TRIZ Contradictions Theory

    NASA Astrophysics Data System (ADS)

    Sun, Jianguang; Tan, Runhua; Gao, Jinyong; Wei, Zihui

    The core of the TRIZ theories is contradiction detection and solution. TRIZ provides various methods for contradiction solution, but they are not systematized. Combined with the technique system conception, this paper summarizes an integrated solution method for contradictions based on the TRIZ contradiction theory. According to the method, a flowchart of the integrated solution method for contradictions is given. As a case study, a method for fusion jointing of PE pipes is analyzed.

  1. Gauge Theory of Supergravity Based Only on a Self-Dual Spin Connection

    NASA Astrophysics Data System (ADS)

    Nieto, J. A.; Socorro, J.; Obregón, O.

    1996-05-01

    A gauge theory of supergravity is constructed based only on the supersymmetric self-dual spin connection associated to the supergroup OSp(1|4). We show that Jacobson's supergravity action arises naturally from our proposed action. It is formulated by taking the self-dual part of the MacDowell-Mansouri gauge theory of supergravity. In this sense, our quadratic action in the supersymmetric self-dual curvature tensor provides a relation between these two important previous extensions of supergravity.

  2. Microfluidic, Bead-Based Assay: Theory and Experiments

    PubMed Central

    Thompson, Jason A.; Bau, Haim H.

    2009-01-01

    Microbeads are frequently used as a solid support for biomolecules such as proteins and nucleic acids in heterogeneous microfluidic assays. However, relatively few studies investigate the binding kinetics on modified bead surfaces in a microfluidics context. In this study, a customized hot embossing technique is used to stamp microwells in a thin plastic substrate where streptavidin-coated agarose beads are selectively placed and subsequently immobilized within a conduit. Biotinylated quantum dots are used as a label to monitor target analyte binding to the bead's surface. Three-dimensional finite element simulations are carried out to model the binding kinetics on the bead's surface. The model accounts for surface exclusion effects resulting from a single quantum dot occluding multiple receptor sites. The theoretical predictions are compared and favorably agree with experimental observations. The theoretical simulations provide a useful tool to predict how varying parameters affect microbead reaction kinetics and sensor performance. This study enhances our understanding of bead-based microfluidic assays and provides a design tool for developers of point-of-care, lab-on-chip devices for medical diagnosis, food and water quality inspection, and environmental monitoring. PMID:19766545
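
    Reaction-limited binding on a bead surface is commonly described by Langmuir kinetics; a sketch under assumed rate constants (this is a zero-dimensional simplification, not the paper's three-dimensional finite element model, which also resolves transport and site occlusion):

```python
def langmuir_occupancy(k_on, k_off, conc, t_end, dt=0.1):
    """Fractional receptor occupancy theta(t) under Langmuir kinetics.

    d(theta)/dt = k_on * conc * (1 - theta) - k_off * theta,
    integrated with forward Euler from theta(0) = 0.
    """
    theta, t = 0.0, 0.0
    while t < t_end:
        theta += dt * (k_on * conc * (1.0 - theta) - k_off * theta)
        t += dt
    return theta

# Hypothetical rates: k_on = 1e5 1/(M s), k_off = 1e-3 1/s, analyte at 10 nM.
# Closed-form equilibrium occupancy for comparison: k_on*c / (k_on*c + k_off).
theta_eq = 1e5 * 1e-8 / (1e5 * 1e-8 + 1e-3)
theta_sim = langmuir_occupancy(k_on=1e5, k_off=1e-3, conc=1e-8, t_end=10_000.0)
print(theta_sim, theta_eq)
```

    Sweeping the assumed rates and concentration in such a sketch is the zero-dimensional analogue of the parameter studies the simulations in the paper enable.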

  3. Controlling Retrieval during Practice: Implications for Memory-Based Theories of Automaticity

    ERIC Educational Resources Information Center

    Wilkins, Nicolas J.; Rawson, Katherine A.

    2011-01-01

    Memory-based processing theories of automaticity assume that shifts from algorithmic to retrieval-based processing underlie practice effects on response times. The current work examined the extent to which individuals can exert control over the involvement of retrieval during skill acquisition and the factors that may influence control. In two…

  4. Score Reliability of a Test Composed of Passage-Based Testlets: A Generalizability Theory Perspective.

    ERIC Educational Resources Information Center

    Lee, Yong-Won

    The purpose of this study was to investigate the impact of local item dependence (LID) in passage-based testlets on the test score reliability of an English as a Foreign Language (EFL) reading comprehension test from the perspective of generalizability (G) theory. Definitions and causes of LID in passage-based testlets are reviewed within the…

  5. Using Game Theory and Competition-Based Learning to Stimulate Student Motivation and Performance

    ERIC Educational Resources Information Center

    Burguillo, Juan C.

    2010-01-01

    This paper introduces a framework for using Game Theory tournaments as a base to implement Competition-based Learning (CnBL), together with other classical learning techniques, to motivate the students and increase their learning performance. The paper also presents a description of the learning activities performed along the past ten years of a…

  6. Applying Item Response Theory Methods to Design a Learning Progression-Based Science Assessment

    ERIC Educational Resources Information Center

    Chen, Jing

    2012-01-01

    Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…

  7. A Comparison of Measurement Equivalence Methods Based on Confirmatory Factor Analysis and Item Response Theory.

    ERIC Educational Resources Information Center

    Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.

    Current interest in the assessment of measurement equivalence emphasizes two methods of analysis, linear, and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis or CFA) and nonlinear (item-response-theory-based differential item function or IRT-Based…

  8. Constraint-Based Modeling: From Cognitive Theory to Computer Tutoring--and Back Again

    ERIC Educational Resources Information Center

    Ohlsson, Stellan

    2016-01-01

    The ideas behind the constraint-based modeling (CBM) approach to the design of intelligent tutoring systems (ITSs) grew out of attempts in the 1980's to clarify how declarative and procedural knowledge interact during skill acquisition. The learning theory that underpins CBM was based on two conceptual innovations. The first innovation was to…

  9. An Instructional Design Theory for Interactions in Web-Based Learning Environments.

    ERIC Educational Resources Information Center

    Lee, Miyoung; Paulus, Trena

    This study developed and formatively evaluated an instructional design theory to guide designers in selecting when and how to utilize interactions as instructional methods in a Web-based distance learning higher education environment. Research questions asked: What are the types and outcomes of interactions between participants in a Web-based…

  10. Applying Item Response Theory Methods to Design a Learning Progression-Based Science Assessment

    ERIC Educational Resources Information Center

    Chen, Jing

    2012-01-01

    Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…

  11. Using Game Theory and Competition-Based Learning to Stimulate Student Motivation and Performance

    ERIC Educational Resources Information Center

    Burguillo, Juan C.

    2010-01-01

    This paper introduces a framework for using Game Theory tournaments as a base to implement Competition-based Learning (CnBL), together with other classical learning techniques, to motivate the students and increase their learning performance. The paper also presents a description of the learning activities performed along the past ten years of a…

  12. An Approach for Leukemia Classification Based on Cooperative Game Theory

    PubMed Central

    Torkaman, Atefeh; Charkari, Nasrollah Moghaddam; Aghaeipour, Mahnaz

    2011-01-01

    Hematological malignancies are the types of cancer that affect blood, bone marrow and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood tissues, especially the bone marrow, where the blood is made. Research shows that leukemia is one of the most common cancers in the world, so emphasis on diagnostic techniques and the best treatments can provide better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on cooperative game theory is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set from different types of leukemia collected at the Iran Blood Transfusion Organization (IBTO). The data set contains 400 samples taken from human leukemic bone marrow. This study uses a cooperative game for classification according to different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent; this means it can be used to classify a population according to their contributions, and it applies equally to other groups of data. The experimental results show an accuracy rate of 93.12% for classification, compared to 90.16% for a decision tree (C4.5). The results demonstrate that cooperative games are very promising for direct classification of leukemia as part of an active medical decision support system for interpretation of flow cytometry readouts. This system could assist clinical hematologists to properly recognize different kinds of leukemia by offering suggestions, and this could improve the treatment of leukemic patients. PMID:21988887

  13. Cost performance satellite design using queueing theory [via digital simulation]

    NASA Technical Reports Server (NTRS)

    Hein, G. F.

    1975-01-01

    A modified Poisson-arrival, infinite-server queuing model is used to determine the effects of limiting the number of broadcast channels (C) of a direct broadcast satellite used for public service purposes (remote health care, education, etc.). The model is based on the reproductive property of the Poisson distribution, and a difference equation has been developed to describe the change in the Poisson parameter. When all initially delayed arrivals reenter the system, a polynomial of order (C + 1) must be solved to determine the effective value of the Poisson parameter; when fewer than 100% of the arrivals reenter the system, the effective value must be determined by solving a transcendental equation. The model was used to determine the minimum number of channels required for a disaster warning satellite without degradation in performance, and results predicted by the queuing model were compared with the results of digital simulation.
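    The abstract's effective-parameter calculation can be sketched numerically. The following is a minimal illustration, not the paper's method: it treats the Poisson tail probability P(N >= C) as the blocked fraction, and replaces the polynomial/transcendental root-finding with fixed-point iteration; the function names and the `reentry` fraction are assumptions of this sketch.

    ```python
    from math import exp

    def poisson_tail(a, c):
        """P(N >= c) for N ~ Poisson(a), via the complementary CDF (c >= 1)."""
        term, cdf = exp(-a), exp(-a)
        for k in range(1, c):
            term *= a / k
            cdf += term
        return 1.0 - cdf

    def effective_load(offered, channels, reentry=1.0, tol=1e-10, max_iter=1000):
        """Fixed-point iteration for the effective Poisson parameter when a
        fraction `reentry` of blocked arrivals re-enters the system:
        a_eff = offered + reentry * a_eff * P(N >= C; a_eff)."""
        a = offered
        for _ in range(max_iter):
            a_new = offered + reentry * a * poisson_tail(a, channels)
            if abs(a_new - a) < tol:
                return a_new
            a = a_new
        raise RuntimeError("did not converge; offered load may exceed capacity")
    ```

    With `reentry=1.0` this corresponds to the all-arrivals-reenter case; values below 1 mimic the partial-reentry case the abstract solves via a transcendental equation.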

  14. The Cultures of Contemporary Instructional Design Scholarship, Part Two: Developments Based on Constructivist and Critical Theory Foundations

    ERIC Educational Resources Information Center

    Willis, Jerry

    2011-01-01

    This article is the second in a series (see Willis, 2011) that looks at the current status of instructional design scholarship and theory. In this concluding article, the focus is on two cultures of ID work, one based on constructivist and interpretivist theory and the other based on critical theory and critical pedagogy. There are distinct…

  15. Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory

    NASA Astrophysics Data System (ADS)

    Matsumura, Koki; Kawamoto, Masaru

    This paper proposes a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computation methods. Moreover, the paper proposes a method using prospect theory, drawn from behavioral finance, to set a psychological bias for profit and loss, and attempts to select the appropriate strike price of the option for higher investment efficiency. As a result, the technique produced good results and demonstrated the effectiveness of the trading model through the optimized dealing strategy.
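    The psychological bias for profit and loss mentioned in the abstract is commonly modeled with the Kahneman-Tversky prospect theory value function. A minimal sketch under that assumption (the parameter values are the classic literature estimates, not necessarily those used in the paper):

    ```python
    def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
        """Kahneman-Tversky value function: concave over gains, convex and
        steeper over losses (lam > 1 encodes loss aversion)."""
        return x ** alpha if x >= 0 else -lam * ((-x) ** beta)
    ```

    A loss of a given size thus weighs roughly twice as heavily as an equal gain, which is the kind of asymmetry such a trading model would feed into its strategy evaluation.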

  16. Social learning theory parenting intervention promotes attachment-based caregiving in young children: randomized clinical trial.

    PubMed

    O'Connor, Thomas G; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen

    2013-01-01

    Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear if the benefits of these interventions extend beyond the aspects of parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social learning theory-based treatment promoted change in qualities of the parent-child relationship derived from attachment theory. In a randomized clinical trial, 174 four- to six-year-olds selected from a high-need urban area and stratified by conduct problems were assigned to a parenting program plus a reading intervention (n = 88) or a nonintervention condition (n = 86). In-home observations of parent-child interactions were assessed in three tasks: (a) free play, (b) challenge task, and (c) tidy up. Parenting behavior was coded according to behavior theory, using standard count measures of positive and negative parenting, and according to attachment theory, using measures of sensitive responding and mutuality; children's attachment narratives were also assessed. Compared to the parents in the nonintervention group, parents allocated to the intervention showed increases in positive behavioral counts and sensitive responding; change in behavioral count measures overlapped modestly with attachment-based changes. There was no reliable change in children's attachment narratives associated with the intervention. The findings demonstrate that standard social learning theory-based parenting interventions can change broader aspects of parent-child relationship quality and raise clinical and conceptual questions about the distinctiveness of existing treatment models in parenting research. PMID:23020146

  17. Research on Flow Shift Law of Porous Media in Goaf Based on the Unsteady Airflow Theory

    NASA Astrophysics Data System (ADS)

    Li, Y. C.; Lin, A. H.; Wang, H. Q.; Zou, S. H.

    Based on unsteady flow theory, and supported by the theories of mine ventilation, fluid mechanics, and infiltration flow through porous media, a mathematical model of porous media under fluctuating airflow in the goaf of a mine is established. The features of the flow field distribution in the goaf under fluctuating airflow are studied by numerical simulation, and the distribution of the flow field is tested with a purpose-built experimental model. The results show that the numerical simulation and the experiment correspond well, and that the mathematical model of the flow field of porous media in the goaf established in this paper can be used to study the flow field distribution and the flow shift law in the goaf.

  18. Research on Prediction Model of Time Series Based on Fuzzy Theory and Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Xiao-qin, Wu

    Fuzzy theory is one of the newly adopted self-adaptive strategies applied to dynamically adjust the parameters of genetic algorithms for the purpose of enhancing performance. This paper takes financial time series analysis and forecasting as its main case study within a soft-computing framework that integrates fuzzy theory and genetic algorithms (FGA). A financial time series forecasting model based on fuzzy theory and genetic algorithms was built, using the ShangZheng (Shanghai Stock Exchange) index as an example. The experimental results show that FGA performs much better than a BP neural network, both in precision and in search speed. The hybrid algorithm has strong feasibility and superiority.

  19. Perturbation theory for multicomponent fluids based on structural properties of hard-sphere chain mixtures

    NASA Astrophysics Data System (ADS)

    Hlushak, Stepan

    2015-09-01

    An analytical expression for the Laplace transform of the radial distribution function of a mixture of hard-sphere chains of arbitrary segment size and chain length is used to rigorously formulate the first-order Barker-Henderson perturbation theory for the contribution of the segment-segment dispersive interactions into thermodynamics of the Lennard-Jones chain mixtures. Based on this approximation, a simple variant of the statistical associating fluid theory is proposed and used to predict properties of several mixtures of chains of different lengths and segment sizes. The theory treats the dispersive interactions more rigorously than the conventional theories and provides means for more accurate description of dispersive interactions in the mixtures of highly asymmetric components.

  20. Ground and Excited States of Retinal Schiff Base Chromophores by Multiconfigurational Perturbation Theory

    PubMed Central

    Sekharan, Sivakumar; Weingart, Oliver; Buss, Volker

    2006-01-01

    We have studied the wavelength dependence of retinal Schiff base absorbances on the protonation state of the chromophore at the multiconfigurational level of theory, using second-order perturbation theory (CASPT2) within an atomic natural orbital basis set on MP2-optimized geometries. Quantitative agreement between calculated and experimental absorption maxima was obtained for protonated and deprotonated Schiff bases of all-trans- and 11-cis-retinal and intermediate states covering a wavelength range from 610 to 353 nm. These data will be useful as reference points for the calibration of more approximate schemes. PMID:16648170

  1. Enhancing quality of practice through theory of change-based evaluation: science or practice?

    PubMed

    Julian, David A

    2005-06-01

    This paper describes the evaluation component of Partnerships for Success (PfS), a comprehensive community effort designed to address youth development issues. The evaluation component is referred to as "theory of change-based evaluation." The author considers the implications of applying community practice tools such as theory of change-based evaluation to the current conceptualization of community science. More specifically, the author argues that the current conceptualization of community science pays scant attention to community practice. This paper concludes by suggesting that the current conceptualization of community science be modified to recognize the importance of community practice as an equal aspiration for community psychologists. PMID:15909792

  2. Theory of normal and superconducting properties of fullerene-based solids

    SciTech Connect

    Cohen, M.L.

    1992-10-01

    Recent experiments on the normal-state and superconducting properties of fullerene-based solids are used to constrain the proposed theories of the electronic nature of these materials. In general, models of superconductivity based on electron pairing induced by phonons are consistent with electronic band theory. The latter experiments also yield estimates of the parameters characterizing these type II superconductors. It is argued that, at this point, a "standard model" of phonons interacting with itinerant electrons may be a good first approximation for explaining the properties of the metallic fullerenes.

  4. The Scientific Value of Cognitive Load Theory: A Research Agenda Based on the Structuralist View of Theories

    ERIC Educational Resources Information Center

    Gerjets, Peter; Scheiter, Katharina; Cierniak, Gabriele

    2009-01-01

    In this paper, two methodological perspectives are used to elaborate on the value of cognitive load theory (CLT) as a scientific theory. According to the more traditional critical rationalism of Karl Popper, CLT cannot be considered a scientific theory because some of its fundamental assumptions cannot be tested empirically and are thus not…

  6. A Theory-based Approach to the Measurement of Foreign Language Learning Ability: The CANAL-F Theory and Test.

    ERIC Educational Resources Information Center

    Grigorenko, Elena L.; Sternberg, Robert J.; Ehrman, Madeline E.

    2000-01-01

    Presents a rationale, description, and partial construct validation of a new theory of foreign language aptitude: CANAL-F--Cognitive Ability for Novelty in Acquisition of Language (foreign). The theory was applied and implemented in a test of foreign language aptitude (CANAL-FT). Outlines the CANAL-F theory and details of its instrumentation…

  7. Power optimization of chemically driven heat engine based on first and second order reaction kinetic theory and probability theory

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Chen, Lingen; Sun, Fengrui

    2016-03-01

    The finite-time thermodynamic method based on probability analysis can more accurately describe various performance parameters of thermodynamic systems. Based on the relation between the optimal efficiency and power output of a generalized Carnot heat engine with a finite high-temperature heat reservoir (heat source) and an infinite low-temperature heat reservoir (heat sink), and with heat transfer as the only irreversibility, this paper studies the power optimization of a chemically driven heat engine based on first- and second-order reaction kinetic theory. It puts forward a model of a coupled heat engine that can be run periodically and obtains the effects of the finite-time thermodynamic characteristics of the coupling between the chemical reaction and the heat engine on the power optimization. The results show that the first-order reaction kinetics model can use fuel more effectively and can provide the heat engine with a higher-temperature heat source, increasing its power output. Moreover, the power fluctuation bounds of the chemically driven heat engine are obtained using the probability analysis method. The results may provide guidelines for the performance analysis and power optimization of chemically driven heat engines.
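    For reference, the closed-form fuel-depletion laws for first- and second-order reaction kinetics, which underlie the comparison in the abstract, can be sketched as follows (function names and parameters are illustrative, not from the paper):

    ```python
    import math

    def fuel_remaining_first_order(c0, k, t):
        """First-order kinetics: dc/dt = -k c, so c(t) = c0 * exp(-k t)."""
        return c0 * math.exp(-k * t)

    def fuel_remaining_second_order(c0, k, t):
        """Second-order kinetics: dc/dt = -k c^2, so c(t) = c0 / (1 + c0 k t)."""
        return c0 / (1.0 + c0 * k * t)
    ```

    These decay laws only describe the reaction itself; the paper's effectiveness claim concerns the coupled reaction-engine system, which this sketch does not model.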

  8. Paying for Express Checkout: Competition and Price Discrimination in Multi-Server Queuing Systems

    PubMed Central

    Deck, Cary; Kimbrough, Erik O.; Mongrain, Steeve

    2014-01-01

    We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting cost-based-price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single queue seller focuses on the patient shoppers thereby driving down prices and profits while increasing consumer surplus. PMID:24667809
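    The shoppers' trade-off between price and the time cost of waiting can be illustrated with the textbook M/M/1 sojourn-time formula. This is a hedged sketch of the decision rule only, not the authors' model (their system has multiple servers and strategic pricing), and all names are illustrative:

    ```python
    def mm1_sojourn(lam, mu):
        """Mean time in system (wait + service) for an M/M/1 queue; needs lam < mu."""
        if lam >= mu:
            raise ValueError("unstable queue: arrival rate must be below service rate")
        return 1.0 / (mu - lam)

    def full_cost(price, lam, mu, wait_cost):
        """Total expected cost to a shopper: posted price plus the value of time in queue."""
        return price + wait_cost * mm1_sojourn(lam, mu)

    def choose_seller(offers, wait_cost):
        """Pick the (price, lam, mu) offer with the lowest full cost."""
        return min(offers, key=lambda o: full_cost(o[0], o[1], o[2], wait_cost))
    ```

    Under this rule, shoppers with a high waiting cost sort into the pricier, faster lane while patient shoppers take the cheaper, slower one, which is the sorting mechanism behind the express-checkout result.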

  10. Studying thin film damping in a micro-beam resonator based on non-classical theories

    NASA Astrophysics Data System (ADS)

    Ghanbari, Mina; Hossainpour, Siamak; Rezazadeh, Ghader

    2015-09-01

    In this paper, a mathematical model is presented for studying thin film damping of the surrounding fluid in an in-plane oscillating micro-beam resonator. The proposed model consists of a clamped-clamped micro-beam bound between two fixed layers, with the micro-gap between the micro-beam and the fixed layers filled with air. As classical theories cannot properly predict either the size-dependent behavior of the micro-beam or the behavior of micro-scale fluid media, in the presented model the equation of motion governing the longitudinal displacement of the micro-beam has been derived based on non-local elasticity theory. Furthermore, the fluid field has been modeled based on micro-polar theory. These coupled equations have been simplified using the Newton-Laplace and continuity equations. After transforming to non-dimensional form and linearizing, the equations have been discretized and solved simultaneously using a Galerkin-based reduced-order model. Considering slip boundary conditions and applying a complex frequency approach, the equivalent damping ratio and quality factor of the micro-beam resonator have been obtained, and the obtained quality factors have been compared to those based on classical theories. We show that the non-classical theories yield lower quality-factor values than the classical theories. The effects of the geometrical parameters of the micro-beam and the micro-scale fluid field on the quality factor of the resonator have also been investigated.

  11. Mixture theory-based poroelasticity as a model of interstitial tissue growth

    PubMed Central

    Cowin, Stephen C.; Cardoso, Luis

    2011-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481

  12. Operationalizing Levels of Academic Mastery Based on Vygotsky's Theory: The Study of Mathematical Knowledge

    ERIC Educational Resources Information Center

    Nezhnov, Peter; Kardanova, Elena; Vasilyeva, Marina; Ludlow, Larry

    2015-01-01

    The present study tested the possibility of operationalizing levels of knowledge acquisition based on Vygotsky's theory of cognitive growth. An assessment tool (SAM-Math) was developed to capture a hypothesized hierarchical structure of mathematical knowledge consisting of procedural, conceptual, and functional levels. In Study 1, SAM-Math was…

  14. Theory and Utility-Key Themes in Evidence-Based Assessment: Comment on the Special Section

    ERIC Educational Resources Information Center

    McFall, Richard M.

    2005-01-01

    This article focuses on two key themes in the four featured reviews on evidence-based assessment. The first theme is the essential role of theory in psychological assessment. An overview of this complex, multilayered role is presented. The second theme is the need for a common metric with which to gauge the utility of specific psychological tests…

  15. Predicting Study Abroad Intentions Based on the Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Schnusenberg, Oliver; de Jong, Pieter; Goel, Lakshmi

    2012-01-01

    The emphasis on study abroad programs is growing in the academic context as U.S. based universities seek to incorporate a global perspective in education. Using a model that has underpinnings in the theory of planned behavior (TPB), we predict students' intention to participate in short-term study abroad program. We use TPB to identify behavioral,…

  16. Four-dimensional topological quantum field theory, Hopf categories, and the canonical bases

    NASA Astrophysics Data System (ADS)

    Crane, Louis; Frenkel, Igor B.

    1994-10-01

    A new combinatorial method of constructing four-dimensional topological quantum field theories is proposed. The method uses a new type of algebraic structure called a Hopf category. The construction of a family of Hopf categories related to the quantum groups and their canonical bases is also outlined.

  17. A Practice-Based Theory of Professional Education: Teach For America's Professional Development Model

    ERIC Educational Resources Information Center

    Gabriel, Rachael

    2011-01-01

    In 1999, Ball and Cohen proposed a practice-based theory of professional education, which would end inadequate professional development efforts with a more comprehensive approach. Their work has been referenced over the past decade, yet there have been limited attempts to actualize their ideals and research their implications. In this article, I…

  18. Portuguese Public University Student Satisfaction: A Stakeholder Theory-Based Approach

    ERIC Educational Resources Information Center

    Mainardes, Emerson; Alves, Helena; Raposo, Mario

    2013-01-01

    In accordance with the importance of the student stakeholder to universities, the objective of this research project was to evaluate student satisfaction at Portuguese public universities as regards their self-expressed core expectations. The research was based both on stakeholder theory itself and on previous studies of university stakeholders.…

  19. Effects of Guided Writing Strategies on Students' Writing Attitudes Based on Media Richness Theory

    ERIC Educational Resources Information Center

    Lan, Yu-Feng; Hung, Chun-Ling; Hsu, Hung-Ju

    2011-01-01

    The purpose of this paper is to develop different guided writing strategies based on media richness theory and further evaluate the effects of these writing strategies on younger students' writing attitudes in terms of motivation, enjoyment and anxiety. A total of 66 sixth-grade elementary students with an average age of twelve were invited to…

  20. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

  1. A Theory-based Faculty Development Program for Clinician-Educators.

    ERIC Educational Resources Information Center

    Hewson, Mariana G.

    2000-01-01

    Describes development, implementation, and evaluation of a theory-based faculty development program for physician-educators in medicine and pediatrics at the Cleveland Clinic (Ohio). The program includes a 12-hour course focused on precepting skills, bedside teaching, and effective feedback; on-site coaching; and innovative projects in clinical…

  2. Evidence-Based Practice in Kinesiology: The Theory to Practice Gap Revisited

    ERIC Educational Resources Information Center

    Knudson, Duane

    2005-01-01

    As evidence-based practice sweeps the applied health professions, it is a good time to evaluate the generation of knowledge in Kinesiology and its transmission to professionals and the public. Knowledge transmission has been debated in the past from the perspectives of the theory-to-practice gap and the discipline versus profession emphasis.…

  3. Using Emergence Theory-Based Curriculum to Teach Compromise Skills to Students with Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Fein, Lance; Jones, Don

    2015-01-01

    This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…

  5. Science Teaching Based on Cognitive Load Theory: Engaged Students, but Cognitive Deficiencies

    ERIC Educational Resources Information Center

    Meissner, Barbara; Bogner, Franz X.

    2012-01-01

    To improve science learning under demanding conditions, we designed an out-of-school lesson in compliance with cognitive load theory (CLT). We extracted student clusters based on individual effectiveness, and compared instructional efficiency, mental effort, and persistence of learning. The present study analyses students' engagement. 50.0% of our…

  6. Imitation dynamics of vaccine decision-making behaviours based on the game theory.

    PubMed

    Yang, Junyuan; Martcheva, Maia; Chen, Yuming

    2016-01-01

    Based on game theory, we propose an age-structured model to investigate the imitation dynamics of vaccine uptake. We first obtain the existence and local stability of equilibria. We show that Hopf bifurcation can occur. We also establish the global stability of the boundary equilibria and persistence of the disease. The theoretical results are supported by numerical simulations. PMID:26536171
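    Imitation dynamics of this kind are often written as a replicator-style equation in which uptake x grows in proportion to x(1 − x) times the perceived payoff gain from vaccinating. A minimal Euler-integration sketch under that assumption (the paper's age structure and specific payoff terms are omitted; all names are illustrative):

    ```python
    def simulate_uptake(x0, payoff_gain, kappa=1.0, dt=0.01, steps=1000):
        """Euler-integrate dx/dt = kappa * x * (1 - x) * payoff_gain,
        the imitation (replicator-style) dynamic for vaccine uptake x."""
        x = x0
        for _ in range(steps):
            x += dt * kappa * x * (1.0 - x) * payoff_gain
        return x
    ```

    With a positive perceived gain, uptake rises toward full coverage; with a negative gain it collapses toward zero, which is the bistable behavior such imitation models are used to study.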

  7. Applying Unidimensional and Multidimensional Item Response Theory Models in Testlet-Based Reading Assessment

    ERIC Educational Resources Information Center

    Min, Shangchao; He, Lianzhen

    2014-01-01

    This study examined the relative effectiveness of the multidimensional bi-factor model and multidimensional testlet response theory (TRT) model in accommodating local dependence in testlet-based reading assessment with both dichotomously and polytomously scored items. The data used were 14,089 test-takers' item-level responses to the…

  8. Social Theory, Sacred Text, and Sing-Sing Prison: A Sociology of Community-Based Reconciliation.

    ERIC Educational Resources Information Center

    Erickson, Victoria Lee

    2002-01-01

    Examines the sociological component of the urban community-based professional education programs at New York Theological Seminary offered at Sing-Sing Prison. Explores the simultaneous use of social theory and sacred texts as teaching tools and intervention strategies in the educational and personal transformation processes of men incarcerated for…

  9. Examining Instruction in MIDI-Based Composition through a Critical Theory Lens

    ERIC Educational Resources Information Center

    Louth, Paul

    2013-01-01

    This paper considers the issue of computer-assisted composition in formal music education settings from the perspective of critical theory. The author examines the case of MIDI-based software applications and suggests that the greatest danger from the standpoint of ideology critique is not the potential for circumventing a traditional…

  10. Transdiagnostic Theory and Application of Family-Based Treatment for Youth with Eating Disorders

    ERIC Educational Resources Information Center

    Loeb, Katharine L.; Lock, James; Greif, Rebecca; le Grange, Daniel

    2012-01-01

    This paper describes the transdiagnostic theory and application of family-based treatment (FBT) for children and adolescents with eating disorders. We review the fundamentals of FBT, a transdiagnostic theoretical model of FBT and the literature supporting its clinical application, adaptations across developmental stages and the diagnostic spectrum…

  11. Preparing Students for Education, Work, and Community: Activity Theory in Task-Based Curriculum Design

    ERIC Educational Resources Information Center

    Campbell, Chris; MacPherson, Seonaigh; Sawkins, Tanis

    2014-01-01

    This case study describes how sociocultural and activity theory were applied in the design of a publicly funded, Canadian Language Benchmark (CLB)-based English as a Second Language (ESL) credential program and curriculum for immigrant and international students in postsecondary institutions in British Columbia, Canada. The ESL Pathways Project…

  14. From Theory to Practice: Concept-Based Inquiry in a High School Art Classroom

    ERIC Educational Resources Information Center

    Walker, Margaret A.

    2014-01-01

    This study examines what an emerging educational theory looks like when put into practice in an art classroom. It explores the teaching methodology of a high school art teacher who has utilized concept-based inquiry in the classroom to engage his students in artmaking and analyzes the influence this methodology has had on his adolescent students.…

  15. Applications of Cognitive Load Theory to Multimedia-Based Foreign Language Learning: An Overview

    ERIC Educational Resources Information Center

    Chen, I-Jung; Chang, Chi-Cheng; Lee, Yen-Chang

    2009-01-01

    This article reviews the multimedia instructional design literature based on cognitive load theory (CLT) in the context of foreign language learning. Multimedia are of particular importance in language learning materials because they incorporate text, image, and sound, thus offering an integrated learning experience of the four language skills…

  16. Item Response Theory with Estimation of the Latent Population Distribution Using Spline-Based Densities

    ERIC Educational Resources Information Center

    Woods, Carol M.; Thissen, David

    2006-01-01

    The purpose of this paper is to introduce a new method for fitting item response theory models with the latent population distribution estimated from the data using splines. A spline-based density estimation system provides a flexible alternative to existing procedures that use a normal distribution, or a different functional form, for the…

  17. Revisiting Transactional Distance Theory in a Context of Web-Based High-School Distance Education

    ERIC Educational Resources Information Center

    Murphy, Elizabeth Anne; Rodriguez-Manzanares, Maria Angeles

    2008-01-01

The purpose of this paper is to report on a study that provided an opportunity to consider Transactional Distance Theory (TDT) in a current technology context of web-based learning in distance education (DE) high-school classrooms. Data collection relied on semi-structured interviews conducted with 22 e-teachers and managers in Newfoundland and…

  18. Transdiagnostic Theory and Application of Family-Based Treatment for Youth with Eating Disorders

    ERIC Educational Resources Information Center

    Loeb, Katharine L.; Lock, James; Greif, Rebecca; le Grange, Daniel

    2012-01-01

    This paper describes the transdiagnostic theory and application of family-based treatment (FBT) for children and adolescents with eating disorders. We review the fundamentals of FBT, a transdiagnostic theoretical model of FBT and the literature supporting its clinical application, adaptations across developmental stages and the diagnostic spectrum…

  19. Theory Presentation and Assessment in a Problem-Based Learning Group.

    ERIC Educational Resources Information Center

    Glenn, Phillip J.; Koschmann, Timothy; Conlee, Melinda

    A study used conversational analysis to examine the reasoning students use in a Problem-Based Learning (PBL) environment as they formulate a theory (in medical contexts, a diagnosis) which accounts for evidence (medical history and symptoms). A videotaped group interaction was analyzed and transcribed. In the segment of interaction examined, the…

  20. Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Oshima, T.C.

    2005-01-01

    Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…
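For reference, the well-known Spearman-Brown prophecy formula that one of the two proposed formulas is identical to (stated here from standard classical test theory sources, not from the paper) predicts the reliability of a test whose length is multiplied by a factor $k$:

```latex
\rho_k = \frac{k\,\rho}{1 + (k - 1)\,\rho}
```

where $\rho$ is the reliability of the original test and $\rho_k$ the predicted reliability of the shortened or lengthened test.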

  1. Investigating Acceptance toward Mobile Learning to Assist Individual Knowledge Management: Based on Activity Theory Approach

    ERIC Educational Resources Information Center

    Liaw, Shu-Sheng; Hatala, Marek; Huang, Hsiu-Mei

    2010-01-01

Mobile devices can facilitate human interaction and access to knowledge resources anytime and anywhere. Given the wide range of possible applications of mobile learning, investigating learners' acceptance of it is an essential issue. Based on an activity theory approach, this research explores positive factors for the acceptance of m-learning…

  2. Poverty Lines Based on Fuzzy Sets Theory and Its Application to Malaysian Data

    ERIC Educational Resources Information Center

    Abdullah, Lazim

    2011-01-01

Defining the poverty line has been acknowledged as highly variable by the majority of the published literature. Despite long discussion and some successes, the poverty line suffers a number of problems due to its arbitrary nature. This paper proposes three measurements of poverty lines using membership functions based on fuzzy set theory. The three…

  3. Application of Online Multimedia Courseware in College English Teaching Based on Constructivism Theory

    ERIC Educational Resources Information Center

    Li, Zhenying

    2012-01-01

Based on Constructivism Theory, this paper aims to investigate the application of online multimedia courseware to college English teaching. Through experiments and student feedback, experience has been accumulated, problems have been discovered, and useful insights have been gained in English teaching practice, which pave the…

  4. Aligning Theory and Web-Based Instructional Design Practice with Design Patterns.

    ERIC Educational Resources Information Center

    Frizell, Sherri S.; Hubscher, Roland

    Designing instructionally sound Web courses is a difficult task for instructors who lack experience in interaction and Web-based instructional design. Learning theories and instructional strategies can provide course designers with principles and design guidelines associated with effective instruction that can be utilized in the design of…

  5. Investigating Acceptance toward Mobile Learning to Assist Individual Knowledge Management: Based on Activity Theory Approach

    ERIC Educational Resources Information Center

    Liaw, Shu-Sheng; Hatala, Marek; Huang, Hsiu-Mei

    2010-01-01

Mobile devices can facilitate human interaction and access to knowledge resources anytime and anywhere. Given the wide range of possible applications of mobile learning, investigating learners' acceptance of it is an essential issue. Based on an activity theory approach, this research explores positive factors for the acceptance of m-learning…

  6. Effects of a Theory-Based, Peer-Focused Drug Education Course.

    ERIC Educational Resources Information Center

    Gonzalez, Gerardo M.

    1990-01-01

    Describes innovative, theory-based, peer-focused college drug education academic course and its effect on perceived levels of risk associated with the use of alcohol, marijuana, and cocaine. Evaluation of the effects of the course indicated the significant effect on perceived risk of cocaine, but not alcohol or marijuana. (Author/ABL)

  7. English Textbooks Based on Research and Theory--A Possible Dream.

    ERIC Educational Resources Information Center

    Suhor, Charles

    1984-01-01

Research-based text materials will probably never dominate the textbook market. To begin with, translating theory and research into practice is a chancy business. There are also creative problems, such as the inherent oversimplification involved in textbook writing. Every textbook writer who has been a classroom teacher will acknowledge that such…

  8. Critically Evaluating Competing Theories: An Exercise Based on the Kitty Genovese Murder

    ERIC Educational Resources Information Center

    Sagarin, Brad J.; Lawler-Sagarin, Kimberly A.

    2005-01-01

    We describe an exercise based on the 1964 murder of Catherine Genovese--a murder observed by 38 witnesses, none of whom called the police. Students read a summary of the murder and worked in small groups to design an experiment to test the competing theories for the inaction of the witnesses (Americans' selfishness and insensitivity vs. diffusion…

  9. Interpretation-Based Processing: A Unified Theory of Semantic Sentence Comprehension

    ERIC Educational Resources Information Center

    Budiu, Raluca; Anderson, John R.

    2004-01-01

    We present interpretation-based processing--a theory of sentence processing that builds a syntactic and a semantic representation for a sentence and assigns an interpretation to the sentence as soon as possible. That interpretation can further participate in comprehension and in lexical processing and is vital for relating the sentence to the…

  10. Decibel error test and flow law of multiphase rocks based on energy dissipation theory

    NASA Astrophysics Data System (ADS)

    Jiang, Yan; Zang, Shaoxian; Wei, Rongqiang

    2005-06-01

A new flow law is developed based on energy dissipation theory, which is independent of bound theory. The influence of the distribution of constituent minerals on the rheological behavior of the rock is taken into account and expressed as an additional continuity condition in the theory, instead of through the lower and upper bounds of bound theory. With the continuity equations, prediction under the new flow law reduces to finding the extremum of the energy dissipation function during steady-state creep. To make the new flow law mathematically self-consistent, the uniqueness of the extremum, which must be a minimum, is demonstrated. After pointing out the limitations of relative error, we introduce a new criterion, the decibel error, to test empirical flow laws. Depending on the level of experimental error, an alphabetic grading of the decibel error is defined. Application of the flow law based on energy dissipation theory to five multiphase rocks shows good consistency with experimental data.

  11. Vervet monkeys solve a multiplayer "forbidden circle game" by queuing to learn restraint.

    PubMed

    Fruteau, Cécile; van Damme, Eric; Noë, Ronald

    2013-04-22

    In social dilemmas, the ability of individuals to coordinate their actions is crucial to reach group optima. Unless exacted by power or force, coordination in humans relies on a common understanding of the problem, which is greatly facilitated by communication. The lack of means of consultation about the nature of the problem and how to solve it may explain why multiagent coordination in nonhuman vertebrates has commonly been observed only when multiple individuals react instantaneously to a single stimulus, either natural or experimentally simulated, for example a predator, a prey, or a neighboring group. Here we report how vervet monkeys solved an experimentally induced coordination problem. In each of three groups, we trained a low-ranking female, the "provider," to open a container holding a large amount of food, which the providers only opened when all individuals dominant to them ("dominants") stayed outside an imaginary "forbidden circle" around it. Without any human guidance, the dominants learned restraint one by one, in hierarchical order from high to low. Once all dominants showed restraint immediately at the start of the trial, the providers opened the container almost instantly, saving all individuals opportunity costs due to lost foraging time. Solving this game required trial-and-error learning based on individual feedback from the provider to each dominant, and all dominants being patient enough to wait outside the circle while others learned restraint. Communication, social learning, and policing by high-ranking animals played no perceptible role. PMID:23541727

  12. A queuing model for designing multi-modality buried target detection systems: preliminary results

    NASA Astrophysics Data System (ADS)

    Malof, Jordan M.; Morton, Kenneth D.; Collins, Leslie M.; Torrione, Peter A.

    2015-05-01

    Many remote sensing modalities have been developed for buried target detection, each one offering its own relative advantages over the others. As a result there has been interest in combining several modalities into a single detection platform that benefits from the advantages of each constituent sensor, without suffering from their weaknesses. Traditionally this involves collecting data continuously on all sensors and then performing data, feature, or decision level fusion. While this is effective for lowering false alarm rates, this strategy neglects the potential benefits of a more general system-level fusion architecture. Such an architecture can involve dynamically changing which modalities are in operation. For example, a large standoff modality such as a forward-looking infrared (FLIR) camera can be employed until an alarm is encountered, at which point a high performance (but short standoff) sensor, such as ground penetrating radar (GPR), is employed. Because the system is dynamically changing its rate of advance and sensors, it becomes difficult to evaluate the expected false alarm rate and advance rate. In this work, a probabilistic model is proposed that can be used to estimate these quantities based on a provided operating policy. In this model the system consists of a set of states (e.g., sensors employed) and conditions encountered (e.g., alarm locations). The predictive accuracy of the model is evaluated using a collection of collocated FLIR and GPR data and the results indicate that the model is effective at predicting the desired system metrics.
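The abstract's model is probabilistic and policy-dependent; as a rough illustration only of the state/condition bookkeeping it describes (not the paper's model), here is a hypothetical expected-advance-rate calculation for a two-sensor policy where FLIR scans continuously and each FLIR alarm triggers a slow GPR interrogation. All rates and lengths below are invented.

```python
# Hypothetical sketch: the platform alternates between a "FLIR scanning"
# state and a "GPR interrogating" state entered at alarm locations.
# Expected advance rate follows from the expected dwell time per meter.

def expected_advance_rate(flir_speed_mps, gpr_speed_mps,
                          alarms_per_meter, gpr_check_length_m):
    """Expected forward speed (m/s) under the two-state policy."""
    # Time to cover one meter while in the FLIR state.
    t_flir = 1.0 / flir_speed_mps
    # Expected extra time per meter spent re-checking alarms with GPR.
    t_gpr = alarms_per_meter * (gpr_check_length_m / gpr_speed_mps)
    return 1.0 / (t_flir + t_gpr)

rate = expected_advance_rate(flir_speed_mps=2.0, gpr_speed_mps=0.25,
                             alarms_per_meter=0.05, gpr_check_length_m=1.0)
```

More alarms or a slower confirmation sensor lengthen the expected dwell time per meter and reduce the advance rate, which is the trade-off the paper's model quantifies.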

  13. Learning control system design based on 2-D theory - An application to parallel link manipulator

    NASA Technical Reports Server (NTRS)

    Geng, Z.; Carroll, R. L.; Lee, J. D.; Haynes, L. H.

    1990-01-01

    An approach to iterative learning control system design based on two-dimensional system theory is presented. A two-dimensional model for the iterative learning control system which reveals the connections between learning control systems and two-dimensional system theory is established. A learning control algorithm is proposed, and the convergence of learning using this algorithm is guaranteed by two-dimensional stability. The learning algorithm is applied successfully to the trajectory tracking control problem for a parallel link robot manipulator. The excellent performance of this learning algorithm is demonstrated by the computer simulation results.
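As a simpler, generic illustration of the underlying idea of iterative learning control (a textbook P-type update, not the paper's 2-D algorithm), the input is refined trial after trial using the previous trial's tracking error. The plant, reference, and gains below are invented.

```python
# P-type iterative learning control sketch: u_{k+1}[t] = u_k[t] + gamma*e_k[t].

def run_trial(u, a=0.5, b=1.0, T=20):
    """Simulate y[t+1] = a*y[t] + b*u[t] from rest; return the output."""
    y = [0.0] * (T + 1)
    for t in range(T):
        y[t + 1] = a * y[t] + b * u[t]
    return y

ref = [1.0] * 21          # desired trajectory (constant, for simplicity)
u = [0.0] * 20            # initial input guess
gamma = 0.5               # learning gain (|1 - gamma*b| < 1 for convergence)
errors = []
for k in range(30):       # learning iterations ("trials")
    y = run_trial(u)
    e = [ref[t + 1] - y[t + 1] for t in range(20)]
    errors.append(max(abs(v) for v in e))
    u = [u[t] + gamma * e[t] for t in range(20)]  # P-type ILC update
```

Over repeated trials the tracking error shrinks geometrically, mirroring the convergence-through-iteration property that the paper establishes via two-dimensional stability.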

  14. Improved method for calculating strain energy release rate based on beam theory

    NASA Technical Reports Server (NTRS)

    Sun, C. T.; Pandey, R. K.

    1994-01-01

    The Timoshenko beam theory was used to model cracked beams and to calculate the total strain-energy release rate. The root rotations of the beam segments at the crack tip were estimated based on an approximate two-dimensional elasticity solution. By including the strain energy released due to the root rotations of the beams during crack extension, the strain-energy release rate obtained using beam theory agrees very well with the two-dimensional finite element solution. Numerical examples were given for various beam geometries and loading conditions. Comparisons with existing beam models were also given.

  15. A method for calculating strain energy release rate based on beam theory

    NASA Technical Reports Server (NTRS)

    Sun, C. T.; Pandey, R. K.

    1993-01-01

    The Timoshenko beam theory was used to model cracked beams and to calculate the total strain energy release rate. The root rotation of the beam segments at the crack tip were estimated based on an approximate 2D elasticity solution. By including the strain energy released due to the root rotations of the beams during crack extension, the strain energy release rate obtained using beam theory agrees very well with the 2D finite element solution. Numerical examples were given for various beam geometries and loading conditions. Comparisons with existing beam models were also given.

  16. A novel trust evaluation method for Ubiquitous Healthcare based on cloud computational theory.

    PubMed

    Athanasiou, Georgia; Fengou, Maria-Anna; Beis, Antonios; Lymberopoulos, Dimitrios

    2014-01-01

The notion of trust is considered the cornerstone of the patient-psychiatrist relationship. Thus, a trustful background is a fundamental requirement for the provision of effective Ubiquitous Healthcare (UH) services. In this paper, the issue of trust evaluation of UH providers when they register with the UH environment is addressed. For that purpose a novel trust evaluation method is proposed, based on cloud theory and exploiting User Profile attributes. This theory mimics human thinking regarding trust evaluation and captures the fuzziness and randomness of this uncertain reasoning. Two case studies are investigated through simulation in MATLAB, in order to verify the effectiveness of the novel method. PMID:25570992
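The abstract does not give the algorithm, but trust work based on cloud theory commonly uses the forward normal cloud generator, in which a concept is described by expectation Ex, entropy En, and hyper-entropy He, and each "cloud drop" is a (value, membership) pair. A minimal sketch under that assumption (all parameter values invented):

```python
import math
import random

# Forward normal cloud generator sketch (standard cloud-model construct,
# not the paper's exact method): draw a randomized entropy En' ~ N(En, He^2),
# a drop value x ~ N(Ex, En'^2), and its membership degree mu.

def cloud_drops(ex, en, he, n, seed=0):
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        en_prime = abs(rng.gauss(en, he))   # randomized entropy
        x = rng.gauss(ex, en_prime)         # drop value
        mu = math.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))
        drops.append((x, mu))
    return drops

drops = cloud_drops(ex=0.8, en=0.1, he=0.01, n=500)
```

The hyper-entropy He controls how "fuzzy" the cloud is: with He = 0 the drops reduce to a plain Gaussian membership curve, while larger He scatters the memberships, capturing the randomness the abstract mentions.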

  17. An electrification mechanism of sand grains based on the diffuse double layer and Hertz contact theory

    NASA Astrophysics Data System (ADS)

    Xie, Li; Han, Kui; Ma, Yanping; Zhou, Jùn

    2013-09-01

The electrification of sand grains lifting off from a sand bed is investigated experimentally. Sand grains were found to carry charges comparable in magnitude with experimental results, and the charging is related to grain size, soil pH, relative humidity, and electric field. Based on the theory of the diffuse double layer (DDL) and Hertz contact theory, an electrification mechanism due to the breakup of the DDLs of sand grains is presented, and a formula that takes environmental conditions and grain parameters into consideration is obtained to calculate the charge-to-mass ratio of lift-off sand grains.

  18. Intervention mapping protocol for developing a theory-based diabetes self-management education program.

    PubMed

    Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin

    2015-01-01

Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from contents to behavior outcomes. This article describes the development process of the diabetes self-management program for older Koreans (DSME-OK) using the intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied an information-motivation-behavioral skills (IMB) model, and interventions were targeted at 3 determinants of health behavior change. Specific methods were selected to achieve each objective, guided by the IM protocol. As the final step, program evaluation was planned, including a pilot test. The DSME-OK was structured so that the 3 determinants of the IMB model were addressed in each session to achieve its behavior objectives. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol to develop a theory-based self-management program was beneficial in terms of providing a systematic guide to developing theory-based, behavior-outcome-focused health education programs. PMID:26062288

  19. A comparison of design variables for control theory based airfoil optimization

    NASA Technical Reports Server (NTRS)

    Reuther, James; Jameson, Antony

    1995-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work in the area it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using either the potential flow or the Euler equations with either a conformal mapping or a general coordinate system. We have also explored three-dimensional extensions of these formulations recently. The goal of our present work is to demonstrate the versatility of the control theory approach by designing airfoils using both Hicks-Henne functions and B-spline control points as design variables. The research also demonstrates that the parameterization of the design space is an open question in aerodynamic design.
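Hicks-Henne functions, one of the two design-variable families named in the abstract, have a standard closed form: a smooth bump that peaks at a chosen chordwise station and vanishes at both ends of the chord. A minimal sketch (the width parameter and the weighting scheme here are illustrative assumptions, not the paper's setup):

```python
import math

# Hicks-Henne "bump" design variable: b(x) = sin(pi * x^e)^w with
# e = ln(0.5)/ln(x_peak), so the bump peaks at x = x_peak in (0, 1)
# and is zero at the leading edge (x=0) and trailing edge (x=1).

def hicks_henne_bump(x, x_peak, width=3.0):
    """Bump value at chordwise station x in [0, 1], 0 < x_peak < 1."""
    exponent = math.log(0.5) / math.log(x_peak)
    return math.sin(math.pi * x ** exponent) ** width

def perturb(y_base, xs, weights, peaks):
    """Perturbed surface: baseline plus weighted bumps (weights = design vars)."""
    return [y + sum(w * hicks_henne_bump(x, p) for w, p in zip(weights, peaks))
            for x, y in zip(xs, y_base)]
```

Because each bump is local and smooth, a handful of weights parameterizes a rich family of profile shapes, which is why such functions are a convenient design space for gradient-based (control theory) optimization.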

  20. Robust stabilization control based on guardian maps theory for a longitudinal model of hypersonic vehicle.

    PubMed

    Liu, Yanbin; Liu, Mengying; Sun, Peihua

    2014-01-01

A typical model of a hypersonic vehicle has complicated dynamics such as unstable states, nonminimum phases, and strongly coupled input-output relations. As a result, designing a robust stabilization controller is essential to implement the anticipated tasks. This paper presents a robust stabilization controller based on guardian maps theory for a hypersonic vehicle. First, guardian maps theory is introduced to explain the constraint relations between open subsets of the complex plane and the eigenvalues of the state matrix of the closed-loop control system. Then, a general control structure based on guardian maps theory is proposed to achieve the expected design demands. Furthermore, the robust stabilization control law depending on the given general control structure is designed for the longitudinal model of the hypersonic vehicle. Finally, a simulation example is provided to verify the effectiveness of the proposed methods. PMID:24795535

  1. Classification of PolSAR image based on quotient space theory

    NASA Astrophysics Data System (ADS)

    An, Zhihui; Yu, Jie; Liu, Xiaomeng; Liu, Limin; Jiao, Shuai; Zhu, Teng; Wang, Shaohua

    2015-12-01

In order to improve classification accuracy, quotient space theory was applied to the classification of polarimetric SAR (PolSAR) images. Firstly, the Yamaguchi decomposition method is adopted to obtain the polarimetric characteristics of the image. At the same time, the Gray-Level Co-occurrence Matrix (GLCM) and Gabor wavelets are used to extract texture features. Secondly, combining the texture features and polarimetric characteristics, a Support Vector Machine (SVM) classifier is used for initial classification to establish different granularity spaces. Finally, according to quotient space granularity synthesis theory, we merge and reason over the different quotient spaces to obtain the comprehensive classification result. The method proposed in this paper is tested with L-band AIRSAR data of San Francisco Bay. The result shows that the comprehensive classification result based on quotient space theory is superior to the classification result of any single granularity space.
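The granularity-synthesis step is specific to the paper; as a loose illustration only, here is a hypothetical stand-in that fuses per-pixel labels produced in different granularity spaces (polarimetric, GLCM texture, Gabor texture) by majority vote. The class names and label maps are invented.

```python
from collections import Counter

# Hypothetical stand-in for quotient-space synthesis: merge per-pixel
# labels from several granularity spaces by majority vote, breaking
# ties deterministically in favor of the first space listed.

def merge_granularity_spaces(label_maps):
    """label_maps: list of equal-length label sequences, one per space."""
    merged = []
    for labels in zip(*label_maps):
        counts = Counter(labels)
        top = counts.most_common(1)[0][1]
        winner = next(l for l in labels if counts[l] == top)
        merged.append(winner)
    return merged

polar = ["water", "urban", "veg", "veg"]
glcm  = ["water", "urban", "urban", "veg"]
gabor = ["water", "veg", "veg", "veg"]
fused = merge_granularity_spaces([polar, glcm, gabor])
```

The paper's synthesis is richer than a vote (it reasons over the quotient-space structure), but the sketch shows the basic idea of combining coarse- and fine-granularity decisions into one map.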

  2. An automated integration-free path-integral method based on Kleinert's variational perturbation theory

    NASA Astrophysics Data System (ADS)

    Wong, Kin-Yiu; Gao, Jiali

    2007-12-01

    Based on Kleinert's variational perturbation (KP) theory [Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets, 3rd ed. (World Scientific, Singapore, 2004)], we present an analytic path-integral approach for computing the effective centroid potential. The approach enables the KP theory to be applied to any realistic systems beyond the first-order perturbation (i.e., the original Feynman-Kleinert [Phys. Rev. A 34, 5080 (1986)] variational method). Accurate values are obtained for several systems in which exact quantum results are known. Furthermore, the computed kinetic isotope effects for a series of proton transfer reactions, in which the potential energy surfaces are evaluated by density-functional theory, are in good accordance with experiments. We hope that our method could be used by non-path-integral experts or experimentalists as a "black box" for any given system.

  3. Experimental investigation and kinetic-theory-based model of a rapid granular shear flow

    NASA Astrophysics Data System (ADS)

    Wildman, R. D.; Martin, T. W.; Huntley, J. M.; Jenkins, J. T.; Viswanathan, H.; Fen, X.; Parker, D. J.

    An experimental investigation of an idealized rapidly sheared granular flow was performed to test the predictions of a model based on the kinetic theory of dry granular media. Glass ballotini beads were placed in an annular shear cell and the lower boundary rotated to induce a shearing motion in the bed. A single particle was tracked using the positron emission particle tracking (PEPT) technique, a method that determines the location of a particle through the triangulation of gamma photons emitted by a radioactive tracer particle. The packing fraction and velocity fields within the three-dimensional flow were measured and compared to the predictions of a model developed using the conservation and balance equations applicable to dissipative systems, and solved incorporating constitutive relations derived from kinetic theory. The comparison showed that kinetic theory is able to capture the general features of a rapid shear flow reasonably well over a wide range of shear rates and confining pressures.

  4. Robust Stabilization Control Based on Guardian Maps Theory for a Longitudinal Model of Hypersonic Vehicle

    PubMed Central

    Liu, Mengying; Sun, Peihua

    2014-01-01

    A typical model of hypersonic vehicle has the complicated dynamics such as the unstable states, the nonminimum phases, and the strong coupling input-output relations. As a result, designing a robust stabilization controller is essential to implement the anticipated tasks. This paper presents a robust stabilization controller based on the guardian maps theory for hypersonic vehicle. First, the guardian maps theories are provided to explain the constraint relations between the open subsets of complex plane and the eigenvalues of the state matrix of closed-loop control system. Then, a general control structure in relation to the guardian maps theories is proposed to achieve the respected design demands. Furthermore, the robust stabilization control law depending on the given general control structure is designed for the longitudinal model of hypersonic vehicle. Finally, a simulation example is provided to verify the effectiveness of the proposed methods. PMID:24795535

  5. Theory of Infrared Hall Conductivity Based on the Fermi Liquid Theory: Analysis of High-Tc Superconductors

    NASA Astrophysics Data System (ADS)

    Kontani, Hiroshi

    2007-07-01

We study optical Hall conductivity for high-Tc superconductors on the basis of microscopic Fermi liquid theory. Current vertex corrections (CVCs) are correctly taken into account to satisfy the conservation laws, and are performed for the first time for optical conductivities based on the fluctuation-exchange (FLEX) approximation. We find that (I) the CVC emphasizes the ω-dependence of σxy(ω) significantly when the antiferromagnetic (AF) fluctuations are strong. For this reason, the relation σxy(ω) ∼ {σxx(ω)}², which is satisfied in the extended Drude model given by the relaxation time approximation (RTA), is completely violated for a wide range of frequencies. Consequently, (II) the optical Hall coefficient RH(ω) strongly depends on ω below infrared frequencies, which is consistent with experimental observations. Interestingly, although σxx(ω) does not follow a simple Drude form due to the ω-dependence of the relaxation time τ, ρH(ω) and σxy(ω) follow approximate simple Drude forms since the ω-dependences of τ and the CVC almost cancel each other out. In conclusion, anomalous optical transport phenomena in high-Tc superconductors, which had frequently been assumed to be evidence of the breakdown of the Fermi liquid state, can be well explained in terms of the nearly AF Fermi liquid once the CVC is taken into account.

  6. Development and evaluation of a theory-based physical activity guidebook for breast cancer survivors.

    PubMed

    Vallance, Jeffrey K; Courneya, Kerry S; Taylor, Lorian M; Plotnikoff, Ronald C; Mackey, John R

    2008-04-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert judges completed the Maine Area Health Education Center's 18-item attribute checklist for evaluating written health information. Judges indicated that the PA guidebook achieved desirable attributes for the suitability and appropriateness of the guidebook. A subset of TPB expert judges completed items designed to determine the degree of match between the guidebook content and the respective TPB components. Mean item-content relevance ratings indicated at least a "very good match" between the PA guidebook content and the keyed TPB domains. Theoretically based PA information may be an effective strategy for increasing PA in breast cancer survivors at the population level. PMID:16861593

  7. Evaluating clinical simulations for learning procedural skills: a theory-based approach.

    PubMed

    Kneebone, Roger

    2005-06-01

    Simulation-based learning is becoming widely established within medical education. It offers obvious benefits to novices learning invasive procedural skills, especially in a climate of decreasing clinical exposure. However, simulations are often accepted uncritically, with undue emphasis being placed on technological sophistication at the expense of theory-based design. The author proposes four key areas that underpin simulation-based learning, and summarizes the theoretical grounding for each. These are (1) gaining technical proficiency (psychomotor skills and learning theory, the importance of repeated practice and regular reinforcement), (2) the place of expert assistance (a Vygotskian interpretation of tutor support, where assistance is tailored to each learner's needs), (3) learning within a professional context (situated learning and contemporary apprenticeship theory), and (4) the affective component of learning (the effect of emotion on learning). The author then offers four criteria for critically evaluating new or existing simulations, based on the theoretical framework outlined above. These are: (1) Simulations should allow for sustained, deliberate practice within a safe environment, ensuring that recently-acquired skills are consolidated within a defined curriculum which assures regular reinforcement; (2) simulations should provide access to expert tutors when appropriate, ensuring that such support fades when no longer needed; (3) simulations should map onto real-life clinical experience, ensuring that learning supports the experience gained within communities of actual practice; and (4) simulation-based learning environments should provide a supportive, motivational, and learner-centered milieu which is conducive to learning. PMID:15917357

  8. Buckling analysis of functionally graded nanobeams based on a nonlocal third-order shear deformation theory

    NASA Astrophysics Data System (ADS)

    Rahmani, O.; Jandaghian, A. A.

    2015-06-01

In this paper, a general third-order beam theory that accounts for nanostructure-dependent size effects and two-constituent material variation through the nanobeam thickness, i.e., a functionally graded material (FGM) beam, is presented. The material properties of FG nanobeams are assumed to vary through the thickness according to a power law. A detailed derivation of the equations of motion based on Eringen nonlocal theory using Hamilton's principle is presented, and a closed-form solution is derived for the buckling behavior of the new model with various boundary conditions. The nonlocal elasticity theory includes a material length scale parameter that can capture the size effect in a functionally graded material. The proposed model is efficient in predicting the shear effect in FG nanobeams by applying third-order shear deformation theory. The proposed approach is validated by comparing the obtained results with benchmark results available in the literature. In the following, a parametric study is conducted to investigate the influences of the length scale parameter, gradient index, and length-to-thickness ratio on the buckling of FG nanobeams, and the improvement of the nonlocal third-order shear deformation theory over the classical (local) beam model is shown. It is found that the length scale parameter is crucial in studying the stability behavior of nanobeams.
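As background (stated from standard nonlocal-elasticity sources, not from the paper, which develops a more general third-order theory), Eringen's nonlocal constitutive relation in differential form and its best-known consequence, the critical buckling load of a simply supported nonlocal Euler-Bernoulli beam, read:

```latex
\left(1 - (e_0 a)^2 \nabla^2\right)\sigma_{ij} = C_{ijkl}\,\varepsilon_{kl},
\qquad
P_{\mathrm{cr}} = \frac{\pi^2 E I / L^2}{1 + (e_0 a)^2 \left(\pi / L\right)^2}
```

Here $e_0 a$ is the material length scale parameter the abstract refers to; as $e_0 a \to 0$ the classical Euler load $\pi^2 E I / L^2$ is recovered, and increasing $e_0 a$ lowers the predicted buckling load.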

  9. Personality and Psychopathology: a Theory-Based Revision of Eysenck’s PEN Model

    PubMed Central

    van Kampen, Dirk

    2009-01-01

    The principal aim of this paper is to investigate whether it is possible to create a personality taxonomy of clinical relevance out of Eysenck's original PEN model by repairing the various shortcomings that can be noted in Eysenck's personality theory, particularly in relation to P or Psychoticism. In addressing three approaches that have been followed to answer the question 'which personality factors are basic?', arguments are listed to show that the theory-informed approach in particular, originally defended by Eysenck, may lead to scientific progress. However, given the many deficiencies in the nomological network surrounding P, the peculiar situation arises that we adhere to Eysenck's theory-informed methodology yet criticize his theory. These arguments and criticisms led to the replacement of P by three orthogonal and theory-based factors, Insensitivity (S), Orderliness (G), and Absorption (A), which, together with the dimensions E (Extraversion) and N (Neuroticism) retained from Eysenck's PEN model, appear to give a comprehensive account of the main vulnerability factors in schizophrenia and affective disorders, as well as in other psychopathological conditions. PMID:20498694

  10. Efficacy of theory-based interventions to promote physical activity. A meta-analysis of randomised controlled trials.

    PubMed

    Gourlan, M; Bernard, P; Bortolon, C; Romain, A J; Lareyre, O; Carayol, M; Ninot, G; Boiché, J

    2016-03-01

    Implementing theory-based interventions is an effective way to influence physical activity (PA) behaviour in the population. This meta-analysis aimed to (1) determine the global effect of theory-based randomised controlled trials dedicated to the promotion of PA among adults, (2) measure the actual efficacy of interventions against their theoretical objectives and (3) compare the efficacy of single- versus combined-theory interventions. A systematic search through databases and review articles was carried out. Our results show that theory-based interventions (k = 82) significantly impact the PA behaviour of participants (d = 0.31, 95% CI [0.24, 0.37]). While moderation analyses revealed no efficacy difference between theories, interventions based on a single theory (d = 0.35; 95% CI [0.26, 0.43]) reported a higher impact on PA behaviour than those based on a combination of theories (d = 0.21; 95% CI [0.11, 0.32]). In spite of the global positive effect of theory-based interventions on PA behaviour, further research is required to better identify the specificities, overlaps or complementarities of the components of interventions based on relevant theories. PMID:25402606
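The generic machinery behind a summary effect such as d = 0.31 with its confidence interval is inverse-variance pooling of per-study effect sizes; a minimal fixed-effect sketch follows (the numbers in the usage note are invented for illustration, and the paper's actual model may be random-effects):

```python
def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of per-study effect
    sizes (e.g. Cohen's d). Returns the pooled effect and its
    95% confidence interval."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5   # standard error of the pooled effect
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

For example, pooling three hypothetical trials `pooled_effect([0.35, 0.21, 0.40], [0.010, 0.020, 0.015])` returns a weighted average lying between the smallest and largest study effects.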

  11. Asymmetric Invisibility Cloaking Theory Based on the Concept of Effective Electromagnetic Fields for Photons

    NASA Astrophysics Data System (ADS)

    Amemiya, Tomo; Taki, Masato; Kanazawa, Toru; Arai, Shigehisa

    2014-03-01

    The asymmetric invisibility cloak is a special cloak with unidirectional transparency; that is, a person in the cloak should not be seen from the outside but should be able to see the outside. Existing theories of designing invisibility cloaks cannot be used for asymmetric cloaking because they are based on the transformation optics that uses Riemannian metric tensor independent of direction. To overcome this problem, we propose introducing directionality into invisibility cloaking. Our theory is based on ``the theory of effective magnetic field for photons'' proposed by Stanford University.[2] To realize asymmetric cloaking, we have extended the Stanford theory to add the concept of ``effective electric field for photons.'' The effective electric and the magnetic field can be generated using a photonic resonator lattice, which is a kind of metamaterial. The Hamiltonian for photons in these fields has a similar form to that of the Hamiltonian for a charged particle in an electromagnetic field. An incident photon therefore experiences a ``Lorentz-like'' and a ``Coulomb-like'' force and shows asymmetric movement depending on its travelling direction. We show the procedure of designing actual invisibility cloaks using the photonic resonator lattice and confirm their operation with the aid of computer simulation. This work was supported in part by the MEXT; JSPS KAKENHI Grant Numbers #24246061, #24656046, #25420321, #25420322.

  12. The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission.

    PubMed

    Kiani, Mehdi; Ghovanloo, Maysam

    2012-09-01

    Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory. PMID:24683368
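The two-coil steady-state efficiency from reflected load theory can be sketched as follows (this is the standard form for a pair of capacitively loaded inductors; the symbols k, Q1, Q2, QL follow common usage and are not necessarily the paper's notation):

```python
def pte_two_coil(k, Q1, Q2, QL):
    """Power transfer efficiency of a 2-coil inductive link from
    reflected load theory. Q2L is the loaded quality factor of the
    secondary; the first factor is the primary-loop efficiency and
    the second the secondary-loop efficiency."""
    Q2L = Q2 * QL / (Q2 + QL)      # loaded Q of coil 2
    fom = k ** 2 * Q1 * Q2L        # link figure of merit
    return (fom / (1.0 + fom)) * (Q2L / QL)
```

As expected, efficiency rises monotonically with coupling k and with the coil quality factors, which is why the abstract notes that coupled-mode theory is only applicable in the high-Q regime.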

  13. Applying trait-based models to achieve functional targets for theory-driven ecological restoration.

    PubMed

    Laughlin, Daniel C

    2014-07-01

    Manipulating community assemblages to achieve functional targets is a key component of restoring degraded ecosystems. The response-and-effect trait framework provides a conceptual foundation for translating restoration goals into functional trait targets, but a quantitative framework has been lacking for translating trait targets into assemblages of species that practitioners can actually manipulate. This study describes new trait-based models that can be used to generate ranges of species abundances to test theories about which traits, which trait values and which species assemblages are most effective for achieving functional outcomes. These models are generalisable, flexible tools that can be widely applied across many terrestrial ecosystems. Examples illustrate how the framework generates assemblages of indigenous species to (1) achieve desired community responses by applying the theories of environmental filtering, limiting similarity and competitive hierarchies, or (2) achieve desired effects on ecosystem functions by applying the theories of mass ratios and niche complementarity. Experimental applications of this framework will advance our understanding of how to set functional trait targets to achieve the desired restoration goals. A trait-based framework provides restoration ecology with a robust scaffold on which to apply fundamental ecological theory to maintain resilient and functioning ecosystems in a rapidly changing world. PMID:24766299

  14. Torsional instability of carbon nano-peapods based on the nonlocal elastic shell theory

    NASA Astrophysics Data System (ADS)

    Asghari, M.; Rafati, J.; Naghdabadi, R.

    2013-01-01

    In this paper a shell formulation is proposed for analyzing the torsional instability of carbon nano-peapods (CNPs), i.e., the hybrid structures composed of C60 fullerenes encapsulated inside carbon nanotubes (CNTs), based on the nonlocal elasticity theory. The nonlocal elasticity theory, as a well-known non-classical continuum theory, is capable to capture small scale effects which appear due to the discontinuities in nano-structures. Based on the derived formulation, the critical torsional moments for a pristine (10,10) CNT and C60@(10,10) CNP are investigated as case studies. The results for the (10,10) CNT are compared with those of the available molecular dynamics simulations in the literature, and accordingly the appropriate value of the small scale coefficient appearing in the constitutive equations of the nonlocal theory is estimated for CNTs. Then, the critical torsional moment for the C60@(10,10) CNP is predicted. It is observed that the presence of the encapsulated C60 fullerenes inside the (10,10) CNT causes an increase in the torsional instability resistance of the CNT more than 100%.

  15. The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission

    PubMed Central

    Kiani, Mehdi; Ghovanloo, Maysam

    2014-01-01

    Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory. PMID:24683368

  16. Are node-based and stem-based clades equivalent? Insights from graph theory.

    PubMed

    Martin, Jeremy; Blackburn, David; Wiley, E O

    2010-01-01

    Despite the prominence of "tree-thinking" among contemporary systematists and evolutionary biologists, the biological meaning of different mathematical representations of phylogenies may still be muddled. We compare two basic kinds of discrete mathematical models used to portray phylogenetic relationships among species and higher taxa: stem-based trees and node-based trees. Each model is a tree in the sense that is commonly used in mathematics; the difference between them lies in the biological interpretation of their vertices and edges. Stem-based and node-based trees carry exactly the same information and the biological interpretation of each is similar. Translation between these two kinds of trees can be accomplished by a simple algorithm, which we provide. With the mathematical representation of stem-based and node-based trees clarified, we argue for a distinction between types of trees and types of names. Node-based and stem-based trees contain exactly the same information for naming clades. However, evolutionary concepts, such as monophyly, are represented as different mathematical substructures in the two models. For a given stem-based tree, one should employ stem-based names, whereas for a given node-based tree, one should use node-based names, but applying a node-based name to a stem-based tree is not logical because node-based names cannot exist on a stem-based tree and vice versa. Authors might use node-based and stem-based concepts of monophyly for the same representation of a phylogeny, yet, if so, they must recognize that such a representation differs from the graphical models used for computing in phylogenetic systematics. PMID:21113336
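The abstract mentions a simple translation algorithm but does not reproduce it; a minimal sketch of one direction, reading off for each edge of a node-based tree the clade its stem subtends, might look like this (the encoding of the tree as a child dictionary is an assumption made for illustration):

```python
def stem_clades(children):
    """Map each edge (parent, child) of a node-based tree to the set
    of terminal taxa descended from the child vertex, i.e. the clade
    that the corresponding stem subtends in a stem-based reading.
    `children` maps each internal vertex to its list of children."""
    def leaves(v):
        kids = children.get(v, [])
        if not kids:                     # terminal taxon
            return frozenset([v])
        return frozenset().union(*(leaves(c) for c in kids))
    return {(parent, child): leaves(child)
            for parent, kids in children.items() for child in kids}
```

For the small tree `{"root": ["X", "A"], "X": ["B", "C"]}`, the stem ("root", "X") subtends the clade {B, C}, matching the intuition that stem-based names attach to branches while node-based names attach to vertices.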

  17. Membrane-Based Characterization of a Gas Component — A Transient Sensor Theory

    PubMed Central

    Lazik, Detlef

    2014-01-01

    Based on a multi-gas solution-diffusion problem for a dense symmetrical membrane this paper presents a transient theory of a planar, membrane-based sensor cell for measuring gas from both initial conditions: dynamic and thermodynamic equilibrium. Using this theory, the ranges for which previously developed, simpler approaches are valid will be discussed; these approaches are of vital interest for membrane-based gas sensor applications. Finally, a new theoretical approach is introduced to identify varying gas components by arranging sensor cell pairs resulting in a concentration independent gas-specific critical time. Literature data for the N2, O2, Ar, CH4, CO2, H2 and C4H10 diffusion coefficients and solubilities for a polydimethylsiloxane membrane were used to simulate gas specific sensor responses. The results demonstrate the influence of (i) the operational mode; (ii) sensor geometry and (iii) gas matrices (air, Ar) on that critical time. Based on the developed theory the case-specific suitable membrane materials can be determined and both operation and design options for these sensors can be optimized for individual applications. The results of mixing experiments for different gases (O2, CO2) in a gas matrix of air confirmed the theoretical predictions. PMID:24608004
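The paper's sensor-pair "critical time" comes from the full transient solution and is not given in the abstract; a much simpler related quantity, the classical diffusion time lag of a dense membrane (t_lag = L²/6D), already produces gas-specific times from tabulated diffusion coefficients. The sketch below uses rough illustrative PDMS diffusivities, not the paper's data:

```python
def time_lag(thickness, D):
    """Classical diffusion time lag [s] of a dense membrane of the
    given thickness [m] and diffusion coefficient D [m^2/s]."""
    return thickness ** 2 / (6.0 * D)

# Rough illustrative gas diffusivities in PDMS, m^2/s (assumed values).
D_pdms = {"O2": 3.4e-9, "CO2": 2.2e-9, "N2": 3.0e-9}
lags = {gas: time_lag(5e-4, D) for gas, D in D_pdms.items()}
```

Slower-diffusing gases give longer lags, which is the qualitative basis for identifying a gas component by a characteristic time.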

  18. In-medium η′ mass and η′N interaction based on chiral effective theory

    NASA Astrophysics Data System (ADS)

    Sakai, Shuntaro; Jido, Daisuke

    2013-12-01

    The in-medium η′ mass and the η′N interaction are investigated in an effective theory based on the linear realization of the SU(3) chiral symmetry. We find that a large part of the η′ mass is generated by the spontaneous breaking of chiral symmetry through the UA(1) anomaly. As a consequence of this observation, the η′ mass is reduced in nuclear matter where chiral symmetry is partially restored. In our model, the mass reduction is found to be 80 MeV at the saturation density. Estimating the η′N interaction based on the same effective theory, we find that the η′N interaction in the scalar channel is sufficiently attractive to form a bound state in the η′N system with a binding energy of several MeV. We discuss the origin of the attraction by emphasizing the special role of the σ meson in the linear sigma model for the mass generation of η′ and N.

  19. Attachment-based family therapy for depressed and suicidal adolescents: theory, clinical model and empirical support.

    PubMed

    Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne

    2015-01-01

    Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support. PMID:25778674

  20. [Research on ECG signal analysis based on the cloud model theory].

    PubMed

    Li, Xin; Hong, Wenxue; Wang, Xiuqing; Wang, Huini

    2011-02-01

    The characteristics of electrocardiogram (ECG) signals are fuzzy and random, which makes automatic analysis and diagnosis difficult. To solve this problem, an uncertainty transformation model, the cloud model, which fuses qualitative and quantitative information, was applied to the analysis of ECG signals. Because the model captures both fuzziness and randomness, it is well suited to automatic ECG analysis and diagnosis. Based on the theory of cloud transform and comprehensive cloud, clustering of the ECG signal was performed. Furthermore, the clinical experience of experts was summarized as classification rules based on the same theory. The experimental data were taken from the MIT/BIH database. The results agreed closely with the experts' analyses, and the combined qualitative and quantitative description matched the experts' judgments well. It is concluded that the method is an effective approach to ECG signal analysis. PMID:21485177
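The abstract does not give the algorithm; the basic building block of cloud-model computation, the forward normal cloud generator, can be sketched as follows (the parameter values and drop count in the test are illustrative, not taken from the paper):

```python
import math
import random

def normal_cloud(Ex, En, He, n):
    """Forward normal cloud generator: produce n cloud drops
    (x, membership) for a qualitative concept with expectation Ex,
    entropy En, and hyper-entropy He."""
    drops = []
    for _ in range(n):
        En_ = abs(random.gauss(En, He)) or En   # guard against a zero draw
        x = random.gauss(Ex, En_)               # drop position
        mu = math.exp(-(x - Ex) ** 2 / (2.0 * En_ ** 2))  # membership
        drops.append((x, mu))
    return drops
```

Each drop carries a membership degree in (0, 1], so the generator expresses both the fuzziness (bell-shaped membership) and the randomness (stochastic drop positions) that the abstract attributes to ECG features.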

  1. Distinguishing ability analysis of compressed sensing radar imaging based on information theory model

    NASA Astrophysics Data System (ADS)

    Jiang, Hai; Zhang, Bingchen; Lin, Yueguan; Hong, Wen; Wu, Yirong

    2011-11-01

    Recent theory of compressed sensing (CS) has been widely used in many application areas. In this paper, we concentrate on CS in radar and analyze the distinguishing ability of CS radar imaging based on an information theory model. The information content contained in the CS radar echoes is analyzed by simplifying the information transmission channel as a parallel Gaussian channel, and the relationship among the signal-to-noise ratio (SNR) of the echo signal, the number of required samples, the length of the sparse targets, and the distinguishing level of the radar image is obtained. Based on this result, we introduce the distinguishing ability of the CS radar image and derive some of its properties. An experiment with real data from the IECAS advanced scanning two-dimensional railway observation (ASTRO) system demonstrates our conclusions.
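A back-of-envelope version of the two quantities the analysis relates, the per-sample capacity of a real Gaussian channel and the usual order-of-magnitude CS sample count, can be sketched as below (the constant C in the sample bound is an assumed rule of thumb, not the paper's derived value):

```python
import math

def capacity_per_sample(snr_db):
    """Capacity of a real additive Gaussian channel, bits per sample."""
    snr = 10.0 ** (snr_db / 10.0)
    return 0.5 * math.log2(1.0 + snr)

def cs_sample_count(N, K, C=2.0):
    """Rule-of-thumb number of CS measurements for a K-sparse scene
    of length N: M ~ C * K * ln(N / K), with C an assumed constant."""
    return math.ceil(C * K * math.log(N / K))
```

Raising the per-sample SNR raises the information delivered per echo sample, so fewer samples are needed for a given distinguishing level, which is the trade-off the abstract describes.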

  2. Modified polarized geometrical attenuation model for bidirectional reflection distribution function based on random surface microfacet theory.

    PubMed

    Liu, Hong; Zhu, Jingping; Wang, Kai

    2015-08-24

    The geometrical attenuation model given by Blinn has been widely used in geometrical optics bidirectional reflectance distribution function (BRDF) models. Blinn's geometrical attenuation model, based on a symmetrical V-groove assumption and ray scalar theory, causes obvious inaccuracies in BRDF curves and neglects the effects of polarization. To address these issues, a modified polarized geometrical attenuation model based on random surface microfacet theory is presented by combining masking and shadowing effects with polarization effects. The p-polarized, s-polarized and unpolarized geometrical attenuation functions are given in separate expressions and are validated with experimental data from two samples. The results show that the modified polarized geometrical attenuation function achieves better physical rationality, improves the precision of the BRDF model, and broadens its applicability to different polarization states. PMID:26368247
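The Blinn attenuation term that the paper modifies is the classic min-of-three expression from microfacet BRDF theory; a scalar (unpolarized) sketch follows, with all direction vectors assumed to be unit 3-vectors:

```python
def blinn_G(N, L, V):
    """Blinn's geometrical attenuation for microfacet BRDFs.
    N: surface normal, L: light direction, V: view direction
    (all unit 3-vectors). Returns min(1, masking, shadowing)."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    def norm(a):
        m = sum(x * x for x in a) ** 0.5
        return tuple(x / m for x in a)
    H = norm(tuple(l + v for l, v in zip(L, V)))   # half vector
    nh, nv, nl, vh = dot(N, H), dot(N, V), dot(N, L), dot(V, H)
    return min(1.0, 2 * nh * nv / vh, 2 * nh * nl / vh)
```

At normal incidence and viewing the term is exactly 1 (no masking or shadowing); the paper's contribution is replacing this scalar V-groove form with polarization-aware functions on random microfacets.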

  3. Theory-based evaluation of a comprehensive Latino education initiative: an interactive evaluation approach.

    PubMed

    Nesman, Teresa M; Batsche, Catherine; Hernandez, Mario

    2007-08-01

    Latino student access to higher education has received significant national attention in recent years. This article describes a theory-based evaluation approach used with ENLACE of Hillsborough, a 5-year project funded by the W.K. Kellogg Foundation for the purpose of increasing Latino student graduation from high school and college. Theory-based evaluation guided planning, implementation as well as evaluation through the process of developing consensus on the Latino population of focus, adoption of culturally appropriate principles and values to guide the project, and identification of strategies to reach, engage, and impact outcomes for Latino students and their families. The approach included interactive development of logic models that focused the scope of interventions and guided evaluation designs for addressing three stages of the initiative. Challenges and opportunities created by the approach are discussed, as well as ways in which the initiative impacted Latino students and collaborating educational institutions. PMID:17689332

  4. Design of Flexure-based Precision Transmission Mechanisms using Screw Theory

    SciTech Connect

    Hopkins, J B; Panas, R M

    2011-02-07

    This paper enables the synthesis of flexure-based transmission mechanisms that possess multiple decoupled inputs and outputs of any type (e.g. rotations, translations, and/or screw motions), which are linked by designer-specified transmission ratios. A comprehensive library of geometric shapes is utilized from which every feasible concept that possesses the desired transmission characteristics may be rapidly conceptualized and compared before an optimal concept is selected. These geometric shapes represent the rigorous mathematics of screw theory and uniquely link a body's desired motions to the flexible constraints that enable those motions. This paper's impact is most significant to the design of nano-positioners, microscopy stages, optical mounts, and sensors. A flexure-based microscopy stage was designed, fabricated, and tested to demonstrate the utility of the theory.

  5. Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping

    PubMed Central

    Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E

    2014-01-01

    Fear arousal—vividly showing people the negative health consequences of life-endangering behaviors—is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Intervention Mapping (IM) protocol helps program planners to optimize chances for effectiveness. IM describes the intervention development process in six steps: (1) assessing the problem and community capacities, (2) specifying program objectives, (3) selecting theory-based intervention methods and practical applications, (4) designing and organizing the program, (5) planning, adoption, and implementation, and (6) developing an evaluation plan. Authors who used IM indicated that it helped in bringing the development of interventions to a higher level. PMID:24811880

  6. A theory-based logic model for innovation policy and evaluation.

    SciTech Connect

    Jordan, Gretchen B.

    2010-04-01

    Current policy and program rationale, objectives, and evaluation use a fragmented picture of the innovation process. This presents a challenge since in the United States officials in both the executive and legislative branches of government see innovation, whether that be new products or processes or business models, as the solution to many of the problems the country faces. The logic model is a popular tool for developing and describing the rationale for a policy or program and its context. This article sets out to describe generic logic models of both the R&D process and the diffusion process, building on existing theory-based frameworks. Then a combined, theory-based logic model for the innovation process is presented. Examples of the elements of the logic, each a possible leverage point or intervention, are provided, along with a discussion of how this comprehensive but simple model might be useful for both evaluation and policy development.

  7. Eight myths on motivating social services workers: theory-based perspectives.

    PubMed

    Latting, J K

    1991-01-01

    A combination of factors has made formal motivational and reward systems rare in human service organizations generally and virtually non-existent in social service agencies. The author reviews eight of these myths by reference to eight motivational theories which refute them: need theory, expectancy theory, feedback theory, equity theory, reinforcement theory, cognitive evaluation theory, goal setting theory, and social influence theory. Although most of these theories have been developed and applied in the private sector, relevant research has also been conducted in social service agencies. The author concludes with a summary of guidelines suggested by the eight theories for motivating human service workers. PMID:10114292

  8. Design of “Magnetic Resonance Type” WPT Systems Based on Filter Theory

    NASA Astrophysics Data System (ADS)

    Awai, Ikuo; Ishizaki, Toshio

    A wireless power transfer system made of magnetically coupled resonators is designed for a 0 Ω power source. The design is based on BPF theory, which imposes a more restrictive condition than the conventional power-factor condition but gives wider operating bandwidth and higher power transfer efficiency. The present paper derives the condition analytically and shows some useful design examples, including 2-stage as well as 3-stage systems, and a system with a loop coil to transform the load impedance.

  9. Equilibrium theory-based design of simulated moving bed processes under reduced purity requirements: linear isotherms.

    PubMed

    Rajendran, Arvind

    2008-03-28

    The design of simulated moving bed processes under reduced purity requirements for systems whose isotherm is linear is considered. Based on the equilibrium theory of chromatography, explicit equations to uniquely identify the separation region that will ensure specified extract and raffinate purities are derived. The identification of the region requires only the knowledge of Henry constants of the solutes, the concentration of the solutes in the feed and the purity specifications. These results are validated using numerical simulations. PMID:18281052
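For linear isotherms, the complete-separation region of equilibrium (triangle) theory reduces to simple inequalities on the section flow-rate ratios m1..m4; the reduced-purity region of this paper enlarges that region. A sketch of the classical check (notation follows standard SMB literature, with HA > HB and A the more-retained, extract component):

```python
def complete_separation(m1, m2, m3, m4, HA, HB):
    """Triangle-theory conditions for complete separation of a binary
    mixture with linear isotherms and Henry constants HA > HB."""
    assert HA > HB
    return (m1 > HA) and (HB < m2 < m3 < HA) and (m4 < HB)
```

Only the Henry constants and the flow-rate ratios are needed, mirroring the abstract's point that the design requires just the Henry constants, feed concentrations, and purity specifications.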

  10. Microvibration Attenuation based on H∞/LPV Theory for High Stability Space Missions

    NASA Astrophysics Data System (ADS)

    Preda, Valentin; Cieslak, Jerome; Henry, David; Bennani, Samir; Falcoz, Alexandre

    2015-11-01

    This paper presents an LPV (Linear Parameter Varying) solution for a mixed passive-active architecture used to mitigate the microvibrations generated by reaction wheels in satellites. In particular, H∞/LPV theory is used to mitigate low frequency disturbances, the current baseline for high frequency microvibration mitigation being based on elastomer materials. The issue of multiple harmonic microvibrations is also investigated. Simulation results from a test benchmark provided by Airbus Defence and Space demonstrate the potential of the proposed method.

  11. Applying Educational Theory to Simulation-Based Training and Assessment in Surgery.

    PubMed

    Chauvin, Sheila W

    2015-08-01

    Considerable progress has been made regarding the range of simulator technologies and simulation formats. Similarly, results from research in human learning and behavior have facilitated the development of best practices in simulation-based training (SBT) and surgical education. Today, SBT is a common curriculum component in surgical education that can significantly complement clinical learning, performance, and patient care experiences. Beginning with important considerations for selecting appropriate forms of simulation, several relevant educational theories of learning are described. PMID:26210964

  12. Ultrasonic Field Computation into Multilayered Composite Materials Using a Homogenization Method Based on Ray Theory

    NASA Astrophysics Data System (ADS)

    Deydier, S.; Gengembre, N.; Calmon, P.; Mengeling, V.; Pétillon, O.

    2005-04-01

    The simulation of ultrasonic NDT of carbon-fiber-reinforced epoxy composites (CFRP) is an important challenge for the aircraft industry. In a previous article, we proposed to evaluate the field radiated into such components by means of a homogenization method coupled to the pencil model implemented in CIVA software. Following the same goals, an improvement is proposed here through the development of an original homogenization procedure based on ray theory.

  13. Predicting magnetorheological fluid flow behavior using a multiscale kinetic theory-based model

    NASA Astrophysics Data System (ADS)

    Mahboob, Monon; Ahmadkhanlou, Farzad; Kagarise, Christopher; Washington, Gregory; Bechtel, Stephen; Koelling, Kurt

    2009-03-01

    Magnetorheological (MR) fluids have rheological properties, such as viscosity and yield stress, that can be altered by an external magnetic field. The design of novel devices utilizing MR fluid behavior in multi-degree of freedom applications requires three-dimensional models characterizing the coupling of magnetic behavior to mechanical behavior in MR fluids. A 3-D MR fluid model based on multiscale kinetic theory is presented. The kinetic theory-based model relates macroscale MR fluid behavior to a first-principle description of magnetomechanical coupling at the microscale. A constitutive relation is also proposed that accounts for the various forces transmitted through the fluid. This model accounts for the viscous drag on the spherical particles as well as Brownian forces. Interparticle forces due to magnetization and external magnetic forces applied to ferrous particles are considered. The tunable rheological properties of the MR fluids are studied using a MR rheological instrument. High and low viscosity carrier fluids along with small and large carbonyl iron particles are used to make and study the behavior of four different MR fluids. Experiments measuring steady and dynamic oscillatory shear response under a range of magnetic field strengths are performed. The rheological properties of the MR fluid samples are investigated and compared to the proposed kinetic theory-based model. The storage (G') and loss (G") moduli of the MR fluids are studied as well.
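At the macroscale, field-dependent MR response is often summarized by a Bingham-plastic law, with shear stress equal to a field-dependent yield stress plus a Newtonian viscous term; a sketch with an assumed power-law field dependence (the coefficient and exponent below are illustrative, not fitted to the paper's fluids):

```python
def bingham_stress(gamma_dot, H, eta=0.1, alpha=30.0, beta=1.5):
    """Shear stress [Pa] of a Bingham-plastic MR fluid model:
    field-dependent yield stress tau_y = alpha * H**beta plus a
    Newtonian viscous term eta * gamma_dot. All parameter values
    are assumed, for illustration only."""
    tau_y = alpha * H ** beta
    return tau_y + eta * gamma_dot
```

Stress grows with both the applied field H and the shear rate, which is the tunability the rheometer experiments in the abstract probe; the paper's multiscale kinetic model derives such behavior from particle-level forces rather than assuming it.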

  14. Three new branched chain equations of state based on Wertheim's perturbation theory

    NASA Astrophysics Data System (ADS)

    Marshall, Bennett D.; Chapman, Walter G.

    2013-05-01

    In this work, we present three new branched chain equations of state (EOS) based on Wertheim's perturbation theory. The first represents a slightly approximate general branched chain solution of Wertheim's second order perturbation theory (TPT2) for athermal hard chains, and the second represents the extension of first order perturbation theory with a dimer reference fluid (TPT1-D) to branched athermal hard chain molecules. Each athermal branched chain EOS was shown to give improved results over their linear counterparts when compared to simulation data for branched chain molecules with the branched TPT1-D EOS being the most accurate. Further, it is shown that the branched TPT1-D EOS can be extended to a Lennard-Jones dimer reference system to obtain an equation of state for branched Lennard-Jones chains. The theory is shown to accurately predict the change in phase diagram and vapor pressure which results from branching as compared to experimental data for n-octane and corresponding branched isomers.
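The linear-chain baseline that these branched equations of state generalize is Wertheim's first-order theory (TPT1) for athermal hard chains; a sketch using the Carnahan-Starling hard-sphere reference, in its standard textbook form:

```python
def z_hard_chain(m, eta):
    """TPT1 compressibility factor of an athermal chain of m tangent
    hard spheres at packing fraction eta, built on the
    Carnahan-Starling (CS) reference:
      Z = m*Z_CS - (m - 1)*(1 + eta * d ln g_CS(sigma) / d eta),
    where g_CS(sigma) = (1 - eta/2) / (1 - eta)**3 is the CS
    contact value of the pair correlation function."""
    z_cs = (1 + eta + eta ** 2 - eta ** 3) / (1 - eta) ** 3
    dlng = 3.0 / (1 - eta) - 0.5 / (1 - 0.5 * eta)   # d ln g / d eta
    return m * z_cs - (m - 1) * (1 + eta * dlng)
```

Setting m = 1 recovers the pure hard-sphere result, and longer chains are stiffer (larger Z) at the same packing fraction; the paper's TPT2 and TPT1-D branched versions refine exactly this construction.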

  15. Energy decomposition analysis based on a block-localized wavefunction and multistate density functional theory

    PubMed Central

    Bao, Peng

    2013-01-01

    An interaction energy decomposition analysis method based on the block-localized wavefunction (BLW-ED) approach is described. The first main feature of the BLW-ED method is that it combines concepts of valence bond and molecular orbital theories such that the intermediate and physically intuitive electron-localized states are variationally optimized by self-consistent field calculations. Furthermore, the block-localization scheme can be used both in wave function theory and in density functional theory, providing a useful tool to gain insights on intermolecular interactions that would otherwise be difficult to obtain using the delocalized Kohn–Sham DFT. These features allow broad applications of the BLW method to energy decomposition (BLW-ED) analysis for intermolecular interactions. In this perspective, we outline theoretical aspects of the BLW-ED method, and illustrate its applications in hydrogen-bonding and π–cation intermolecular interactions as well as metal–carbonyl complexes. Future prospects on the development of a multistate density functional theory (MSDFT) are presented, making use of block-localized electronic states as the basis configurations. PMID:21369567

  16. Theory of plasma contactors in ground-based experiments and low Earth orbit

    NASA Technical Reports Server (NTRS)

    Gerver, M. J.; Hastings, Daniel E.; Oberhardt, M. R.

    1990-01-01

    Previous theoretical work on plasma contactors as current collectors has fallen into two categories: collisionless double layer theory (describing space charge limited contactor clouds) and collisional quasineutral theory. Ground based experiments at low current are well explained by double layer theory, but this theory does not scale well to power generation by electrodynamic tethers in space, since very high anode potentials are needed to draw a substantial ambient electron current across the magnetic field in the absence of collisions (or effective collisions due to turbulence). Isotropic quasineutral models of contactor clouds, extending over a region where the effective collision frequency υ_e exceeds the electron cyclotron frequency ω_ce, have low anode potentials, but would collect very little ambient electron current, much less than the emitted ion current. A new model is presented, for an anisotropic contactor cloud oriented along the magnetic field, with υ_e less than ω_ce. The electron motion along the magnetic field is nearly collisionless, forming double layers in that direction, while across the magnetic field the electrons diffuse collisionally and the potential profile is determined by quasineutrality. Using a simplified expression for υ_e due to ion acoustic turbulence, an analytic solution has been found for this model, which should be applicable to current collection in space. The anode potential is low and the collected ambient electron current can be several times the emitted ion current.

  17. Adapting SAFT-γ perturbation theory to site-based molecular dynamics simulation. I. Homogeneous fluids.

    PubMed

    Ghobadi, Ahmadreza F; Elliott, J Richard

    2013-12-21

    In this work, we aim to develop a version of the Statistical Associating Fluid Theory (SAFT)-γ equation of state (EOS) that is compatible with united-atom force fields, rather than experimental data. We rely on the accuracy of the force fields to provide the relation to experimental data. Although our objective is a transferable theory of interfacial properties for soft and fused heteronuclear chains, we first clarify the details of the SAFT-γ approach in terms of site-based simulations for homogeneous fluids. We show that a direct comparison of Helmholtz free energy to molecular simulation, in the framework of a third order Weeks-Chandler-Andersen perturbation theory, leads to an EOS that takes force field parameters as input and reproduces simulation results for Vapor-Liquid Equilibria (VLE) calculations. For example, saturated liquid density and vapor pressure of n-alkanes ranging from methane to dodecane deviate from those of the Transferable Potential for Phase Equilibria (TraPPE) force field by about 0.8% and 4%, respectively. Similar agreement between simulation and theory is obtained for critical properties and second virial coefficient. The EOS also reproduces simulation data of mixtures with about 5% deviation in bubble point pressure. Extension to inhomogeneous systems and united-atom site types beyond those used in description of n-alkanes will be addressed in succeeding papers. PMID:24359349

  18. Perturbation theory based on the Variational Nodal Transport method in X-Y-Z geometry

    SciTech Connect

    Laurin-Kovitz, K.F.; Lewis, E.E.

    1995-07-01

    A perturbation method based on the Variational Nodal Method (VNM) of solving the neutron transport equation is developed for three-dimensional Cartesian geometry. The method utilizes the solution of the corresponding adjoint transport equation to calculate changes in the critical eigenvalue due to changes in cross sections. Both first order and exact perturbation theory expressions are derived. The adjoint solution algorithm has been formulated and incorporated into the VNM option of the Argonne National Laboratory DIF3D production code. The perturbation method is currently implemented as a post-processor to the VNM option of the DIF3D code. To demonstrate the efficacy of the method, example perturbations are applied to the Takeda Benchmark Model 1. In the first perturbation example, the thermal capture cross section is increased within the core region. For the second perturbation example, the increase in the thermal capture cross section is applied in the control rod region. The resulting changes in the critical eigenvalue are obtained by direct calculation in the VNM and compared to the change approximated by the first order and exact theory expressions from the perturbation method. Exact perturbation theory results are in excellent agreement with the actual eigenvalue differences calculated in the VNM. First order theory holds well for sufficiently small perturbations.
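
    The adjoint-based estimate at the heart of this method can be illustrated on a small matrix analogue. The sketch below (plain NumPy, not the VNM or DIF3D) perturbs a 2×2 operator and compares the first-order eigenvalue shift, computed from the unperturbed right eigenvector and adjoint (left) eigenvector, against direct recomputation:

```python
import numpy as np

def first_order_shift(A, dA):
    """First-order estimate of the shift in A's dominant eigenvalue under
    a perturbation dA, using the right eigenvector x and the adjoint
    (left) eigenvector y: d_lambda ~ <y, dA x> / <y, x>."""
    w, V = np.linalg.eig(A)
    i = np.argmax(w.real)
    lam, x = w[i], V[:, i]
    wl, U = np.linalg.eig(A.T)          # eigenvectors of the adjoint problem
    j = np.argmin(np.abs(wl - lam))
    y = U[:, j]
    return (y @ dA @ x) / (y @ x)

A = np.array([[4.0, 1.0], [2.0, 3.0]])           # eigenvalues 5 and 2
dA = 1e-3 * np.array([[1.0, 0.0], [0.0, -1.0]])  # small "cross-section" change

approx = float(np.real(first_order_shift(A, dA)))
exact = float(np.max(np.linalg.eigvals(A + dA).real)
              - np.max(np.linalg.eigvals(A).real))
print(approx, exact)   # first order agrees with direct recomputation to O(|dA|^2)
```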

  19. Adapting SAFT-γ perturbation theory to site-based molecular dynamics simulation. I. Homogeneous fluids

    SciTech Connect

    Ghobadi, Ahmadreza F.; Elliott, J. Richard

    2013-12-21

    In this work, we aim to develop a version of the Statistical Associating Fluid Theory (SAFT)-γ equation of state (EOS) that is compatible with united-atom force fields, rather than experimental data. We rely on the accuracy of the force fields to provide the relation to experimental data. Although our objective is a transferable theory of interfacial properties for soft and fused heteronuclear chains, we first clarify the details of the SAFT-γ approach in terms of site-based simulations for homogeneous fluids. We show that a direct comparison of Helmholtz free energy to molecular simulation, in the framework of a third order Weeks-Chandler-Andersen perturbation theory, leads to an EOS that takes force field parameters as input and reproduces simulation results for Vapor-Liquid Equilibria (VLE) calculations. For example, saturated liquid density and vapor pressure of n-alkanes ranging from methane to dodecane deviate from those of the Transferable Potential for Phase Equilibria (TraPPE) force field by about 0.8% and 4%, respectively. Similar agreement between simulation and theory is obtained for critical properties and second virial coefficient. The EOS also reproduces simulation data of mixtures with about 5% deviation in bubble point pressure. Extension to inhomogeneous systems and united-atom site types beyond those used in description of n-alkanes will be addressed in succeeding papers.

  20. Web-Based Learning Environment: A Theory-Based Design Process for Development and Evaluation

    ERIC Educational Resources Information Center

    Nam, Chang S.; Smith-Jackson, Tonya L.

    2007-01-01

    Web-based courses and programs have increasingly been developed by many academic institutions, organizations, and companies worldwide due to their benefits for both learners and educators. However, many of the developmental approaches lack two important considerations needed for implementing Web-based learning applications: (1) integration of the…

  2. A Research of Weapon System Storage Reliability Simulation Method Based on Fuzzy Theory

    NASA Astrophysics Data System (ADS)

    Shi, Yonggang; Wu, Xuguang; Chen, Haijian; Xu, Tingxue

    To address the storage reliability analysis of new, complicated weapon equipment systems, this paper investigates the methods of fuzzy fault tree analysis and fuzzy system storage reliability simulation, discusses the approach of treating the weapon system as a fuzzy system, and studies the storage reliability of the weapon system based on fuzzy theory, providing a storage reliability research method for new, complicated weapon equipment systems. As an example, the fuzzy fault tree of one type of missile control instrument is built up based on function analysis, and the fuzzy system storage reliability simulation method is used to analyze the storage reliability index of the control instrument.
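
    A minimal illustration of the fuzzy fault tree idea follows. The gate structure and the triangular fuzzy failure probabilities are invented for the sketch, not taken from the paper's missile control instrument tree:

```python
# Triangular fuzzy failure probabilities (low, mode, high) propagated
# through AND/OR gates component-wise (a standard fuzzy fault tree
# approximation for monotone gate operations).

def f_and(*events):
    out = (1.0, 1.0, 1.0)
    for e in events:
        out = tuple(a * b for a, b in zip(out, e))  # AND: product
    return out

def f_or(*events):
    out = (1.0, 1.0, 1.0)
    for e in events:
        out = tuple(a * (1.0 - b) for a, b in zip(out, e))
    return tuple(1.0 - a for a in out)              # OR: 1 - prod(1 - p)

# Hypothetical basic events for a storage failure (illustrative numbers)
seal_leak = (0.010, 0.020, 0.040)
corrosion = (0.005, 0.010, 0.020)
drift     = (0.020, 0.030, 0.050)

top = f_or(f_and(seal_leak, corrosion), drift)      # fuzzy top-event probability
print(tuple(round(p, 6) for p in top))
```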

  3. Designing a Project-based Learning in a University with New Theory of Learning

    NASA Astrophysics Data System (ADS)

    Mima, Noyuri

    The new theory of learning indicates that “learning” is a process of interaction that occurs in social relationships within a community containing a multitude of things beyond any single individual. From this point of view, “project-based learning” is one of the new methods of teaching and learning at university. The method of project-based learning includes team learning, team teaching, portfolio assessment, open space, and faculty development. This paper discusses the potential of the university to become a learning community with this method, along with results of the educational practice at Future University-Hakodate.

  4. Complexity theory and the "puzzling" competencies: Systems-based Practice And Practice-based Learning explored.

    PubMed

    Gonnering, Russell S

    2010-01-01

    Of all the clinical competencies, the least understood are Systems-Based Practice and Practice-Based Learning and Improvement. With a shift to competency-based education and evaluation across the spectrum of surgical education and practice, a clear understanding of the power and utility of each competency is paramount. Health care operates as a complex adaptive system, with dynamics foreign to many health care professionals and educators. The adaptation and evolution of such a system is related directly to both the individual and the organizational learning of the agents within the system and knowledge management strategies. Far from being "difficult," Systems-Based Practice and Practice-Based Learning form the heart of quality improvement initiatives and future productivity advances in health care. PMID:20656610

  5. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    NASA Astrophysics Data System (ADS)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that the wavelet-based extreme value theory increases predictive performance of financial forecasting according to number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
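
    The peaks-over-threshold step of an EVT value-at-risk estimate can be sketched as below. This toy version uses a plain empirical-quantile threshold and a method-of-moments GPD fit; the paper's contribution, the wavelet-based threshold, is not reproduced here:

```python
import numpy as np

def pot_var(losses, threshold_q=0.95, q=0.99):
    """VaR via peaks-over-threshold with a method-of-moments GPD fit.
    The threshold here is a plain empirical quantile; the paper replaces
    this step with a wavelet-based threshold."""
    losses = np.asarray(losses, float)
    u = np.quantile(losses, threshold_q)
    y = losses[losses > u] - u                  # excesses over the threshold
    m, s2 = y.mean(), y.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / s2)               # GPD shape (moment estimate)
    sigma = 0.5 * m * (1.0 + m * m / s2)        # GPD scale (moment estimate)
    n, n_u = len(losses), len(y)
    return u + sigma / xi * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=50_000)      # heavy-tailed toy "losses"
var99 = pot_var(losses)                         # 99% VaR estimate
print(round(var99, 3))
```

    For a Student-t(4) sample the estimate lands near the true 99% quantile (about 3.75), well beyond the 95% empirical quantile that served as the threshold.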

  6. Testing a Theory-Based Mobility Monitoring Protocol Using In-Home Sensors: A Feasibility Study

    PubMed Central

    Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J.

    2014-01-01

    Mobility is a key factor in the performance of many everyday tasks required for independent living as a person grows older. The purpose of this mixed methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assessing the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3 month and 6 month visits (examples: FES, GDS-SF, Mini-cog). Semi-structured interviews to characterize acceptability of the technology were conducted at 3 month and 6 month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation. PMID:23938159

  7. Promoting fruit and vegetable consumption. Testing an intervention based on the theory of planned behaviour.

    PubMed

    Kothe, E J; Mullan, B A; Butow, P

    2012-06-01

    This study evaluated the efficacy of a theory of planned behaviour (TPB) based intervention to increase fruit and vegetable consumption. The extent to which fruit and vegetable consumption and change in intake could be explained by the TPB was also examined. Participants were randomly assigned to two levels of intervention frequency matched for intervention content (low frequency n=92, high frequency n=102). Participants received TPB-based email messages designed to increase fruit and vegetable consumption; the messages targeted attitude, subjective norm and perceived behavioural control (PBC). Baseline and post-intervention measures of TPB variables and behaviour were collected. Across the entire study cohort, fruit and vegetable consumption increased by 0.83 servings/day between baseline and follow-up. Intention, attitude, subjective norm and PBC also increased (p<.05). The TPB successfully modelled fruit and vegetable consumption at both time points but not behaviour change. The increase of fruit and vegetable consumption is a promising preliminary finding for those primarily interested in increasing fruit and vegetable consumption. However, those interested in theory development may have concerns about the use of this model to explain behaviour change in this context. More high quality experimental tests of the theory are needed to confirm this result. PMID:22349778

  8. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960's, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.

  9. Optimal control theory for quantum-classical systems: Ehrenfest molecular dynamics based on time-dependent density-functional theory

    NASA Astrophysics Data System (ADS)

    Castro, A.; Gross, E. K. U.

    2014-01-01

    We derive the fundamental equations of an optimal control theory for systems containing both quantum electrons and classical ions. The system is modeled with Ehrenfest dynamics, a non-adiabatic variant of molecular dynamics. The general formulation, which requires the fully correlated many-electron wavefunction, can be simplified by making use of time-dependent density-functional theory. In this case, the optimal control equations require some modifications, which we provide. The abstract general formulation is complemented with the simple example of the H_2^+ molecule in the presence of a laser field.

  10. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.

    1992-01-01

    The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and sector-based approach are presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce the conservatism.

  11. A 3-D elasticity theory based model for acoustic radiation from multilayered anisotropic plates.

    PubMed

    Shen, C; Xin, F X; Lu, T J

    2014-05-01

    A theoretical model built upon three-dimensional elasticity theory is developed to investigate the acoustic radiation from multilayered anisotropic plates subjected to a harmonic point force excitation. Fourier transform technique and stationary phase method are combined to predict the far-field radiated sound pressure of one-side water immersed plate. Compared to equivalent single-layer plate models, the present model based on elasticity theory can differentiate radiated sound pressure between dry-side and wet-side excited cases, as well as discrepancies induced by different layer sequences for multilayered anisotropic plates. These results highlight the superiority of the present theoretical model especially for handling multilayered anisotropic structures. PMID:24815294

  12. Research on three-dimensional computer-generated holographic algorithm based on conformal geometry theory

    NASA Astrophysics Data System (ADS)

    Zhang, Yaping; Zhang, Jianqiang; Chen, Wei; Zhang, Jialing; Wang, Peng; Xu, Wei

    2013-11-01

    A novel three-dimensional computer-generated holographic algorithm based on conformal geometry theory is introduced. Firstly, the three-dimensional object is transformed into a corresponding two-dimensional image via a conformal transformation, making use of conformal geometry theory in mathematics, and then the Fresnel computer-generated hologram of the two-dimensional image is produced. During the reconstruction process, the three-dimensional reconstruction image of the original object is obtained by means of the conformal coordinate transformation. The research results show that the algorithm introduced in this paper runs fast and performs well, providing a good reference for research on three-dimensional computer-generated holographic algorithms.

  13. Dynamically Incremental K-means++ Clustering Algorithm Based on Fuzzy Rough Set Theory

    NASA Astrophysics Data System (ADS)

    Li, Wei; Wang, Rujing; Jia, Xiufang; Jiang, Qing

    Since the classic K-means++ clustering algorithm handles only static data, a dynamically incremental K-means++ clustering algorithm (DK-Means++) based on fuzzy rough set theory is presented in this paper. Firstly, in the DK-Means++ clustering algorithm, the similarity formula is improved with weights computed from the importance degree of attributes, which are reduced on the basis of rough fuzzy set theory. Secondly, new data need only be matched to a granule already clustered by the K-means++ algorithm, and the few unmatched new data are clustered by the classic K-means++ algorithm over the global data. In this way, re-clustering all data each time the data set grows is avoided, so clustering efficiency is improved. Our experiments show that the DK-Means++ algorithm can objectively and efficiently deal with the clustering of dynamically incremental data.
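
    The incremental idea can be sketched as follows: k-means++ seeding plus an online update that absorbs a new point into its nearest granule when it is close enough, and otherwise flags it for re-clustering. The fuzzy-rough attribute weighting of DK-Means++ is omitted; the radius and data are illustrative:

```python
import numpy as np

def kmeanspp_seeds(X, k, rng):
    """k-means++ seeding: each new seed is drawn with probability
    proportional to squared distance from the nearest existing seed."""
    seeds = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min([((X - s) ** 2).sum(axis=1) for s in seeds], axis=0)
        seeds.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.array(seeds)

def incremental_assign(centroids, counts, x, radius):
    """Simplified DK-Means++-style step (without fuzzy-rough weights):
    absorb x into the nearest granule if it is within `radius`,
    otherwise return -1 so the caller can re-run clustering."""
    d = np.linalg.norm(centroids - x, axis=1)
    i = int(np.argmin(d))
    if d[i] <= radius:
        counts[i] += 1
        centroids[i] += (x - centroids[i]) / counts[i]   # running mean update
        return i
    return -1

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
C = kmeanspp_seeds(X, 2, rng)
counts = np.ones(len(C))
near = incremental_assign(C, counts, C[0] + 0.1, radius=2.0)   # absorbed online
far = incremental_assign(C, counts, np.array([100.0, 100.0]), radius=2.0)
print(near, far)   # far point is flagged (-1) for re-clustering
```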

  14. Social judgment theory based model on opinion formation, polarization and evolution

    NASA Astrophysics Data System (ADS)

    Chau, H. F.; Wong, C. Y.; Chow, F. K.; Fung, Chi-Hang Fred

    2014-12-01

    The dynamical origin of opinion polarization in the real world is an interesting topic that physical scientists may help to understand. To properly model the dynamics, the theory must be fully compatible with findings by social psychologists on microscopic opinion change. Here we introduce a generic model of opinion formation with homogeneous agents based on the well-known social judgment theory in social psychology by extending a similar model proposed by Jager and Amblard. The agents’ opinions will eventually cluster around extreme and/or moderate opinions forming three phases in a two-dimensional parameter space that describes the microscopic opinion response of the agents. The dynamics of this model can be qualitatively understood by mean-field analysis. More importantly, first-order phase transition in opinion distribution is observed by evolving the system under a slow change in the system parameters, showing that punctuated equilibria in public opinion can occur even in a fully connected social network.
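
    A toy simulation in the spirit of the social judgment mechanism described above (assimilation below a latitude of acceptance, contrast beyond a latitude of rejection) might look like this; all parameter names and values are illustrative, not the authors':

```python
import numpy as np

def step(x, accept, reject, mu, rng):
    """One dyadic interaction: agent i reacts to agent j's opinion."""
    i, j = rng.choice(len(x), size=2, replace=False)
    d = x[j] - x[i]
    if abs(d) < accept:        # within latitude of acceptance: assimilate
        x[i] += mu * d
    elif abs(d) > reject:      # beyond latitude of rejection: contrast
        x[i] -= mu * d
    x[i] = np.clip(x[i], -1.0, 1.0)

rng = np.random.default_rng(2)
x = rng.uniform(-1.0, 1.0, size=200)      # initial opinions on [-1, 1]
for _ in range(100_000):
    step(x, accept=0.4, reject=1.2, mu=0.3, rng=rng)

# Opinions typically end up clustered at the extremes and/or the middle,
# mirroring the moderate/extreme phases described in the abstract.
print(round(float(x.min()), 2), round(float(x.max()), 2))
```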

  15. Study of Spectroscopic Properties of Diatomic Molecules Based on High Orders of the Operator Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Bekhtereva, E. S.; Litvinovskaya, A. G.; Konov, I. A.; Gromova, O. V.; Chertavskikh, Yu. V.; Tse, Yang Fang; Ulenikov, O. N.

    2015-08-01

    Based on the operator perturbation theory, we retrieve the form of the effective Hamiltonian of a quantum system with allowance for corrections of arbitrary order, for quantum-mechanical problems whose perturbation operator depends not only on the same coordinates as the operator of the zero approximation but also on an arbitrary set of other coordinates whose derivative operators may not commute with each other (the recurrence formulas for corrections of arbitrary order of the operator perturbation theory are presented in the paper in the most general form). The general results obtained allow the special features of the effective operators of any polyatomic molecule to be investigated. As a first step, an arbitrary diatomic molecule is investigated. Isotopic relations among different spectroscopic parameters are derived for the parent molecule and its various isotopic modifications.

  16. Slender-Body Theory Based On Approximate Solution of the Transonic Flow Equation

    NASA Technical Reports Server (NTRS)

    Spreiter, John R.; Alksne, Alberta Y.

    1959-01-01

    Approximate solutions of the nonlinear equations of the small disturbance theory of transonic flow are found for the pressure distribution on pointed slender bodies of revolution for flows with free-stream Mach number 1, and for flows that are either purely subsonic or purely supersonic. These results are obtained by application of a method based on local linearization that was introduced recently in the analysis of similar problems in two-dimensional flows. The theory is developed for bodies of arbitrary shape, and specific results are given for cone-cylinders and for parabolic-arc bodies at zero angle of attack. All results are compared either with existing theoretical results or with experimental data.

  17. A data mining algorithm based on the rough sets theory and BP neural network

    NASA Astrophysics Data System (ADS)

    Jiang, Weijin; Xu, Yusheng; Xu, Yuhui

    2005-10-01

    Since rough set theory and neural networks each have particular advantages and open problems in data mining, this paper presents a combined algorithm based on rough set theory and a BP neural network. The algorithm reduces data from the data warehouse using the reduction function of rough sets, and then passes the reduced data to the BP neural network as training data. Through this reduction, the training patterns become clearer and the scale of the neural network can be simplified; at the same time, the neural network eases the rough set's sensitivity to noisy data. The paper presents a cost function to express the relationship between the amount of training data and the precision of the neural network, and to supply a standard for the transition from rough set reduction to neural network training.
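
    The rough-set reduction step can be illustrated on a toy decision table (attribute names and rows invented here): compute the dependency degree gamma for the full attribute set, then identify attributes whose individual removal leaves gamma unchanged before handing the reduced table to the BP network:

```python
# Toy decision table: condition attributes -> binary fault decision.
rows = [
    {"temp": "hi", "press": "hi", "vib": "lo", "fault": 1},
    {"temp": "hi", "press": "lo", "vib": "lo", "fault": 0},
    {"temp": "lo", "press": "hi", "vib": "hi", "fault": 1},
    {"temp": "lo", "press": "lo", "vib": "hi", "fault": 0},
    {"temp": "lo", "press": "lo", "vib": "lo", "fault": 0},
]

def gamma(rows, attrs, decision):
    """Dependency degree: fraction of objects whose indiscernibility
    class under `attrs` is consistent on the decision attribute."""
    key = lambda r: tuple(r[a] for a in attrs)
    classes = {}
    for r in rows:
        classes.setdefault(key(r), set()).add(r[decision])
    return sum(len(classes[key(r)]) == 1 for r in rows) / len(rows)

full = ["temp", "press", "vib"]
base = gamma(rows, full, "fault")
redundant = [a for a in full
             if gamma(rows, [b for b in full if b != a], "fault") == base]
print(base, redundant)   # attributes droppable one at a time
```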

  18. [Mathematical exploration of essence of herbal properties based on "Three-Elements" theory].

    PubMed

    Jin, Rui; Zhao, Qian; Zhang, Bing

    2014-10-01

    Herbal property theory of traditional Chinese medicines is the theoretical guidance on authentication of medicinal plants, herborization, preparation of herbal medicines for decoction and clinical application, with important theoretical value and practical significance. Our research team proposed the "three-element" theory for herbal properties for the first time, conducted a study by using combined methods of philology, chemistry, pharmacology and mathematics, and then drew the research conclusion that herbal properties are defined as the chemical compositions-based comprehensive expression with complex and multi-level (positive/negative) biological effects in specific organism state. In this paper, researchers made a systematic mathematical analysis in four aspects--the correlation between herbal properties and chemical component factors, the correlation between herbal properties and organism state factor, the correlation between herbal properties and biological effect factor and the integration study of the three elements, proposed future outlook, and provided reference to mathematical studies and mathematical analysis of herbal properties. PMID:25751963

  19. Characterization of degeneration process in combustion instability based on dynamical systems theory

    NASA Astrophysics Data System (ADS)

    Gotoda, Hiroshi; Okuno, Yuta; Hayashi, Kenta; Tachibana, Shigeru

    2015-11-01

    We present a detailed study on the characterization of the degeneration process in combustion instability based on dynamical systems theory. We deal with combustion instability in a lean premixed-type gas-turbine model combustor, one of the fundamentally and practically important combustion systems. The dynamic behavior of combustion instability in close proximity to lean blowout is dominated by a stochastic process and transits to periodic oscillations created by thermoacoustic combustion oscillations via chaos with increasing equivalence ratio [Chaos 21, 013124 (2011), 10.1063/1.3563577; Chaos 22, 043128 (2012), 10.1063/1.4766589]. Thermoacoustic combustion oscillations degenerate with a further increase in the equivalence ratio, and the dynamic behavior leads to chaotic fluctuations via quasiperiodic oscillations. The concept of dynamical systems theory presented here allows us to clarify the nonlinear characteristics hidden in complex combustion dynamics.

  20. Characterization of degeneration process in combustion instability based on dynamical systems theory.

    PubMed

    Gotoda, Hiroshi; Okuno, Yuta; Hayashi, Kenta; Tachibana, Shigeru

    2015-11-01

    We present a detailed study on the characterization of the degeneration process in combustion instability based on dynamical systems theory. We deal with combustion instability in a lean premixed-type gas-turbine model combustor, one of the fundamentally and practically important combustion systems. The dynamic behavior of combustion instability in close proximity to lean blowout is dominated by a stochastic process and transits to periodic oscillations created by thermoacoustic combustion oscillations via chaos with increasing equivalence ratio [Chaos 21, 013124 (2011); Chaos 22, 043128 (2012)]. Thermoacoustic combustion oscillations degenerate with a further increase in the equivalence ratio, and the dynamic behavior leads to chaotic fluctuations via quasiperiodic oscillations. The concept of dynamical systems theory presented here allows us to clarify the nonlinear characteristics hidden in complex combustion dynamics. PMID:26651761

  1. Formula for the rms blur circle radius of Wolter telescope based on aberration theory

    NASA Technical Reports Server (NTRS)

    Shealy, David L.; Saha, Timo T.

    1990-01-01

    A formula for the rms blur circle for Wolter telescopes has been derived using the transverse ray aberration expressions of Saha (1985), Saha (1984), and Saha (1986). The resulting formula for the rms blur circle radius over an image plane and a formula for the surface of best focus based on third-, fifth-, and seventh-order aberration theory predict results in good agreement with exact ray tracing. It has also been shown that one of the two terms in the empirical formula of VanSpeybroeck and Chase (1972) for the rms blur circle radius of a Wolter I telescope can be justified by the aberration theory results. Numerical results are given comparing the rms blur radius and the surface of best focus vs the half-field angle computed by skew ray tracing and from analytical formulas for grazing incidence Wolter I-II telescopes and a normal incidence Cassegrain telescope.

  2. Control theory based airfoil design for potential flow and a finite volume discretization

    NASA Technical Reports Server (NTRS)

    Reuther, J.; Jameson, A.

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.
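
    The "computationally inexpensive gradient" idea can be illustrated with a discrete adjoint on a toy linear state equation (not a potential-flow solver): one extra linear solve yields the gradient with respect to all design variables at once, verified here against finite differences:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
K = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned "state" operator
target = rng.normal(size=n)

def f(a):                                     # right-hand side from design vars
    return np.sin(a)

def J(a):                                     # cost: misfit of the state u(a)
    u = np.linalg.solve(K, f(a))
    return 0.5 * np.sum((u - target) ** 2)

def grad_adjoint(a):
    u = np.linalg.solve(K, f(a))
    psi = np.linalg.solve(K.T, u - target)    # single adjoint solve
    return np.cos(a) * psi                    # dJ/da_i = f'(a_i) * psi_i

a = rng.normal(size=n)
g = grad_adjoint(a)
eps = 1e-6
g_fd = np.array([(J(a + eps * e) - J(a - eps * e)) / (2 * eps)
                 for e in np.eye(n)])
err = float(np.max(np.abs(g - g_fd)))
print(err)    # adjoint gradient agrees with finite differences
```

    The finite-difference check costs 2n cost evaluations (each a linear solve), while the adjoint gradient costs one forward and one adjoint solve regardless of n, which is the economy the design method exploits.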

  3. The Beginnings of a Quantum GEM Gravity Theory Based on Path Integrals

    NASA Astrophysics Data System (ADS)

    Brandenburg, John

    2009-03-01

    The GEM theory (Brandenburg, 2007) can be described in both its relativistically covariant and Newtonian forms. The GEM theory is an alloy of the Sakharov (1968) model of gravity as due to the quantum ZPF (Zero Point Fluctuation) and the Kaluza-Klein theory (Klein, 1926) 5th dimensional approach. The GEM is based on the postulate of gravity fields as arrays of E×B drifts or Poynting fields, and the postulate that both EM and gravity fields separated from each other with the formation of a 5th dimension coincidentally with the separation of protons and electrons. The Maxwell-Einstein equations coupled via a hydrogen plasma are recovered. A quantum theory of gravity is sketched out based on path integral methods. Gravity fields being transformable to EM fields and thus not subject to graviton high frequency instability, Planck length problems are solved by tachyon induced dark energy, to compensate for high mass densities. It is found that the classical General Relativity concept of ``no prior geometry'' does not survive the rigors of a physically reasonable quantization, and a spacetime that is locally ``flat'' even at short length scales results. Finally the physical meaning of the ``hidden'' fifth dimension is discussed, where the fifth dimension is identified physically as charge. This dimension is seen as allowing the appearance of matter from the vacuum. Also the equation for the important constant that accompanies this process, σ = (mp/me)^(1/2), where mp and me are the proton and electron masses respectively, and the formula σ ≅ ln σ(σ^(-1/2) + 1) + ln σ, will be briefly derived and discussed as allowing a lower action state for the universe.

  4. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    NASA Astrophysics Data System (ADS)

    Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.

    2012-01-01

    Understanding streamflow patterns in space and time is important for improving flood and drought forecasting, water resources management, and predictions of ecological changes. Objectives of this work include (a) to characterize the spatial and temporal patterns of streamflow using information theory-based measures at two thoroughly-monitored agricultural watersheds located in different hydroclimatic zones with similar land use, and (b) to elucidate and quantify temporal and spatial scale effects on those measures. We selected two USDA experimental watersheds to serve as case study examples: the Little River experimental watershed (LREW) in Tifton, Georgia and the Sleepers River experimental watershed (SREW) in North Danville, Vermont. Both watersheds possess several nested sub-watersheds and more than 30 years of continuous data records of precipitation and streamflow. Information content measures (metric entropy and mean information gain) and complexity measures (effective measure complexity and fluctuation complexity) were computed based on the binary encoding of 5-year streamflow and precipitation time series data. We quantified patterns of streamflow using probabilities of joint or sequential appearances of the binary symbol sequences. Results of our analysis illustrate that information content measures of streamflow time series are much smaller than those for precipitation data, and the streamflow data also exhibit higher complexity, suggesting that the watersheds effectively act as filters of the precipitation information, which leads to the observed additional complexity in streamflow measures. Correlation coefficients between the information-theory-based measures and time intervals are close to 0.9, demonstrating the significance of temporal scale effects on streamflow patterns. Moderate spatial scale effects on streamflow patterns are observed, with absolute values of correlation coefficients between the measures and sub-watershed area varying from 0.2 to 0.6 in the two watersheds. We conclude that temporal effects must be evaluated and accounted for when information theory-based methods are used for performance evaluation and comparison of hydrological models.
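The information-content measures named in this record can be computed directly from a binary-encoded time series. The sketch below is a minimal illustration of that pipeline; the median-threshold binarization and the word length of three symbols are our assumptions for demonstration, not choices documented in the paper:

```python
import math
from collections import Counter

def binarize(series):
    # Encode each value as 1 if above the (upper) median, else 0 --
    # one common encoding choice (an assumption here).
    m = sorted(series)[len(series) // 2]
    return [1 if x > m else 0 for x in series]

def metric_entropy(symbols, L=3):
    # Shannon entropy of overlapping L-symbol words, normalized per symbol.
    words = [tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1)]
    n = len(words)
    counts = Counter(words)
    H = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return H / L

def mean_information_gain(symbols, L=3):
    # Average information gained by extending a word by one symbol:
    # H(L+1-block) - H(L-block), a standard formulation of this measure.
    return metric_entropy(symbols, L + 1) * (L + 1) - metric_entropy(symbols, L) * L
```

For a constant sequence both measures are zero, and for a strictly alternating sequence the mean information gain is (near) zero because the next symbol is fully determined by the preceding word.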

  5. An Implicational View of Self-Healing and Personality Change Based on Gendlin's Theory of Experiencing.

    ERIC Educational Resources Information Center

    Bohart, Arthur C.

    There is relatively little theory on how psychotherapy clients self-heal, since most theories of therapy stress the magic of the therapist's interventions. Of the theories that exist, this paper briefly discusses Carl Rogers' theory of self-actualization and the dialectical theories of Greenberg and his colleagues, Jenkins, and Rychlak. Gendlin's…

  6. A boundary-continuous-displacement based Fourier analysis of laminated doubly-curved panels using classical shallow shell theories

    NASA Astrophysics Data System (ADS)

    Chaudhuri, Reaz A.; Kabir, Humayun R.

    1992-11-01

    A new methodology based on classical shallow shell theories is presented for solution to the static response and eigenvalue problems, involving a system of one fourth-order and two third-order highly coupled linear partial differential equations with the SS2-type simply supported boundary conditions. A comparison with solutions based on the first-order shear deformation theory made it possible to establish the upper limit of validity of the present classical lamination theory (CLT) based natural frequencies for angle-ply panels. Data obtained confirmed that introduction of transverse shear stress resultants into the two surface-parallel force equilibrium equations without concomitant changes in the kinematic relations constitutes little improvement over Donnell's or Sanders' shell theories, and all four classical shallow shell theories furnish virtually indistinguishable numerical results.

  7. Evaluating Art Studio Courses at Sultan Qaboos University in Light of the Discipline Based Art Education Theory

    ERIC Educational Resources Information Center

    Al-Amri, Mohammed

    2010-01-01

    Discipline-Based Art Education (DBAE), a theory developed in the USA, has been influential and is also used in Art Education institutions world-wide. One of its stated goals was to improve the quality of art teaching. Today, it is used as a theory for identifying and assessing good practices in the field of Art Education. The purpose of…

  8. Simulating Single Word Processing in the Classic Aphasia Syndromes Based on the Wernicke-Lichtheim-Geschwind Theory

    ERIC Educational Resources Information Center

    Weems, Scott A.; Reggia, James A.

    2006-01-01

    The Wernicke-Lichtheim-Geschwind (WLG) theory of the neurobiological basis of language is of great historical importance, and it continues to exert a substantial influence on most contemporary theories of language in spite of its widely recognized limitations. Here, we suggest that neurobiologically grounded computational models based on the WLG…

  9. A Bifactor Multidimensional Item Response Theory Model for Differential Item Functioning Analysis on Testlet-Based Items

    ERIC Educational Resources Information Center

    Fukuhara, Hirotaka; Kamata, Akihito

    2011-01-01

    A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…

  10. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    SciTech Connect

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
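As a rough illustration of the sampling-based strategy this record describes, the sketch below propagates interval-valued focal elements (boxes) through a black-box model by Monte Carlo sampling and then computes belief and plausibility for an exceedance question. The box/mass structure, the min/max bounding, and the toy model are all our assumptions, not the report's analysis:

```python
import random

def output_bounds(model, boxes, n_samples=200, seed=0):
    """Estimate the output range of `model` over each input focal element,
    where each focal element is a box: a list of (lo, hi) intervals."""
    rng = random.Random(seed)
    bounds = []
    for box in boxes:
        ys = [model([rng.uniform(lo, hi) for lo, hi in box])
              for _ in range(n_samples)]
        bounds.append((min(ys), max(ys)))
    return bounds

def belief_plausibility(bounds, masses, threshold):
    """Bel/Pl that the output exceeds `threshold`.
    Bel: mass of focal elements whose whole output range exceeds it;
    Pl: mass of focal elements whose output range touches the event."""
    bel = sum(m for (lo, hi), m in zip(bounds, masses) if lo > threshold)
    pl = sum(m for (lo, hi), m in zip(bounds, masses) if hi > threshold)
    return bel, pl
```

Belief and plausibility bracket the probability that would result from any single probability distribution consistent with the evidence, which is the sense in which evidence theory is less restrictive than probability theory.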

  11. The Conceptual Mechanism for Viable Organizational Learning Based on Complex System Theory and the Viable System Model

    ERIC Educational Resources Information Center

    Sung, Dia; You, Yeongmahn; Song, Ji Hoon

    2008-01-01

    The purpose of this research is to explore the possibility of viable learning organizations based on identifying viable organizational learning mechanisms. Two theoretical foundations, complex system theory and viable system theory, have been integrated to provide the rationale for building the sustainable organizational learning mechanism. The…

  12. A Bifactor Multidimensional Item Response Theory Model for Differential Item Functioning Analysis on Testlet-Based Items

    ERIC Educational Resources Information Center

    Fukuhara, Hirotaka; Kamata, Akihito

    2011-01-01

    A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…

  13. An Alienation-Based Framework for Student Experience in Higher Education: New Interpretations of Past Observations in Student Learning Theory

    ERIC Educational Resources Information Center

    Barnhardt, Bradford; Ginns, Paul

    2014-01-01

    This article orients a recently proposed alienation-based framework for student learning theory (SLT) to the empirical basis of the approaches to learning perspective. The proposed framework makes new macro-level interpretations of an established micro-level theory, across three levels of interpretation: (1) a context-free psychological state…

  14. Motivational Measure of the Instruction Compared: Instruction Based on the ARCS Motivation Theory vs Traditional Instruction in Blended Courses

    ERIC Educational Resources Information Center

    Colakoglu, Ozgur M.; Akdemir, Omur

    2012-01-01

    The ARCS Motivation Theory was proposed to guide instructional designers and teachers who develop their own instruction to integrate motivational design strategies into the instruction. There is a lack of literature supporting the idea that instruction for blended courses if designed based on the ARCS Motivation Theory provides different…

  15. The Effect of Training in Gifted Education on Elementary Classroom Teachers' Theory-Based Reasoning about the Concept of Giftedness

    ERIC Educational Resources Information Center

    Miller, Erin Morris

    2009-01-01

    Classroom teachers play an important role in the identification of gifted students through teacher recommendations and referrals. This study is an investigation of teachers' theories of giftedness using methods adapted from those used to study theory-based reasoning in categorization research. In general, the teachers in this study focused on…

  16. Volume Averaging Theory (VAT) based modeling and closure evaluation for fin-and-tube heat exchangers

    NASA Astrophysics Data System (ADS)

    Zhou, Feng; Catton, Ivan

    2012-10-01

    A fin-and-tube heat exchanger was modeled based on Volume Averaging Theory (VAT) in such a way that the details of the original structure were replaced by their averaged counterparts, so that the VAT-based governing equations can be efficiently solved for a wide range of parameters. To complete the VAT-based model, proper closure is needed, which is related to a local friction factor and a heat transfer coefficient of a Representative Elementary Volume (REV). The terms in the closure expressions are complex, and sometimes relating experimental data to the closure terms is difficult. In this work we use CFD to evaluate the rigorously derived closure terms over one of the selected REVs. The objective is to show how heat exchangers can be modeled as porous media and how CFD can be used in place of a detailed, often formidable, experimental effort to obtain closure for the model.

  17. Josephson current in Fe-based superconducting junctions: Theory and experiment

    NASA Astrophysics Data System (ADS)

    Burmistrova, A. V.; Devyatov, I. A.; Golubov, Alexander A.; Yada, Keiji; Tanaka, Yukio; Tortello, M.; Gonnelli, R. S.; Stepanov, V. A.; Ding, Xiaxin; Wen, Hai-Hu; Greene, L. H.

    2015-06-01

    We present a theory of the dc Josephson effect in contacts between Fe-based and spin-singlet s-wave superconductors. The method is based on the calculation of the temperature Green's function in the junction within the tight-binding model. We calculate the phase dependencies of the Josephson current for different orientations of the junction relative to the crystallographic axes of the Fe-based superconductor. Further, we consider the dependence of the Josephson current on the thickness of an insulating layer and on temperature. Experimental data for PbIn/Ba1-xKx(FeAs)2 point-contact Josephson junctions are consistent with theoretical predictions for s± symmetry of the order parameter in this material. The proposed method can be further applied to calculations of the dc Josephson current in contacts with other new unconventional multiorbital superconductors, such as Sr2RuO4 and the superconducting topological insulator CuxBi2Se3.

  18. Theory of pollution washdown based on theoretical and experimental scavenging studies

    SciTech Connect

    Walcek, C.J.

    1983-01-01

    A fundamental theory describing the absorption of trace gases into freely falling raindrops is presented. This theory is based on the convective diffusion equation and was modified or verified according to results of detailed experimental studies of SO2 absorption into raindrops, performed in the UCLA precipitation shaft. For drops with equivalent radii less than 500 μm, experimental results agreed favorably with theoretical predictions; for these smaller drops, steady-state convection and molecular diffusion appear to be the two dominant means of trace gas transport within a falling drop. Utilizing a modified theory that incorporated an internal ventilation effect, a mass-conserving model of pollution washdown by rainfall was developed. The model was tested on a Gaussian plume of SO2 at a height of 200 m, and sensitivity studies were performed to determine the important factors governing the rate of pollution washdown. It was found that the washout rate (% of trace gas in the layer removed by a millimeter of rain) was inversely proportional to the height, thickness, and peak concentration of the layer. Rainfall composed of larger drops (corresponding to a higher rainfall rate) was slightly less efficient at removing pollutant for the same amount of rain. The rate of sulfur oxidation within the rain was found to be one of the most important parameters governing the overall washout rate.

  19. Coal bed 3D modeling based on set theory and unparallel ATP

    NASA Astrophysics Data System (ADS)

    Zhu, Qingwei

    2008-10-01

    Although spatial objects of the real world are intrinsically three dimensional (3D), 3D data modeling and 3D data management have so far been neglected in spatial database systems and Geographical Information Systems, which map geometric data mainly to two-dimensional abstractions. Yet the third dimension is becoming increasingly relevant for many application domains, and large volumes of 3D data require treatment in a database context for efficient representation, querying, and manipulation. After detailed research on the mechanism of modeling geology bodies, a new composite data model is put forward based on the joining of set theory (ST for short) and Unparallel Analogical Triangular Prisms (UATP for short). In this model, a spatial entity is decomposed into five fundamental data types: 3D point (3DP), 3D line (3DL), 3D sample surface (3DSS), 3D surface (3DS), and 3D volume (3DV). Nine associated data structures are put forward: node, TIN edge, side edge, arc edge, TIN surface, sample surface, quadrangle, UATP, and 3DSVC. On this basis, a system for modeling and simulating spatial entities is designed, with a fault and a mining roadway presented as examples. This paper investigates the complex inherent features of 3D data and presents an abstract, formal data model called ST (Set Theory), comprising a set of three-dimensional spatial data types together with a collection of geometric set operations. The results show that the data model based on set theory and UATP can improve the speed and accuracy of the modeling process. The main contribution of this paper is the reconstruction of 3D geological models; other questions, such as topological relations and data volumes, remain key topics for further study.

  20. A Feature Extraction Method Based on Information Theory for Fault Diagnosis of Reciprocating Machinery

    PubMed Central

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021

  1. Allport's Intergroup Contact Theory as a theoretical base for impacting student attitudes in interprofessional education.

    PubMed

    Bridges, Diane R; Tomkowiak, John

    2010-01-01

    Interprofessional education has been defined as "members or students of two or more professions associated with health or social care, engaged in learning with, from and about each other". Ideally, students trained using interprofessional education paradigms become interprofessional team members who gain respect and improve their attitudes about each other, and ultimately improve patient outcomes. However, it has been stated that before interprofessional education can claim its importance and successes, its impact must be critically evaluated. What theory can explain the impact that interprofessional education seems to have on changing students' attitudes of other professionals and positively affecting their performance as interprofessional healthcare team members? The authors of this paper suggest conditions identified in Gordon Allport's Contact Theory may be used as a theoretical base in interprofessional education to positively impact attitudinal change of students towards working as an interprofessional team member. For the purpose of this paper, equal status and common goals will be the two conditions highlighted as a theoretical base in interprofessional education. The premise to be explored in this paper is that utilizing a sound theoretical base in interprofessional education may positively impact students' attitudes towards working in interprofessional teams. PMID:20216998

  2. A feature extraction method based on information theory for fault diagnosis of reciprocating machinery.

    PubMed

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021

  3. Removing barriers to rehabilitation: Theory-based family intervention in community settings after brain injury.

    PubMed

    Stejskal, Taryn M

    2012-01-01

    Rehabilitation professionals have become increasingly aware that family members play a critical role in the recovery process of individuals after brain injury. In addition, researchers have begun to identify a relationship between family member caregivers' well-being and survivors' outcomes. The idea of a continuum of care or following survivors from inpatient care to community reintegration has become an important model of treatment across many hospital and community-based settings. In concert with the continuum of care, present research literature indicates that family intervention may be a key component to successful rehabilitation after brain injury. Yet, clinicians interacting with family members and survivors often feel confounded about how exactly to intervene with the broader family system beyond the individual survivor. Drawing on the systemic nature of the field of marriage and family therapy (MFT), this article provides information to assist clinicians in effectively intervening with families using theory-based interventions in community settings. First, a rationale for the utilization of systems-based, as opposed to individual-based, therapies will be uncovered. Second, historically relevant publications focusing on family psychotherapy and intervention after brain injury are reviewed and their implications discussed. Recommendations for the utilization of systemic theory-based principles and strategies, specifically cognitive behavioral therapy (CBT), narrative therapy (NT), and solution-focused therapy (SFT) will be examined. Descriptions of common challenges families and couples face will be presented along with case examples to illustrate how these theoretical frameworks might be applied to these special concerns postinjury. 
Finally, the article concludes with an overview of the ideas presented in this manuscript to assist practitioners and systems of care in community-based settings to more effectively intervene with the family system as a whole after brain injury. PMID:22523015

  4. Superior coexistence: systematically regulating land subsidence based on set pair theory

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Gong, S.-L.

    2015-11-01

    Anthropogenic land subsidence is an environmental side effect of exploring and using natural resources in the process of economic development. The key points of the system for controlling land subsidence include cooperation and superior coexistence while the economy develops, exploring and using natural resources, and geological environmental safety. Using the theory and method of set pair analysis (SPA), this article anatomises the factors, effects, and transformation of land subsidence. Based on the principle of superior coexistence, this paper promotes a technical approach to the system for controlling land subsidence, in order to improve the prevention and control of geological hazards.

  5. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159
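The entropy weight step mentioned in this record can be sketched as follows. This is the generic textbook version of the method (criterion weights derived from the dispersion of a positive decision matrix), not the paper's specific bogie index system:

```python
import math

def entropy_weights(matrix):
    """Objective criterion weights from a decision matrix.
    Rows are alternatives, columns are criteria; entries assumed positive."""
    n, m = len(matrix), len(matrix[0])
    divergence = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]
        # Normalized Shannon entropy of the criterion; log base n makes e in [0, 1].
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        divergence.append(1.0 - e)  # higher divergence -> more informative criterion
    s = sum(divergence)
    return [d / s for d in divergence]
```

A criterion on which all alternatives score identically has maximum entropy and therefore zero weight, which is the method's rationale: only criteria that discriminate between alternatives contribute.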

  6. A rapid, universal TI-59 model-independent pharmacokinetic analysis program based on statistical moment theory.

    PubMed

    Reitberg, D P; Smith, I L; Love, S J; Lewin, H M; Schentag, J J

    1985-02-01

    A model-independent program for pharmacokinetic analyses based on statistical moment theory is presented and demonstrated. The program uses an inexpensive and portable TI-59; a PC-100A printer adds convenience but is optional. The program may be used in analysis of blood, serum, or plasma concentration vs. time curves originating from iv, im, po, sl, or sc administration. Drug input can be zero or first order; both single-dose and multiple-dose steady-state conditions can be evaluated. A comparison between results generated using moment analysis and traditional two-compartment nonlinear regression showed excellent agreement. PMID:3838276
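Although the original program ran on a TI-59, the moment calculations themselves are compact. A minimal sketch in Python, assuming the linear trapezoidal rule and no extrapolation beyond the last sample (both simplifications on our part; the published program's exact numerics are not reproduced here):

```python
def moments(t, c):
    """Non-compartmental moments from a concentration-time curve.
    t: sampling times, c: concentrations. Returns (AUC, AUMC, MRT)."""
    # Area under the curve by the linear trapezoidal rule.
    auc = sum((c[i] + c[i + 1]) / 2 * (t[i + 1] - t[i])
              for i in range(len(t) - 1))
    # Area under the first-moment curve (t * C versus t).
    aumc = sum((t[i] * c[i] + t[i + 1] * c[i + 1]) / 2 * (t[i + 1] - t[i])
               for i in range(len(t) - 1))
    return auc, aumc, aumc / auc  # mean residence time MRT = AUMC / AUC
```

For a constant concentration of 1 over 0-10 time units, AUC is 10, AUMC is 50, and the MRT is 5, the midpoint of the observation window.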

  7. The Experimental Research on E-Learning Instructional Design Model Based on Cognitive Flexibility Theory

    NASA Astrophysics Data System (ADS)

    Cao, Xianzhong; Wang, Feng; Zheng, Zhongmei

    The paper reports an educational experiment on an e-Learning instructional design model based on Cognitive Flexibility Theory; the experiment was designed to explore the feasibility and effectiveness of the model in promoting learning quality in ill-structured domains. The study performed the experiment on two groups of students: one group learned through a system designed with the model, and the other learned by the traditional method. The results indicate that e-Learning designed through the model helps promote students' intrinsic motivation, learning quality in ill-structured domains, ability to solve ill-structured problems, and creative thinking.

  8. Unique laminar-flow stability limit based shallow-water theory

    USGS Publications Warehouse

    Chen, Cheng-lung

    1993-01-01

    Two approaches are generally taken in deriving the stability limit of the Froude number (Fs) for laminar sheet flow. The first approach uses the Orr-Sommerfeld equation, while the second uses the cross-section-averaged equations of continuity and motion. Because both approaches are based on shallow-water theory, the values of Fs obtained from both approaches should be identical, yet in the literature they are not. This suggests that a defect exists in at least one of the two approaches. After examining the governing equations used in both approaches, one finds that the existing cross-section-averaged equation of motion is dependent on the frame of reference.

  9. The Stability Analysis for an Extended Car Following Model Based on Control Theory

    NASA Astrophysics Data System (ADS)

    Ge, Hong-Xia; Meng, Xiang-Pei; Zhu, Ke-Qiang; Cheng, Rong-Jun

    2014-08-01

    A new method is proposed to study the stability of the car-following model considering traffic interruption probability. The stability condition for the extended car-following model is obtained by using the Lyapunov function and the condition for no traffic jam is also given based on the control theory. Numerical simulations are conducted to demonstrate and verify the analytical results. Moreover, numerical simulations show that the traffic interruption probability has an influence on driving behavior and confirm the effectiveness of the method on the stability of traffic flow.

  10. Design optical antenna and fiber coupling system based on the vector theory of reflection and refraction.

    PubMed

    Jiang, Ping; Yang, Huajun; Mao, Shengqian

    2015-10-01

    A Cassegrain antenna system and an optical fiber coupling system consisting of a plano-concave lens and a plano-convex lens are designed based on the vector theory of reflection and refraction, so as to improve the transmission performance of the optical antenna and fiber coupling system. Three-dimensional ray-tracing simulations are performed, and the results of the optical aberration calculations and the experimental tests show that the aberrations caused by on-axial defocusing, off-axial defocusing, and deflection of the receiving antenna can be well corrected by the optical fiber coupling system. PMID:26480125

  11. Constraints on Neutron Star Radii Based on Chiral Effective Field Theory Interactions

    SciTech Connect

    Hebeler, K.; Lattimer, J. M.; Pethick, C. J.; Schwenk, A.

    2010-10-15

    We show that microscopic calculations based on chiral effective field theory interactions constrain the properties of neutron-rich matter below nuclear densities to a much higher degree than is reflected in commonly used equations of state. Combined with observed neutron star masses, our results lead to a radius R = 9.7-13.9 km for a 1.4 M☉ star, where the theoretical range is due, in about equal amounts, to uncertainties in many-body forces and to the extrapolation to high densities.

  12. Evaluation of a preschool nutrition education program based on the theory of multiple intelligences.

    PubMed

    Cason, K L

    2001-01-01

    This report describes the evaluation of a preschool nutrition education program based on the theory of multiple intelligences. Forty-six nutrition educators provided a series of 12 lessons to 6102 preschool-age children. The program was evaluated using a pretest/post-test design to assess differences in fruit and vegetable identification, healthy snack choices, willingness to taste foods, and eating behaviors. Subjects showed significant improvement in food identification and recognition, healthy snack identification, willingness to taste foods, and frequency of fruit, vegetable, meat, and dairy consumption. The evaluation indicates that the program was an effective approach for educating preschool children about nutrition. PMID:11953232

  13. A description of the mechanical behavior of composite solid propellants based on molecular theory

    NASA Technical Reports Server (NTRS)

    Landel, R. F.

    1976-01-01

    Both the investigation and the representation of the stress-strain response (including rupture) of gum and filled elastomers can be based on a simple functional statement. Internally consistent experiments are used to sort out the effects of time, temperature, strain and crosslink density on gum rubbers. All effects are readily correlated and shown to be essentially independent of the elastomer when considered in terms of non-dimensionalized stress, strain and time. A semiquantitative molecular theory is developed to explain this result. The introduction of fillers modifies the response, but, guided by the framework thus provided, their effects can be readily accounted for.

  14. The Study of Relationship and Strategy Between New Energy and Economic Development Based on Decoupling Theory

    NASA Astrophysics Data System (ADS)

    Liu, Jun; Xu, Hui; Liu, Yaping; Xu, Yang

    With the increasing pressure of energy conservation and emissions reduction, a new energy revolution in China is imminent. The implementation of electric energy substitution and cleaner alternatives is an important way to resolve the contradiction among economic growth, energy saving, and emission reduction. Based on decoupling theory, this article demonstrates that China is in the second stage, in which energy consumption and GDP grow together while energy consumption intensity declines. At the same time, the new energy revolution needs to be realized through increases in carbon productivity and in the proportion of new energy.
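Decoupling analyses of this kind typically rest on an elasticity index comparing growth rates of energy use and GDP. A minimal sketch follows; the Tapio-style formulation and the classification thresholds are conventional choices on our part, since the record does not specify the article's exact index:

```python
def decoupling_index(e0, e1, g0, g1):
    """Decoupling elasticity: relative change in energy consumption
    divided by relative change in GDP over the same period."""
    return ((e1 - e0) / e0) / ((g1 - g0) / g0)

def decoupling_state(index):
    """Coarse classification for a growing economy (conventional thresholds)."""
    if index < 0:
        return "strong decoupling"    # energy use falls while GDP grows
    if index < 0.8:
        return "weak decoupling"      # energy grows slower than GDP
    if index <= 1.2:
        return "expansive coupling"   # energy and GDP grow in step
    return "negative decoupling"      # energy grows faster than GDP
```

The "second stage" described in the abstract, with energy use and GDP both rising while energy intensity falls, corresponds to weak decoupling in this scheme.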

  15. Risk evaluation of bogie system based on extension theory and entropy weight method.

    PubMed

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles, yet rigorous practical evaluation of bogies remains a challenge; in practice, evaluation currently relies heavily on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately. PMID:25574159
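    For context, the entropy weight step mentioned above is a standard computation; a minimal sketch under the usual formulation (the index matrix below is hypothetical, not the paper's inspection data):

```python
import math

# Minimal sketch of the standard entropy weight method (illustrative only;
# the indicator values below are hypothetical, not the paper's data).

def entropy_weights(matrix):
    """matrix[i][j]: value of indicator j for sample i (all values > 0)."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)          # normalization constant so entropy is in [0, 1]
    divergences = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]                       # share of each sample
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)  # entropy of indicator j
        divergences.append(1.0 - e)                        # degree of divergence
    s = sum(divergences)
    return [d / s for d in divergences]                    # normalized weights

w = entropy_weights([[0.2, 0.9], [0.4, 0.8], [0.6, 0.7]])
```

    Indicators whose values vary more across samples carry more information and therefore receive larger weights.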

  16. Incompatibility of the observer-based vacuum with canonical quantum field theory

    SciTech Connect

    Letaw, J.R.; Pfautsch, J.D.

    1980-09-01

    Three new stationary coordinate systems in flat space-time, the last of six possible types, are described. The vacuum state found by canonical quantization of the scalar field in two of these systems is identical to that of Rindler coordinates, yet detectors at rest in these systems are excited in different amounts. A definition of the vacuum based on these detectors is, therefore, incompatible with canonical quantum field theory. The vacuum of the third system is identical to the Minkowski coordinate vacuum; thus, there are only two different vacua in flat space-time.

  17. Development of new tip-loss corrections based on vortex theory and vortex methods

    NASA Astrophysics Data System (ADS)

    Branlard, Emmanuel; Gaunaa, Mac

    2014-12-01

    A new analytical formulation of the tip-loss factor is established based on helical vortex filament solutions. The derived tip-loss factor can be applied to wind turbines, propellers, or other rotary wings. Similar numerical formulations are used to assess the influence of wake expansion on tip-losses. Theodorsen's theory is successfully applied for the first time to assess the wake expansion behind a wind turbine. The tip-loss corrections obtained are compared with those of Prandtl and Glauert and implemented within a new Blade Element Momentum (BEM) code. Wake expansion is seen to reduce tip-losses and to have a greater influence than wake distortion.
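    For reference, the classical Prandtl tip-loss factor against which the new corrections are benchmarked is commonly written F = (2/π) arccos(e^(−f)) with f = B(R−r)/(2r sin φ); a minimal sketch (parameter values below are illustrative):

```python
import math

# Classical Prandtl tip-loss factor: the baseline correction used in
# standard BEM codes (not the paper's new vortex-based formulation).
# B: number of blades, r: local radius, R: tip radius,
# phi: local inflow angle in radians.

def prandtl_tip_loss(B, r, R, phi):
    f = B * (R - r) / (2.0 * r * math.sin(phi))
    return (2.0 / math.pi) * math.acos(math.exp(-f))

# Illustrative evaluation near the tip of a 3-bladed rotor:
F = prandtl_tip_loss(B=3, r=0.95, R=1.0, phi=math.radians(5.0))
```

    The factor goes to zero at the tip (r = R) and approaches one inboard, which is the behavior any refined tip-loss correction must reproduce.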

  18. Spatial registration of digital brain atlases based on fuzzy set theory.

    PubMed

    Berks, G; Ghassemi, A; von Keyserlingk, D G

    2001-01-01

    We present a semiautomatic method based on fuzzy set theory for adjusting a computerized brain atlas to magnetic resonance images (MRIs) of the human cerebral cortex. The atlas was registered to three-dimensional MRI data sets of 10 healthy volunteers. After a global matching using the external contour of the brain, several local procedures were performed regarding selected primary furrows and cytoarchitectonic areas. The final transformation matrix was calculated with respect to these anatomical structures and to their local matrices. Evaluation revealed an increase in accuracy as expressed by a reduction of the visible mismatch with respect to the registration of cortical and subcortical brain structures. PMID:11120403

  19. A simple laminate theory using the orthotropic viscoplasticity theory based on overstress. I - In-plane stress-strain relationships for metal matrix composites

    NASA Technical Reports Server (NTRS)

    Krempl, Erhard; Hong, Bor Zen

    1989-01-01

    A macromechanics analysis is presented for the in-plane, anisotropic time-dependent behavior of metal matrix laminates. The small deformation, orthotropic viscoplasticity theory based on overstress represents lamina behavior in a modified simple laminate theory. Material functions and constants can be identified in principle from experiments with laminae. Orthotropic invariants can be repositories for tension-compression asymmetry and for linear elasticity in one direction while the other directions behave in a viscoplastic manner. Computer programs are generated and tested for either unidirectional or symmetric laminates under in-plane loading. Correlations with the experimental results on metal matrix composites are presented.

  20. The Advancement of Family Therapy Theory Based on the Science of Self-Organizing Complex Systems.

    NASA Astrophysics Data System (ADS)

    Ramsey-Kemper, Valerie Ann

    1995-01-01

    Problem. The purpose of this study was to review the literature that presents the latest advancements in the field of family therapy theory. Because such advancement has relied on scientific developments in the study of autopoietic self-organizing complex systems, the review began with a historical overview of the development of these natural scientific concepts. The study then examined how the latest scientific concepts have been integrated with family therapy practice. The document is built on the theory that individuals are living, complex, self-organizing, autopoietic systems. When individual systems interact with other individual systems (such as in family interaction, or in interaction between therapist and client), a third system emerges, which is the relationship. It is through interaction in the relationship that transformation of an individual system can occur. Method. The historical antecedents of the field of family therapy were outlined. It was demonstrated, via literature review, that the field of family therapy has traditionally paralleled developments in the hard sciences. Further, it was demonstrated that the newest understandings of the development of individuals, family systems, and therapeutic systems also parallel recent natural science developments, namely those based on the science of self-organizing complex systems. Outcome. The results of the study are twofold. First, the study articulates an expanded theory of the therapist, individual, and family as autopoietic self-organizing complex systems. Second, the study provides an expanded hypothesis concerning recommendations for future research that will further advance the latest theories of family therapy. More precisely, the expanded hypothesis suggests that qualitative research, rather than quantitative research, is the method of choice for studying the effectiveness of phenomenological therapy.

  1. Philosophy of the Spike: Rate-Based vs. Spike-Based Theories of the Brain

    PubMed Central

    Brette, Romain

    2015-01-01

    Does the brain use a firing rate code or a spike timing code? Considering this controversial question from an epistemological perspective, I argue that progress has been hampered by its problematic phrasing. It takes the perspective of an external observer looking at whether those two observables vary with stimuli, and thereby misses the relevant question: which one has a causal role in neural activity? When rephrased in a more meaningful way, the rate-based view appears as an ad hoc methodological postulate, one that is practical but with virtually no empirical or theoretical support. PMID:26617496

  2. Philosophy of the Spike: Rate-Based vs. Spike-Based Theories of the Brain.

    PubMed

    Brette, Romain

    2015-01-01

    Does the brain use a firing rate code or a spike timing code? Considering this controversial question from an epistemological perspective, I argue that progress has been hampered by its problematic phrasing. It takes the perspective of an external observer looking at whether those two observables vary with stimuli, and thereby misses the relevant question: which one has a causal role in neural activity? When rephrased in a more meaningful way, the rate-based view appears as an ad hoc methodological postulate, one that is practical but with virtually no empirical or theoretical support. PMID:26617496

  3. A Monte Carlo exploration of threefold base geometries for 4d F-theory vacua

    NASA Astrophysics Data System (ADS)

    Taylor, Washington; Wang, Yi-Nan

    2016-01-01

    We use Monte Carlo methods to explore the set of toric threefold bases that support elliptic Calabi-Yau fourfolds for F-theory compactifications to four dimensions, and study the distribution of geometrically non-Higgsable gauge groups, matter, and quiver structure. We estimate the number of distinct threefold bases in the connected set studied to be ~10^48. The distribution of bases peaks around h^{1,1} ~ 82. All bases encountered after "thermalization" have some geometric non-Higgsable structure. We find that the number of non-Higgsable gauge group factors grows roughly linearly in h^{1,1} of the threefold base. Typical bases have ~6 isolated gauge factors as well as several larger connected clusters of gauge factors with jointly charged matter. Approximately 76% of the bases sampled contain connected two-factor gauge group products of the form SU(3) × SU(2), which may act as the non-Abelian part of the standard model gauge group. SU(3) × SU(2) is the third most common connected two-factor product group, following SU(2) × SU(2) and G_2 × SU(2), which arise more frequently.

  4. A theory-based evaluation of a community-based funding scheme in a disadvantaged suburban city area.

    PubMed

    Hickey, Gráinne; McGilloway, Sinead; O'Brien, Morgan; Leckey, Yvonne; Devlin, Maurice

    2015-10-01

    Community-driven development (CDD) initiatives frequently involve funding schemes which are aimed at channelling financial investment into local need and fostering community participation and engagement. This exploratory study examined, through a program theory approach, the design and implementation of a small-scale, community-based fund in Ireland. Observations, documentary analysis, interviews and group discussions with 19 participants were utilized to develop a detailed understanding of the program mechanisms, activities and processes, as well as the experiences of key stakeholders engaged with the funding scheme and its implementation. The findings showed that there were positive perceptions of the scheme and its function within the community. Overall, the availability of funding was perceived by key stakeholders as being beneficial. However, there were concerns over the accessibility of the scheme for more marginalized members of the community, as well as dissatisfaction with the openness and transparency surrounding funding eligibility. Lessons for the implementation of small-scale CDD funds are elaborated and the utility of program theory approaches for evaluators and planners working with programs that fund community-based initiatives is outlined. PMID:25933408

  5. A general theory of evolution based on energy efficiency: its implications for diseases.

    PubMed

    Yun, Anthony J; Lee, Patrick Y; Doux, John D; Conley, Buford R

    2006-01-01

    We propose a general theory of evolution based on energy efficiency. Life represents an emergent property of energy. The earth receives energy from cosmic sources such as the sun. Biologic life can be characterized by the conversion of available energy into complex systems. Direct energy converters such as photosynthetic microorganisms and plants transform light energy into high-energy phosphate bonds that fuel biochemical work. Indirect converters such as herbivores and carnivores predominantly feed off the food chain supplied by these direct converters. Improving energy efficiency confers competitive advantage in the contest among organisms for energy. We introduce a term, return on energy (ROE), as a measure of energy efficiency. We define ROE as a ratio of the amount of energy acquired by a system to the amount of energy consumed to generate that gain. Life-death cycling represents a tactic to sample the environment for innovations that allow increases in ROE to develop over generations rather than an individual lifespan. However, the variation-selection stratagem of Darwinian evolution may define a particular tactic rather than an overarching biological paradigm. A theory of evolution based on competition for energy and driven by improvements in ROE both encompasses prior notions of evolution and portends post-Darwinian mechanisms. Such processes may involve the exchange of non-genetic traits that improve ROE, as exemplified by cognitive adaptations or memes. Under these circumstances, indefinite persistence may become favored over life-death cycling, as increases in ROE may then occur more efficiently within a single lifespan rather than over multiple generations. The key to this transition may involve novel methods to address the promotion of health and cognitive plasticity. We describe the implications of this theory for human diseases. PMID:16122878

  6. Theory of chemical kinetics and charge transfer based on nonequilibrium thermodynamics.

    PubMed

    Bazant, Martin Z

    2013-05-21

    Advances in the fields of catalysis and electrochemical energy conversion often involve nanoparticles, which can have kinetics surprisingly different from the bulk material. Classical theories of chemical kinetics assume independent reactions in dilute solutions, whose rates are determined by mean concentrations. In condensed matter, strong interactions alter chemical activities and create variations that can dramatically affect the reaction rate. The extreme case is that of a reaction coupled to a phase transformation, whose kinetics must depend not only on the order parameter but also on its gradients at phase boundaries. Reaction-driven phase transformations are common in electrochemistry, when charge transfer is accompanied by ion intercalation or deposition in a solid phase. Examples abound in Li-ion, metal-air, and lead-acid batteries, as well as metal electrodeposition-dissolution. Despite complex thermodynamics, however, the standard kinetic model is the Butler-Volmer equation, based on a dilute solution approximation. The Marcus theory of charge transfer likewise considers isolated reactants and neglects elastic stress, configurational entropy, and other nonidealities in condensed phases. The limitations of existing theories recently became apparent for the Li-ion battery material LixFePO4 (LFP). It has a strong tendency to separate into Li-rich and Li-poor solid phases, which scientists believe limits its performance. Chemists first modeled phase separation in LFP as an isotropic "shrinking core" within each particle, but experiments later revealed striped phase boundaries on the active crystal facet. This raised the question: What is the reaction rate at a surface undergoing a phase transformation? Meanwhile, dramatic rate enhancement was attained with LFP nanoparticles, and classical battery models could not predict the roles of phase separation and surface modification. 
In this Account, I present a general theory of chemical kinetics, developed over the past 7 years, which is capable of answering these questions. The reaction rate is a nonlinear function of the thermodynamic driving force, the free energy of reaction, expressed in terms of variational chemical potentials. The theory unifies and extends the Cahn-Hilliard and Allen-Cahn equations through a master equation for nonequilibrium chemical thermodynamics. For electrochemistry, I have also generalized both Marcus and Butler-Volmer kinetics for concentrated solutions and ionic solids. This new theory provides a quantitative description of LFP phase behavior. Concentration gradients and elastic coherency strain enhance the intercalation rate. At low currents, the charge-transfer rate is focused on exposed phase boundaries, which propagate as "intercalation waves", nucleated by surface wetting. Unexpectedly, homogeneous reactions are favored above a critical current and below a critical size, which helps to explain the rate capability of LFP nanoparticles. Contrary to other mechanisms, elevated temperatures and currents may enhance battery performance and lifetime by suppressing phase separation. The theory has also been extended to porous electrodes and could be used for battery engineering with multiphase active materials. More broadly, the theory describes nonequilibrium chemical systems at mesoscopic length and time scales, beyond the reach of molecular simulations and bulk continuum models. The reaction rate is consistently defined for inhomogeneous, nonequilibrium states, for example, with phase separation, large electric fields, or mechanical stresses. This research is also potentially applicable to fluid extraction from nanoporous solids, pattern formation in electrophoretic deposition, and electrochemical dynamics in biological cells. PMID:23520980
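    For reference, the classical Butler-Volmer equation discussed above, in its standard dilute-solution form (current density $i$, exchange current density $i_0$, anodic and cathodic transfer coefficients $\alpha_a, \alpha_c$, overpotential $\eta$):

```latex
i = i_0 \left[ \exp\!\left( \frac{\alpha_a F \eta}{RT} \right) - \exp\!\left( -\frac{\alpha_c F \eta}{RT} \right) \right]
```

    The generalization described in the abstract replaces the dilute-solution activities implicit in $i_0$ with variational chemical potentials appropriate to concentrated solutions and ionic solids.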

  7. Non-Markovian theories based on a decomposition of the spectral density.

    PubMed

    Kleinekathöfer, Ulrich

    2004-08-01

    For the description of dynamical effects in quantum mechanical systems on ultrashort time scales, memory effects play an important role. Meier and Tannor [J. Chem. Phys. 111, 3365 (1999)] developed an approach which is based on a time-nonlocal scheme employing a numerical decomposition of the spectral density. Here we propose two different approaches which are based on a partial time-ordering prescription, i.e., a time-local formalism, and also on a numerical decomposition of the spectral density. In special cases such as the Debye spectral density, the present scheme can be employed even without the numerical decomposition of the spectral density. One of the proposed schemes is valid for time-independent Hamiltonians and can be given in a compact quantum master equation. In the case of time-dependent Hamiltonians one has to introduce auxiliary operators which have to be propagated in time along with the density matrix. For the example of a damped harmonic oscillator these non-Markovian theories are compared with one another, to the Markovian limit neglecting memory effects and time dependencies, and to exact path integral calculations. Good agreement between the exact calculations and the non-Markovian results is obtained. Some of the non-Markovian theories mentioned above treat the time dependence in the system Hamiltonians nonperturbatively. Therefore these methods can be used for the simulation of experiments with arbitrarily large laser fields. PMID:15281847
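    For context, the Debye spectral density mentioned as a special case is commonly written in terms of a coupling (reorganization) strength $\lambda$ and a cutoff frequency $\gamma$:

```latex
J(\omega) = 2\lambda \, \frac{\omega \gamma}{\omega^2 + \gamma^2}
```

    Its bath correlation function can be expanded in a small number of exponential terms, which is why a scheme of this type can dispense with the numerical decomposition in this case.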

  8. Risk Assessment and Hierarchical Risk Management of Enterprises in Chemical Industrial Parks Based on Catastrophe Theory

    PubMed Central

    Chen, Yu; Song, Guobao; Yang, Fenglin; Zhang, Shushen; Zhang, Yun; Liu, Zhenyu

    2012-01-01

    According to risk systems theory and the characteristics of the chemical industry, an index system was established for risk assessment of enterprises in chemical industrial parks (CIPs) based on the inherent risk of the source, effectiveness of the prevention and control mechanism, and vulnerability of the receptor. A comprehensive risk assessment method based on catastrophe theory was then proposed and used to analyze the risk levels of ten major chemical enterprises in the Songmu Island CIP, China. According to the principle of equal distribution function, the chemical enterprise risk level was divided into the following five levels: 1.0 (very safe), 0.8 (safe), 0.6 (generally recognized as safe, GRAS), 0.4 (unsafe), 0.2 (very unsafe). The results revealed five enterprises (50%) with an unsafe risk level, and another five enterprises (50%) at the generally recognized as safe risk level. This method solves the multi-objective evaluation and decision-making problem. Additionally, this method involves simple calculations and provides an effective technique for risk assessment and hierarchical risk management of enterprises in CIPs. PMID:23208298
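    The comprehensive ranking step in catastrophe-theory evaluation typically uses the normalization formulas of the elementary catastrophes; a minimal illustrative sketch (the formulas are the standard ones, but the indicator values and grouping below are hypothetical, not the authors' index system):

```python
# Illustrative sketch of catastrophe-theory comprehensive evaluation using the
# standard normalization formulas of the elementary catastrophes. This is not
# the authors' implementation; inputs are hypothetical scores in [0, 1],
# ordered from most to least important control variable.

def cusp(a, b):
    """Cusp catastrophe: two control variables, non-complementary (min) rule."""
    return min(a ** (1 / 2), b ** (1 / 3))

def swallowtail(a, b, c):
    """Swallowtail catastrophe: three control variables."""
    return min(a ** (1 / 2), b ** (1 / 3), c ** (1 / 4))

def butterfly(a, b, c, d):
    """Butterfly catastrophe: four control variables."""
    return min(a ** (1 / 2), b ** (1 / 3), c ** (1 / 4), d ** (1 / 5))

# Hypothetical bottom-up aggregation: subsystem score, then overall score.
source_risk = cusp(0.81, 0.729)               # e.g. inherent risk of the source
overall = swallowtail(source_risk, 0.6, 0.5)  # combined with two other subsystems
```

    The resulting overall value in [0, 1] is then mapped onto the five equally spaced risk levels (1.0, 0.8, 0.6, 0.4, 0.2) described in the abstract.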

  9. Risk assessment and hierarchical risk management of enterprises in chemical industrial parks based on catastrophe theory.

    PubMed

    Chen, Yu; Song, Guobao; Yang, Fenglin; Zhang, Shushen; Zhang, Yun; Liu, Zhenyu

    2012-12-01

    According to risk systems theory and the characteristics of the chemical industry, an index system was established for risk assessment of enterprises in chemical industrial parks (CIPs) based on the inherent risk of the source, effectiveness of the prevention and control mechanism, and vulnerability of the receptor. A comprehensive risk assessment method based on catastrophe theory was then proposed and used to analyze the risk levels of ten major chemical enterprises in the Songmu Island CIP, China. According to the principle of equal distribution function, the chemical enterprise risk level was divided into the following five levels: 1.0 (very safe), 0.8 (safe), 0.6 (generally recognized as safe, GRAS), 0.4 (unsafe), 0.2 (very unsafe). The results revealed five enterprises (50%) with an unsafe risk level, and another five enterprises (50%) at the generally recognized as safe risk level. This method solves the multi-objective evaluation and decision-making problem. Additionally, this method involves simple calculations and provides an effective technique for risk assessment and hierarchical risk management of enterprises in CIPs. PMID:23208298

  10. Massive Yang-Mills theory based on the nonlinearly realized gauge group

    SciTech Connect

    Bettinelli, D.; Ferrari, R.; Quadri, A.

    2008-02-15

    We propose a subtraction scheme for a massive Yang-Mills theory realized via a nonlinear representation of the gauge group [here SU(2)]. It is based on the subtraction of the poles in D-4 of the amplitudes, in dimensional regularization, after a suitable normalization has been performed. Perturbation theory is in the number of loops, and the procedure is stable under iterative subtraction of the poles. The unphysical Goldstone bosons, the Faddeev-Popov ghosts, and the unphysical mode of the gauge field are expected to cancel out in the unitarity equation. The spontaneous symmetry breaking parameter is not a physical variable. We use the tools already tested in the nonlinear sigma model: hierarchy in the number of Goldstone boson legs and weak-power-counting property (finite number of independent divergent amplitudes at each order). It is intriguing that the model is naturally based on the symmetry SU(2)_L local × SU(2)_R global. By construction the physical amplitudes depend on the mass and on the self-coupling constant of the gauge particle and moreover on the scale parameter of the radiative corrections. The Feynman rules are in the Landau gauge.

  11. Adapting evidence-based interventions using a common theory, practices, and principles.

    PubMed

    Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D

    2014-01-01

    Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed. PMID:24079747

  12. Looking to the future of new media in health marketing: deriving propositions based on traditional theories.

    PubMed

    Della, Lindsay J; Eroglu, Dogan; Bernhardt, Jay M; Edgerton, Erin; Nall, Janice

    2008-01-01

    Market trend data show that the media marketplace continues to rapidly evolve. Recent research shows that substantial portions of the U.S. media population are "new media" users. Today, more than ever before, media consumers are exposed to multiple media at the same point in time, encouraged to participate in media content generation, and challenged to learn, access, and use the new media that are continually entering the market. These media trends have strong implications for how consumers of health information access, process, and retain health-related knowledge. In this article we review traditional information processing models and theories of interpersonal and mass media access and consumption. We make several theory-based propositions for how traditional information processing and media consumption concepts will function as new media usage continues to increase. These propositions are supported by new media usage data from the Centers for Disease Control and Prevention's entry into the new media market (e.g., podcasting, virtual events, blogging, and webinars). Based on these propositions, we conclude by presenting both opportunities and challenges that public health communicators and marketers will face in the future. PMID:18935883

  13. Credibility theory based dynamic control bound optimization for reservoir flood limited water level

    NASA Astrophysics Data System (ADS)

    Jiang, Zhiqiang; Sun, Ping; Ji, Changming; Zhou, Jianzhong

    2015-10-01

    The dynamic control operation of a reservoir's flood limited water level (FLWL) can resolve the contradiction between flood control and beneficial operation, and it is an important measure for ensuring flood-control security and realizing flood utilization. The dynamic control bound of the FLWL is a fundamental element in implementing reservoir dynamic control operation. In order to optimize the dynamic control bound of the FLWL while accounting for flood forecasting error, this paper took the forecasting error as a fuzzy variable and described it with credibility theory, which has emerged in recent years. By combining it with a flood forecasting error quantitative model, a credibility-based fuzzy chance constrained model for optimizing the dynamic control bound was proposed, and fuzzy simulation technology was used to solve the model. The FENGTAN reservoir in China was selected as a case study, and the results show that, compared with the original operation water level, the initial operation water level (IOWL) of the FENGTAN reservoir can be raised 4 m, 2 m, and 5.5 m, respectively, in the three division stages of the flood season, without increasing flood control risk. In addition, the rationality and feasibility of the proposed forecasting error quantitative model and the credibility-based dynamic control bound optimization model are verified by the calculation results of extreme risk theory.
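    As a concrete illustration of the credibility measure underlying such chance constraints, here is a minimal sketch for a triangular fuzzy variable (the triangular form and parameters are assumptions for illustration; the paper's forecasting error model may differ):

```python
# Credibility measure Cr = (Pos + Nec) / 2 of the event {xi <= x} for a
# triangular fuzzy variable xi = (a, b, c), a < b < c. Illustrative only;
# not the paper's forecasting error quantitative model.

def credibility_leq(x, a, b, c):
    # Possibility of {xi <= x}: how plausible the event is at best.
    if x < a:
        pos = 0.0
    elif x < b:
        pos = (x - a) / (b - a)
    else:
        pos = 1.0
    # Necessity of {xi <= x} = 1 - possibility of the complement {xi > x}.
    if x < b:
        nec = 0.0
    elif x < c:
        nec = (x - b) / (c - b)
    else:
        nec = 1.0
    return 0.5 * (pos + nec)
```

    A credibility-based chance constraint then requires Cr{constraint satisfied} to exceed a prescribed confidence level, which fuzzy simulation estimates when no closed form exists.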

  14. Dissemination of a theory-based online bone health program: Two intervention approaches.

    PubMed

    Nahm, Eun-Shim; Resnick, Barbara; Bellantoni, Michele; Zhu, Shijun; Brown, Clayton; Brennan, Patricia F; Charters, Kathleen; Brown, Jeanine; Rietschel, Matthew; Pinna, Joanne; An, Minjeong; Park, Bu Kyung; Plummer, Lisa

    2015-06-01

    With the increasing nationwide emphasis on eHealth, there has been a rapid growth in the use of the Internet to deliver health promotion interventions. Although there has been a great deal of research in this field, little information is available regarding the methodologies to develop and implement effective online interventions. This article describes two social cognitive theory-based online health behavior interventions used in a large-scale dissemination study (N = 866), their implementation processes, and the lessons learned during the implementation processes. The two interventions were a short-term (8-week) intensive online Bone Power program and a longer term (12-month) Bone Power Plus program, including the Bone Power program followed by a 10-month online booster intervention (biweekly eHealth newsletters). This study used a small-group approach (32 intervention groups), and to effectively manage those groups, an eLearning management program was used as an upper layer of the Web intervention. Both interventions were implemented successfully with high retention rates (80.7% at 18 months). The theory-based approaches and the online infrastructure used in this study showed a promising potential as an effective platform for online behavior studies. Further replication studies with different samples and settings are needed to validate the utility of this intervention structure. PMID:26021668

  15. Coding theory based models for protein translation initiation in prokaryotic organisms.

    SciTech Connect

    May, Elebeoba Eni; Bitzer, Donald L. (North Carolina State University, Raleigh, NC); Rosnick, David I. (North Carolina State University, Raleigh, NC); Vouk, Mladen A.

    2003-03-01

    Our research explores the feasibility of using communication theory, error control (EC) coding theory specifically, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored), encoded signal and the ribosome as a minimum Hamming distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomical relation to E. coli, including Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the organism most closely related taxonomically to E. coli, resembled that of the non-translated sequence group.
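    A minimal sketch of the minimum Hamming distance decoding idea (the sliding-window scoring is the general technique; the codewords below are hypothetical stand-ins for the 16S rRNA-derived codebook):

```python
# Illustrative sketch (not the authors' model): score a 5' leader sequence by
# the minimum Hamming distance of each sliding window to the nearest codeword.
# Low-distance regions flag Shine-Dalgarno-like, translated sequences.

def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

def min_distance_profile(seq, codebook, k):
    """Minimum Hamming distance of every k-mer window to the nearest codeword."""
    return [
        min(hamming(seq[i:i + k], cw) for cw in codebook)
        for i in range(len(seq) - k + 1)
    ]

# Hypothetical codewords standing in for the 16S rRNA-derived codebook:
codebook = ["AGGAGG", "GGAGGT"]
seq = "UUAGGAGGUUAA".replace("U", "T")  # toy leader sequence, RNA written as DNA
profile = min_distance_profile(seq, codebook, 6)
```

    Windows overlapping the Shine-Dalgarno-like motif score zero, mimicking how the model separates translated leaders from non-translated genomic background.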

  16. Improving breast cancer control among Latinas: evaluation of a theory-based educational program.

    PubMed

    Mishra, S I; Chavez, L R; Magaña, J R; Nava, P; Burciaga Valdez, R; Hubbell, F A

    1998-10-01

    The study evaluated a theory-based breast cancer control program specially developed for less acculturated Latinas. The authors used a quasi-experimental design with random assignment of Latinas into experimental (n = 51) or control (n = 37) groups that completed one pretest and two posttest surveys. The experimental group received the educational program, which was based on Bandura's self-efficacy theory and Freire's empowerment pedagogy. Outcome measures included knowledge, perceived self-efficacy, attitudes, breast self-examination (BSE) skills, and mammogram use. At posttest 1, controlling for pretest scores, the experimental group was significantly more likely than the control group to have more medically recognized knowledge (sum of square [SS] = 17.0, F = 6.58, p < .01), have less medically recognized knowledge (SS = 128.8, F = 39.24, p < .001), greater sense of perceived self-efficacy (SS = 316.5, F = 9.63, p < .01), and greater adeptness in the conduct of BSE (SS = 234.8, F = 153.33, p < .001). Cancer control programs designed for less acculturated women should use informal and interactive educational methods that incorporate skill-enhancing and empowering techniques. PMID:9768384

  17. A Theory-Based Approach to Teaching Young Children about Health: A Recipe for Understanding

    ERIC Educational Resources Information Center

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children's concepts are integrated into theories. Concept-learning studies have documented the central role that theories play in children's learning of experimenter-defined categories but have yet to extensively examine complex, real-world concepts, such as health. The present study…

  18. Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    1998-01-01

    A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting Scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and its accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on gas-kinetic theory is presented. The flux construction strategy may shed some light on possible modifications of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as on the development of new schemes for a non-strictly hyperbolic system.

  19. Predictive models based on sensitivity theory and their application to practical shielding problems

    SciTech Connect

    Bhuiyan, S.I.; Roussin, R.W.; Lucius, J.L.; Bartine, D.E.

    1983-01-01

    Two new calculational models based on the use of cross-section sensitivity coefficients have been devised for calculating radiation transport in relatively simple shields. The two models, one an exponential model and the other a power model, have been applied, together with the traditional linear model, to 1- and 2-m-thick concrete-slab problems in which the water content, reinforcing-steel content, or composition of the concrete was varied. Comparing the results obtained with the three models with those obtained from exact one-dimensional discrete-ordinates transport calculations indicates that the exponential model, named the BEST model (for basic exponential shielding trend), is a particularly promising predictive tool for shielding problems dominated by exponential attenuation. When applied to a deep-penetration sodium problem, the BEST model also yields better results than do calculations based on second-order sensitivity theory.
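A minimal sketch of the three predictive models compared above (linear, exponential/BEST, and power), each built from the same first-order sensitivity coefficients. The coefficient and parameter-change values are illustrative, not taken from the paper.

```python
import math

# S holds relative sensitivity coefficients (dR/R per dp/p) for each
# cross-section parameter; f holds the parameter ratios p'/p.
# All numbers are invented for illustration.

def linear(R0, S, f):
    """Linear model: first-order Taylor expansion in the parameter changes."""
    return R0 * (1.0 + sum(s * (fi - 1.0) for s, fi in zip(S, f)))

def exponential(R0, S, f):
    """BEST model: exponentiate the first-order term (exponential attenuation)."""
    return R0 * math.exp(sum(s * (fi - 1.0) for s, fi in zip(S, f)))

def power(R0, S, f):
    """Power model: multiplicative in the parameter ratios."""
    prod = 1.0
    for s, fi in zip(S, f):
        prod *= fi ** s
    return R0 * prod

R0 = 1.0           # reference response behind the slab (arbitrary units)
S = [-2.5, 0.4]    # illustrative sensitivities (e.g. water, steel content)
f = [1.10, 0.95]   # parameters changed by +10% and -5%
```

All three models agree to first order for small parameter changes; they diverge for large changes, which is where the exponential (BEST) form wins for attenuation-dominated problems.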

  20. Optimization of a photovoltaic pumping system based on the optimal control theory

    SciTech Connect

    Betka, A.; Attali, A.

    2010-07-15

    This paper shows how optimal operation of a photovoltaic pumping system based on an induction motor driving a centrifugal pump can be realized. The optimization problem consists in maximizing the daily pumped water quantity by optimizing the motor efficiency at every operating point. The proposed structure simultaneously provides machine-loss minimization, field-oriented control, and maximum power tracking of the photovoltaic array, attained by means of multi-input multi-output optimal regulator theory. The effectiveness of the proposed algorithm is demonstrated by simulation, and the results are compared to those of a system working with a constant air-gap flux. (author)

  1. Re-Examining of Moffitt’s Theory of Delinquency through Agent Based Modeling

    PubMed Central

    Leaw, Jia Ning; Ang, Rebecca P.; Huan, Vivien S.; Chan, Wei Teng; Cheong, Siew Ann

    2015-01-01

    Moffitt’s theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, with social interactions between these two groups becoming important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, and indeed found the two groups emerging in our simulations. Moreover, through an intervention simulation in which we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life-course outcome. PMID:26062022
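The imitation mechanism described above can be sketched with a well-mixed toy population: each agent copies another agent's antisocial level whenever that level would pay off better under its own maturity gap. The payoff function, gap values, and all parameters are invented for illustration.

```python
import random

# Toy sketch of payoff-driven imitation: reward of antisocial behavior
# rises with the maturity gap, at a quadratic cost. All numbers invented.

random.seed(0)
N = 50

def payoff(antisocial: float, maturity_gap: float) -> float:
    return maturity_gap * antisocial - 0.6 * antisocial ** 2

agents = [{"antisocial": random.random(), "gap": random.choice([0.2, 1.0])}
          for _ in range(N)]

def step(agents):
    """Each agent imitates a random other agent if that agent's level
    would yield a higher payoff given its own maturity gap."""
    for a in agents:
        other = random.choice(agents)
        if payoff(other["antisocial"], a["gap"]) > payoff(a["antisocial"], a["gap"]):
            a["antisocial"] = other["antisocial"]

for _ in range(100):
    step(agents)

high = [a["antisocial"] for a in agents if a["gap"] == 1.0]
low = [a["antisocial"] for a in agents if a["gap"] == 0.2]
```

Under these assumptions, the high-gap and low-gap agents settle at different antisocial levels, i.e. two behavioral groups emerge from one imitation rule.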

  2. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    PubMed

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/. PMID:26313379

  3. Buckling Analysis for Stiffened Anisotropic Circular Cylinders Based on Sanders Nonlinear Shell Theory

    NASA Technical Reports Server (NTRS)

    Nemeth, Michael P.

    2014-01-01

    Nonlinear and bifurcation buckling equations for elastic, stiffened, geometrically perfect, right-circular cylindrical, anisotropic shells subjected to combined loads are presented that are based on Sanders' shell theory. Based on these equations, a three-parameter approximate Rayleigh-Ritz solution and a classical solution to the buckling problem are presented for cylinders with simply supported edges. Extensive comparisons of results obtained from these solutions with published results are also presented for a wide range of cylinder constructions. These comparisons include laminated-composite cylinders with a wide variety of shell-wall orthotropies and anisotropies. Numerous results are also given that show the discrepancies between the results obtained by using Donnell's equations and variants of Sanders' equations. For some cases, nondimensional parameters are identified and "master" curves are presented that facilitate the concise representation of results.

  4. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies

    PubMed Central

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/. PMID:26313379

  5. Combinatorial density functional theory-based screening of surface alloys for the oxygen reduction reaction.

    SciTech Connect

    Greeley, J.; Norskov, J.; Center for Nanoscale Materials; Technical Univ. of Denmark

    2009-03-26

    A density functional theory (DFT)-based, combinatorial search for improved oxygen reduction reaction (ORR) catalysts is presented. A descriptor-based approach to estimate the ORR activity of binary surface alloys, wherein alloying occurs only in the surface layer, is described, and rigorous, potential-dependent computational tests of the stability of these alloys in aqueous, acidic environments are presented. These activity and stability criteria are applied to a database of DFT calculations on nearly 750 binary transition metal surface alloys; of these, many are predicted to be active for the ORR but, with few exceptions, they are found to be thermodynamically unstable in the acidic environments typical of low-temperature fuel cells. The results suggest that, absent other thermodynamic or kinetic mechanisms to stabilize the alloys, surface alloys are unlikely to serve as useful ORR catalysts over extended periods of operation.
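The two-stage activity-plus-stability screen described above can be sketched as a pair of filters over a candidate table. The alloy names, descriptor values, activity window, and operating potential below are all hypothetical, not the paper's DFT data.

```python
# Hedged sketch of a descriptor-based catalyst screen: keep alloys whose
# O* binding-energy descriptor lies near an assumed optimum (activity)
# and whose dissolution potential exceeds the operating potential
# (stability). Every number here is made up for illustration.

OPTIMUM_EO = 0.0        # assumed optimal O* binding offset (eV)
ACTIVITY_WINDOW = 0.2   # eV window around the optimum counted as "active"
U_OPERATING = 0.9       # assumed fuel-cell operating potential (V)

alloys = [
    # (name, O* binding offset from optimum [eV], dissolution potential [V])
    ("Pt3Ni-skin",  0.10, 1.0),
    ("Pt3Co-skin",  0.15, 0.8),
    ("Au3Fe-skin", -0.50, 1.2),
]

def screen(alloys):
    active = [a for a in alloys if abs(a[1] - OPTIMUM_EO) <= ACTIVITY_WINDOW]
    return [a for a in active if a[2] > U_OPERATING]

survivors = screen(alloys)
```

The abstract's central finding corresponds to the second filter rejecting most of what the first filter passes.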

  6. Can functionalized cucurbituril bind actinyl cations efficiently? A density functional theory based investigation.

    PubMed

    Sundararajan, Mahesh; Sinha, Vivek; Bandyopadhyay, Tusar; Ghosh, Swapan K

    2012-05-01

    The feasibility of using the cucurbituril host molecule as a candidate actinyl cation binder is investigated through density functional theory based calculations. Various possible binding sites of the cucurbit[5]uril host molecule to uranyl are analyzed and, based on the binding energy evaluations, η5-binding is predicted to be favored. For this coordination, the structure, vibrational spectra, and binding energies are evaluated for the binding of three actinyls in hexavalent and pentavalent oxidation states with functionalized cucurbiturils. Functionalizing cucurbituril with methyl and cyclohexyl groups increases the binding affinities of actinyls, whereas fluorination decreases the binding affinities as compared to the native host molecule. Surprisingly, hydroxylation of the host molecule does not distinguish the oxidation state of the three actinyls. PMID:22471316

  7. Unit Template Synchronous Reference Frame Theory Based Control Algorithm for DSTATCOM

    NASA Astrophysics Data System (ADS)

    Bangarraju, J.; Rajagopal, V.; Jayalaxmi, A.

    2014-04-01

    This article proposes new and simplified unit templates, instead of a standard phase-locked loop (PLL), for the synchronous reference frame theory (SRFT) control algorithm. The extraction of the synchronizing components (sinθ and cosθ) for Park's and inverse Park's transformations using a standard PLL takes more execution time, which delays the generation of the reference source currents. The standard PLL not only takes more execution time but also increases the reactive power burden on the distribution static compensator (DSTATCOM). This work proposes a unit-template-based SRFT control algorithm for a four-leg, insulated-gate-bipolar-transistor-based voltage source converter for DSTATCOM in distribution systems, which reduces both the execution time and the reactive power burden on the DSTATCOM. The proposed DSTATCOM suppresses harmonics and regulates the terminal voltage, along with neutral current compensation. The DSTATCOM in distribution systems with the proposed control algorithm is modeled and simulated in MATLAB using the Simulink and SimPowerSystems toolboxes.

  8. Optimal control of ICU patient discharge: from theory to implementation.

    PubMed

    Mallor, Fermín; Azcárate, Cristina; Barado, Julio

    2015-09-01

    This paper deals with the management of scarce health care resources. We consider a control problem in which the objective is to minimize the rate of patient rejection due to service saturation. The scope of decisions is limited, in terms both of the amount of resources to be used, which are supposed to be fixed, and of the patient arrival pattern, which is assumed to be uncontrollable. This means that the only potential areas of control are speed or completeness of service. By means of queuing theory and optimization techniques, we provide a theoretical solution expressed in terms of service rates. In order to make this theoretical analysis useful for the effective control of the healthcare system, however, further steps in the analysis of the solution are required: physicians need flexible and medically-meaningful operative rules for shortening patient length of service to the degree needed to give the service rates dictated by the theoretical analysis. The main contribution of this paper is to discuss how the theoretical solutions can be transformed into effective management rules to guide doctors' decisions. The study examines three types of rules based on intuitive interpretations of the theoretical solution. Rules are evaluated through implementation in a simulation model. We compare the service rates provided by the different policies with those dictated by the theoretical solution. Probabilistic analysis is also included to support rule validity. An Intensive Care Unit is used to illustrate this control problem. The study focuses on the Markovian case before moving on to consider more realistic LoS distributions (Weibull, lognormal, and phase-type). PMID:25763761
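For the Markovian case above, an ICU with c beds, Poisson arrivals, and exponential length of stay is an M/M/c/c loss system, so the rejection probability follows the Erlang B formula; raising the service rate (shortening length of stay) lowers rejection. The arrival and service rates below are illustrative, not the paper's data.

```python
# Erlang B blocking probability via the standard stable recursion
# B(0) = 1, B(k) = a*B(k-1) / (k + a*B(k-1)), with offered load a = lam/mu.

def erlang_b(c: int, offered_load: float) -> float:
    """Rejection (blocking) probability for an M/M/c/c loss system."""
    b = 1.0
    for k in range(1, c + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

lam, mu, beds = 2.0, 0.25, 10           # arrivals/day, 1/LoS (1/days), beds -- illustrative
p_reject = erlang_b(beds, lam / mu)     # rejection at the current service rate
p_faster = erlang_b(beds, lam / 0.30)   # rejection with a slightly shortened LoS
```

This makes the paper's control lever explicit: with beds and arrivals fixed, the only way down the Erlang B curve is a higher service rate.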

  9. Applying Constructivist and Objectivist Learning Theories in the Design of a Web-based Course: Implications for Practice.

    ERIC Educational Resources Information Center

    Moallem, Mahnaz

    2001-01-01

    Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…

  10. Einstein Critical-Slowing-Down is Siegel CyberWar Denial-of-Access Queuing/Pinning/ Jamming/Aikido Via Siegel DIGIT-Physics BEC ``Intersection''-BECOME-UNION Barabasi Network/GRAPH-Physics BEC: Strutt/Rayleigh-Siegel Percolation GLOBALITY-to-LOCALITY Phase-Transition Critical-Phenomenon

    NASA Astrophysics Data System (ADS)

    Buick, Otto; Falcon, Pat; Alexander, G.; Siegel, Edward Carl-Ludwig

    2013-03-01

    Einstein[Dover(03)] critical-slowing-down(CSD)[Pais, Subtle in The Lord; Life & Sci. of Albert Einstein(81)] is Siegel CyberWar denial-of-access(DOA) operations-research queuing theory/pinning/jamming/.../Read [Aikido, Aikibojitsu & Natural-Law(90)]/Aikido(!!!) phase-transition critical-phenomenon via Siegel DIGIT-Physics (Newcomb[Am.J.Math. 4,39(1881)]-{Planck[(1901)]-Einstein[(1905)])-Poincare[Calcul Probabilités(12)-p.313]-Weyl [Goett.Nachr.(14); Math.Ann.77,313 (16)]-{Bose[(24)-Einstein[(25)]-Fermi[(27)]-Dirac[(1927)]}-``Benford''[Proc.Am.Phil.Soc. 78,4,551 (38)]-Kac[Maths.Stat.-Reasoning(55)]-Raimi[Sci.Am. 221,109 (69)...]-Jech[preprint, PSU(95)]-Hill[Proc.AMS 123,3,887(95)]-Browne[NYT(8/98)]-Antonoff-Smith-Siegel[AMS Joint-Mtg.,S.-D.(02)] algebraic-inversion to yield ONLY BOSE-EINSTEIN QUANTUM-statistics (BEQS) with ZERO-digit Bose-Einstein CONDENSATION(BEC) ``INTERSECTION''-BECOME-UNION to Barabasi[PRL 876,5632(01); Rev.Mod.Phys.74,47(02)...] Network /Net/GRAPH(!!!)-physics BEC: Strutt/Rayleigh(1881)-Polya(21)-``Anderson''(58)-Siegel[J.Non-crystalline-Sol.40,453(80)

  11. A Neurosemantic Theory of Concrete Noun Representation Based on the Underlying Brain Codes

    PubMed Central

    Just, Marcel Adam; Cherkassky, Vladimir L.; Aryal, Sandesh; Mitchell, Tom M.

    2010-01-01

    This article describes the discovery of a set of biologically-driven semantic dimensions underlying the neural representation of concrete nouns, and then demonstrates how a resulting theory of noun representation can be used to identify simple thoughts through their fMRI patterns. We use factor analysis of fMRI brain imaging data to reveal the biological representation of individual concrete nouns like apple, in the absence of any pictorial stimuli. From this analysis emerge three main semantic factors underpinning the neural representation of nouns naming physical objects, which we label manipulation, shelter, and eating. Each factor is neurally represented in 3–4 different brain locations that correspond to a cortical network that co-activates in non-linguistic tasks, such as tool use pantomime for the manipulation factor. Several converging methods, such as the use of behavioral ratings of word meaning and text corpus characteristics, provide independent evidence of the centrality of these factors to the representations. The factors are then used with machine learning classifier techniques to show that the fMRI-measured brain representation of an individual concrete noun like apple can be identified with good accuracy from among 60 candidate words, using only the fMRI activity in the 16 locations associated with these factors. To further demonstrate the generativity of the proposed account, a theory-based model is developed to predict the brain activation patterns for words to which the algorithm has not been previously exposed. The methods, findings, and theory constitute a new approach of using brain activity for understanding how object concepts are represented in the mind. PMID:20084104

  12. Surface collision theory for suspension-based cleaning of particle-contaminated solid substrates

    NASA Astrophysics Data System (ADS)

    Andreev, V. A.; Prausnitz, J. M.; Radke, C. J.

    2011-03-01

    To quantify removal kinetics of contaminant particles on solid surfaces, we study collisions between nonspherical particles when one particle is suspended in laminar shear flow while the second is adhered to a solid surface. Based on kinetic theory of rigid nonspherical particles, we outline a theoretical framework for our previously developed binary-collision contaminant-removal model. We show that a distribution of adhered contaminant particles over orientation, size, and shape results in multiexponential decay of surface concentration of particles with time, in agreement with experimental findings [Andreev et al., J. Electrochem. Soc. 158, H55 (2011)]. Theory predicts a linear increase of removal rate constant with shear rate and with suspended solids concentration near the substrate surface, also in agreement with experiment [Andreev et al., J. Electrochem. Soc. 158, H55 (2011); Ind. Eng. Chem. Res. 49, 12461 (2010)]. To reveal the effect of geometry and size of colliding entrained particles on removal rates, an approximate singlet distribution function is derived for particles in flow at the level of the Smoluchowski theory for orthocoagulation. Two shapes of flow-suspended particles are considered: spheres and cuboids with high aspect ratio, while contaminant particles on the surface are small and spherical. Removal kinetic rate constants scale with the contaminant particle size aA as aA^(3/2) for spheres and as aA for cuboids. Thus, rectangular platelet particles are effective for removal of small contaminant particles, confirming experimental observation [Andreev et al., J. Electrochem. Soc. 158, H55 (2011)]. The influence of platelet aspect ratio on removal rates is analyzed. Due to interplay between solids velocity and collision cross section, small aspect ratios improve cleaning efficiency when the size ratio of the entrained to contaminant particles is large.

  13. Contextualization and standardization of the supportive leadership behavior questionnaire based on socio-cognitive theory in Iran

    PubMed Central

    Shirazi, Mandana; Emami, Amir Hosein; Mirmoosavi, Seyed Jamal; Alavinia, Seyed Mohammad; Zamanian, Hadi; Fathollahbeigi, Faezeh; Masiello, Italo

    2014-01-01

    Background: Effective leadership is of prime importance in any organization, and it evolves based on accepted health promotion and behavior change theory. Although there are many leadership styles, transformational leadership, which emphasizes supportive leadership behaviors, seems an appropriate style in many settings, particularly in the health care and educational sectors, which are pressured by high turnover and safety demands. Iran has been moving rapidly forward, and its authorities have recognized the importance of matching leadership styles with effective and competent care for success in health care organizations. This study aimed to develop the Supportive Leadership Behaviors Scale based on accepted health and educational theories and to test it psychometrically in the Iranian context. Methods: The instrument was based on items from established questionnaires. A pilot study validated the instrument, which was also cross-validated via re-translation. After validation, 731 participants answered the questionnaire. Results: The instrument was finalized as a 20-item questionnaire; exploratory factor analysis yielded four factors (support for development, integrity, sincerity, and recognition) explaining the supportive leadership behaviors (all above 0.6). Mapping these four measures of leadership behavior can help determine whether effective leadership could support innovation and improvement in medical education and health care organizations at the national level. The reliability, measured as Cronbach’s alpha, was 0.84. Conclusion: The new instrument, with its four factors of support for development, integrity, sincerity, and recognition, is applicable in health and educational settings and helpful in improving self-efficacy among health and academic staff. PMID:25679004

  14. Motivational cues predict the defensive system in team handball: A model based on regulatory focus theory.

    PubMed

    Debanne, T; Laffaye, G

    2015-08-01

    This study was based on the naturalistic decision-making paradigm and regulatory focus theory. Its aim was to model coaches' decision-making processes for handball teams' defensive systems based on relevant cues of the reward structure, and to determine the weight of each of these cues. We collected raw data by video-recording 41 games that were selected using a simple random method. We considered the defensive strategy (DEF: aligned or staged) to be the dependent variable, and the three independent variables were (a) numerical difference between the teams; (b) score difference between the teams; and (c) game periods. We used a logistic regression design (logit model) and a multivariate logistic model to explain the link between DEF and the three category independent variables. Each factor was weighted differently during the decision-making process to select the defensive system, and combining these variables increased the impact on this process; for instance, a staged defense is 43 times more likely to be chosen during the final period in an unfavorable situation and in a man advantage. Finally, this shows that the coach's decision-making process could be based on a simple match or could require a diagnosis of the situation based on the relevant cues. PMID:25262855
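The logit structure described above maps the three cues to a probability of choosing the staged defense. The coefficients below are invented for illustration (the study reports odds ratios such as the ~43x figure, not these values).

```python
import math

# Sketch of a logit model over the abstract's three cues:
# numerical (man) advantage, score difference, and game period.
# All coefficient values are hypothetical.

def p_staged(beta0: float, betas: list, cues: list) -> float:
    """Probability of choosing the staged defense given cue values."""
    z = beta0 + sum(b * x for b, x in zip(betas, cues))
    return 1.0 / (1.0 + math.exp(-z))

beta0 = -2.0
betas = [0.8, -0.6, 1.1]                      # man advantage, score diff, final period
calm   = p_staged(beta0, betas, [0, 0, 0])    # neutral situation
urgent = p_staged(beta0, betas, [1, -3, 1])   # man up, trailing by 3, final period
```

The odds ratio between two situations is exp of the difference in their linear predictors, which is how a single combined scenario can multiply the odds of a staged defense by a large factor.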

  15. Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.

    PubMed

    Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  16. Securing Mobile Ad Hoc Networks Using Danger Theory-Based Artificial Immune Algorithm

    PubMed Central

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  17. Investigations into Generalization of Constraint-Based Scheduling Theories with Applications to Space Telescope Observation Scheduling

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Smith, Steven S.

    1996-01-01

    This final report summarizes research performed under NASA contract NCC 2-531 toward generalization of constraint-based scheduling theories and techniques for application to space telescope observation scheduling problems. Our work into theories and techniques for solution of this class of problems has led to the development of the Heuristic Scheduling Testbed System (HSTS), a software system for integrated planning and scheduling. Within HSTS, planning and scheduling are treated as two complementary aspects of the more general process of constructing a feasible set of behaviors of a target system. We have validated the HSTS approach by applying it to the generation of observation schedules for the Hubble Space Telescope. This report summarizes the HSTS framework and its application to the Hubble Space Telescope domain. First, the HSTS software architecture is described, indicating (1) how the structure and dynamics of a system is modeled in HSTS, (2) how schedules are represented at multiple levels of abstraction, and (3) the problem solving machinery that is provided. Next, the specific scheduler developed within this software architecture for detailed management of Hubble Space Telescope operations is presented. Finally, experimental performance results are given that confirm the utility and practicality of the approach.

  18. New Approach to Optimize the APFs Placement Based on Instantaneous Reactive Power Theory by Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Hashemi-Dezaki, Hamed; Mohammadalizadeh-Shabestary, Masoud; Askarian-Abyaneh, Hossein; Rezaei-Jegarluei, Mohammad

    2014-01-01

    In electrical distribution systems, a great amount of power is wasted along the lines, and the power factors, voltage profiles, and total harmonic distortions (THDs) of most loads are not as desired. These parameters therefore play a highly important role in wasting money and energy, while both consumers and sources suffer from high distortion rates and even instabilities. Active power filters (APFs) are an innovative remedy for this adversity and have recently employed instantaneous reactive power theory. In this paper, a novel method is proposed to optimize the allocation of APFs. The introduced method is based on the instantaneous reactive power theory in vectorial representation, which makes it possible to assess different compensation strategies. Proper placement of APFs in the system also plays a crucial role in both reducing loss costs and improving power quality. To optimize the APF placement, a new objective function is defined on the basis of five terms: total losses, power factor, voltage profile, THD, and cost. A genetic algorithm is used to solve the optimization problem, and the results of applying the method to a distribution network illustrate its advantages.
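A toy genetic algorithm in the spirit of the placement search described above: a bitstring marks which buses receive an APF, and a weighted objective trades assumed improvement against installation cost. The per-bus benefit figures, cost, and GA settings are all invented for illustration.

```python
import random

# Toy GA for filter placement. BENEFIT collapses the paper's five objective
# terms (losses, power factor, voltage profile, THD) into one assumed
# per-bus improvement score; COST_PER_APF is an assumed normalized cost.

random.seed(1)

BENEFIT = [0.9, 0.2, 0.7, 0.1, 0.8, 0.3]   # assumed improvement per bus
COST_PER_APF = 0.35                         # assumed cost per installed filter

def fitness(bits):
    """Weighted objective: total benefit minus total installation cost."""
    return sum(b for b, on in zip(BENEFIT, bits) if on) - COST_PER_APF * sum(bits)

def mutate(bits, rate=0.2):
    return [b ^ (random.random() < rate) for b in bits]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def ga(pop_size=20, generations=40, n=len(BENEFIT)):
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]            # keep the best half
        pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = ga()
```

With these numbers the optimum is to equip only the buses whose benefit exceeds the filter cost; the elitist GA converges to that placement in this tiny search space.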

  19. Informational Theory of Aging: The Life Extension Method Based on the Bone Marrow Transplantation

    PubMed Central

    Karnaukhov, Alexey V.; Karnaukhova, Elena V.; Sergievich, Larisa A.; Karnaukhova, Natalia A.; Bogdanenko, Elena V.; Manokhina, Irina A.; Karnaukhov, Valery N.

    2015-01-01

    A method of lifespan extension that is a practical application of the informational theory of aging is proposed. In this theory, the degradation (error accumulation) of the genetic information in cells is considered a main cause of aging. Accordingly, our method is based on the transplantation of genetically identical (or similar) stem cells with a lower number of genomic errors into old recipients. For humans and large mammals, this method can be realized by cryopreservation of their own stem cells, taken at a young age, for later autologous transplantation in old age. To test this method experimentally, we chose a laboratory animal of relatively short lifespan (the mouse). Because it is difficult to isolate the required amount of stem cells (e.g., bone marrow) without significant harm to the animals, we used bone marrow transplantation from sacrificed inbred young donors. It is shown that the lifespan extension of recipients depends on the level of their genetic similarity (syngeneity) with donors. We achieved a lifespan increase of 34% in the experimental mice when transplantation of bone marrow with a high level of genetic similarity was used. PMID:26491435

  20. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    DOE PAGES Beta

    Gao, Kai; Chung, Eric T.; Gibson, Richard L.; Fu, Shubin; Efendiev, Yalchin

    2015-06-05

    The development of reliable methods for upscaling fine scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that is similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity where the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.

  1. An integrated finite element simulation of cardiomyocyte function based on triphasic theory

    PubMed Central

    Hatano, Asuka; Okada, Jun-Ichi; Washio, Takumi; Hisada, Toshiaki; Sugiura, Seiryo

    2015-01-01

    In numerical simulations of cardiac excitation-contraction coupling, the intracellular potential distribution and mobility of cytosol and ions have been mostly ignored. Although the intracellular potential gradient is small, during depolarization it can be a significant driving force for ion movement, and is comparable to diffusion in terms of net flux. Furthermore, fluid in the t-tubules is thought to advect ions to facilitate their exchange with the extracellular space. We extend our previous finite element model that was based on triphasic theory to examine the significance of these factors in cardiac physiology. Triphasic theory allows us to study the behavior of solids (proteins), fluids (cytosol) and ions governed by mechanics and electrochemistry in detailed subcellular structures, including myofibrils, mitochondria, the sarcoplasmic reticulum, membranes, and t-tubules. Our simulation results predicted an electrical potential gradient inside the t-tubules at the onset of depolarization, which corresponded to the Na+ channel distribution therein. Ejection and suction of fluid between the t-tubules and the extracellular compartment during isometric contraction were observed. We also examined the influence of t-tubule morphology and mitochondrial location on the electrophysiology and mechanics of the cardiomyocyte. Our results confirm that the t-tubule structure is important for synchrony of Ca2+ release, and suggest that mitochondria in the sub-sarcolemmal region might serve to cancel Ca2+ inflow through surface sarcolemma, thereby maintaining the intracellular Ca2+ environment in equilibrium. PMID:26539124

  2. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.
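
    The through-the-thickness analysis described above reduces to a Sturm-Liouville eigenproblem in the thickness coordinate. As a hedged illustration (a generic problem -(a(z) u')' = lam b(z) u with free ends, not the paper's actual plate operator), a finite-difference sketch:

```python
import numpy as np
from scipy.linalg import eigh

def thickness_modes(a, b, h, nmodes=4):
    """Eigenpairs of -(a(z) u')' = lam * b(z) u on a uniform 1-D grid
    with free (Neumann) ends, via a symmetric finite-difference assembly.
    a, b: coefficient arrays at the n grid nodes; h: grid spacing."""
    n = len(a)
    A = np.zeros((n, n))
    B = np.diag(b * h)
    for i in range(n - 1):                   # one stiffness entry per interval
        k = 0.5 * (a[i] + a[i + 1]) / h      # coefficient at the midpoint
        A[i, i] += k
        A[i + 1, i + 1] += k
        A[i, i + 1] -= k
        A[i + 1, i] -= k
    lam, U = eigh(A, B)                      # generalized symmetric eigenproblem
    return lam[:nmodes], U[:, :nmodes]

# Two-layer "laminate": tenfold stiffness jump at mid-thickness (illustrative)
z = np.linspace(0.0, 1.0, 201)
a = np.where(z < 0.5, 1.0, 10.0)             # layer moduli
b = np.ones_like(z)                          # layer "densities"
lam, modes = thickness_modes(a, b, z[1] - z[0])
# lam[0] ~ 0 with a constant eigenfunction (rigid mode); higher modes are
# continuous but their derivatives kink at the material interface
```

    For such a two-layer stack the computed eigenfunctions are continuous while their derivatives jump at the interface, mirroring the discontinuities in the displacement derivatives that the abstract describes.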

  3. Gradient projection for reliability-based design optimization using evidence theory

    NASA Astrophysics Data System (ADS)

    Alyanak, Edward; Grandhi, Ramana; Bae, Ha-Rok

    2008-10-01

    Uncertainty quantification and risk assessment in the optimal design of structural systems have always been critical considerations for engineers. When new technologies are developed or implemented and budgets are limited for full-scale testing, the result is insufficient data for constructing probability distributions. Making assumptions about these probability distributions can potentially introduce more uncertainty to the system than it quantifies. Evidence theory is a method for handling epistemic uncertainty, which reflects a lack of knowledge or information in the numerical optimization process. It is therefore a natural tool for uncertainty quantification and risk assessment, especially in the design optimization cycle for future aerospace structures where new technologies are being applied. For evidence theory to be recognized as a useful tool, it must be applied efficiently in a robust design optimization scheme. This article demonstrates a new method for projecting the reliability gradient, based on the measures of belief and plausibility, without gathering any information beyond what is required to determine these measures. This represents a large saving in computational time over other methods available in the current literature. The technique developed in this article is demonstrated with three optimization examples.
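
    The belief and plausibility measures of evidence (Dempster-Shafer) theory are computed from a basic probability assignment (BPA) over focal elements; a minimal sketch with an illustrative BPA:

```python
def belief(bpa, event):
    """Bel(A): total mass of focal elements contained in A."""
    return sum(m for s, m in bpa.items() if s <= event)

def plausibility(bpa, event):
    """Pl(A): total mass of focal elements that intersect A."""
    return sum(m for s, m in bpa.items() if s & event)

# Illustrative BPA on the frame {fail, safe}: 0.2 of the evidence mass
# supports {fail}, 0.5 supports {safe}, and 0.3 is uncommitted
# (ignorance), so it sits on the whole frame.
bpa = {frozenset({"fail"}): 0.2,
       frozenset({"safe"}): 0.5,
       frozenset({"fail", "safe"}): 0.3}
A = frozenset({"safe"})
bel, pl = belief(bpa, A), plausibility(bpa, A)   # Bel = 0.5, Pl = 0.8
```

    The interval [Bel(A), Pl(A)] brackets the unknown probability of A, which is exactly the quantity the reliability gradient above is projected from.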

  4. Effective meson masses in nuclear matter based on a cutoff field theory

    SciTech Connect

    Nakano, M.; Noda, N.; Mitsumori, T.; Koide, K.; Kouno, H.; Hasegawa, A.

    1997-02-01

    Effective masses of the σ, ω, π, and ρ mesons in nuclear matter are calculated based on a cutoff field theory. Instead of the traditional density-Feynman representation, we adopt the particle-hole-antiparticle representation for nuclear propagators so that unphysical components are not included in the meson self-energies. For an estimation of the contribution from the divergent particle-antiparticle excitations, i.e., vacuum polarization in nuclear matter, the idea of the renormalization group method is adopted. In this cutoff field theory, all the counterterms are finite and calculated numerically. It is shown that the predicted meson masses converge even if the cutoff Λ is changed, as long as Λ is sufficiently large, and that the prescription works well also for so-called nonrenormalized mesons such as π and ρ. According to this method, it is concluded that meson masses in nuclear matter have a weak dependence on the baryon density. © 1997 The American Physical Society

  5. Grand unification and proton stability based on a chiral SU(8) theory

    SciTech Connect

    Deshpande, N.G.; Mannheim, P.D.

    1980-06-01

    A grand-unified model of the strong, electromagnetic, and weak interactions is presented based on a local SU(8)_L × SU(8)_R gauge theory that possesses a global U(8)_L × U(8)_R invariance. The model is spontaneously broken by the recently introduced neutrino pairing mechanism, in which a Higgs field that transforms like a pair of right-handed neutrinos acquires a vacuum expectation value. This neutrino pairing breaks the model down to the standard Weinberg-Salam phenomenology. Further, the neutrino pairing causes the two initial global currents of the model, fermion number and axial fermion number, to mix with the non-Abelian local currents, leaving unbroken two new global currents, namely, baryon number and a particular lepton number that counts charged leptons and left-handed neutrinos only. The exact conservation of these two resulting currents ensures the absolute stability of the proton, the masslessness of the observed left-handed neutrinos, and the standard lepton number conservation of the usual weak interactions. A further feature of the model is the simultaneous absence of both strong CP violations and observable axions. The model has a testable prediction, namely, the existence of an absolutely stable, relatively light, massive neutral lepton generated entirely from the right-handed neutrino sector of the theory.

  6. Quantitative characterization of advanced porous ceramics based on a probabilistic theory of ultrasonic wave propagation

    NASA Astrophysics Data System (ADS)

    Fan, Qiulin; Takatsubo, Junji; Yamamoto, Shigeyuki

    1999-10-01

    A straightforward nondestructive method based on the probabilistic theory of ultrasonic wave propagation [JSME Int. J., Ser. A, Mech. Mater. Eng. 39, 266 (1996)] was developed to quantitatively evaluate porosities, pore shapes, and pore sizes in advanced porous ceramics merely by measuring the ultrasonic delay time and pulse width. Extensive ultrasonic measurements and image microanalyses were conducted on advanced porous alumina, sialon, and zirconia with different porosities. A universal equation was established for porous ceramics, clarifying the intrinsic relationships between ultrasonic characteristics (propagation time and pulse width) and pore distribution (porosity, pore shape, and pore size). The critical volume-fraction porosities, at which the transition from a continuous to a discontinuous pore phase takes place during sintering, were estimated separately as approximately 0.06, 0.11, and 0.10 in these ceramics using image microanalysis techniques. The excellent agreement of two useful corollaries with the data from the above nondestructive and destructive examinations validates the quantitative applicability of the probabilistic theory to pore characterization of advanced ceramics, metals, and their combinations.

  7. Informational Theory of Aging: The Life Extension Method Based on the Bone Marrow Transplantation.

    PubMed

    Karnaukhov, Alexey V; Karnaukhova, Elena V; Sergievich, Larisa A; Karnaukhova, Natalia A; Bogdanenko, Elena V; Manokhina, Irina A; Karnaukhov, Valery N

    2015-01-01

    A method of lifespan extension that is a practical application of the informational theory of aging is proposed. In this theory, the degradation (error accumulation) of the genetic information in cells is considered the main cause of aging. Accordingly, the method is based on transplanting genetically identical (or similar) stem cells, carrying a lower number of genomic errors, into old recipients. For humans and large mammals, this can be realized by cryopreserving their own stem cells, taken at a young age, for later autologous transplantation in old age. To test the method experimentally, we chose a laboratory animal with a relatively short lifespan (the mouse). Because it is difficult to isolate the required amount of stem cells (e.g., bone marrow) without significant damage to the animals, we used bone marrow transplantation from sacrificed inbred young donors. It is shown that the lifespan extension of recipients depends on the level of their genetic similarity (syngeneity) with the donors. We achieved a 34% increase in the lifespan of the experimental mice when bone marrow with a high level of genetic similarity was transplanted. PMID:26491435

  8. Cartographic generalization of urban street networks based on gravitational field theory

    NASA Astrophysics Data System (ADS)

    Liu, Gang; Li, Yongshu; Li, Zheng; Guo, Jiawei

    2014-05-01

    The automatic generalization of urban street networks is a long-standing and important problem in geographical information science. Previous studies show that the dual graph of street-street relationships reflects the overall morphological properties and importance of streets more accurately than other methods do. In this study, we construct a dual graph to represent street-street relationships and propose an approach to generalize street networks based on gravitational field theory. We retain the global structural properties and topological connectivity of the original street network and borrow from gravitational field theory to define the gravitational force between nodes. The concept of multi-order neighbors is introduced, and the gravitational force is taken as the measure of the importance contribution between nodes. The importance of a node is defined as the result of the interaction between a given node and its multi-order neighbors. Degree distribution is used to evaluate how well the global structure and topological characteristics of a street network are maintained and to illustrate the efficiency of the suggested method. Experimental results indicate that the proposed approach can generalize street networks while retaining their density characteristics, connectivity and global structure.
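
    A toy version of the importance measure described here, with node degree standing in for "mass" and topological distance in the dual graph for distance (the inverse-square force law and the second-order neighbour cutoff are assumptions for illustration):

```python
from collections import deque

def k_order_neighbors(adj, src, kmax):
    """BFS: map each node within kmax hops of src to its topological distance."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        if dist[u] == kmax:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    dist.pop(src)
    return dist

def importance(adj, node, kmax=2, G=1.0):
    """Sum of 'gravitational' contributions G * m_i * m_j / r^2 over
    multi-order neighbours, with node degree playing the role of mass."""
    m = {u: len(vs) for u, vs in adj.items()}
    return sum(G * m[node] * m[v] / r ** 2
               for v, r in k_order_neighbors(adj, node, kmax).items())

# Illustrative dual graph: nodes are streets, edges join intersecting streets
adj = {"A": {"B", "C"}, "B": {"A", "C", "D"},
       "C": {"A", "B", "E"}, "D": {"B"}, "E": {"C"}}
ranked = sorted(adj, key=lambda u: importance(adj, u), reverse=True)
# the well-connected streets B and C rank highest
```

    Generalization would then proceed by pruning the lowest-ranked nodes while rechecking connectivity, which is the spirit of the selection step in the abstract.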

  9. Evaluation of a social cognitive theory-based yoga intervention to reduce anxiety.

    PubMed

    Mehta, Purvi; Sharma, Manoj

    Yoga is often viewed as a form of alternative and complementary medicine, as it strives to achieve an equilibrium between body and mind that aids healing. Studies have shown the beneficial role of yoga in anxiety reduction. The purpose of this study was to design and evaluate a 10-week social cognitive theory-based yoga intervention to reduce anxiety. The intervention utilized the social cognitive theory constructs of behavioral capability, expectations, and self-efficacy for yoga, and included asanas (postures), pranayama (breathing techniques), shava asana (relaxation), and dhyana (meditation). A one-between and one-within group quasi-experimental design was utilized for evaluation. Scales measuring expectations from yoga and self-efficacy for yoga, together with Spielberger's State-Trait Anxiety Inventory, were administered before and after the intervention. Repeated measures analyses of variance (ANOVA) were performed to compare pre-test and post-test scores in the two groups. Yoga as an approach shows promising results for anxiety reduction. PMID:23353562

  11. General Formalism of Decision Making Based on Theory of Open Quantum Systems

    NASA Astrophysics Data System (ADS)

    Asano, M.; Ohya, M.; Basieva, I.; Khrennikov, A.

    2013-01-01

    We present a general formalism of decision making based on the theory of open quantum systems. A person (decision maker), say Alice, is considered as a quantum-like system, i.e., a system whose information processing follows the laws of quantum information theory. To make a decision, Alice interacts with a huge mental bath. Depending on the context of decision making, this bath can include her social environment, mass media (TV, newspapers, the Internet), and memory. The dynamics of an ensemble of such Alices is described by the Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) equation. We speculate that in the course of evolution, biosystems (especially human beings) developed such "mental Hamiltonians" and GKSL operators that any solution of the corresponding GKSL equation stabilizes to a density operator that is diagonal in the basis of decision making. This limiting density operator describes a population in which all superpositions of possible decisions have already been resolved. In principle, this approach can be used to predict the distribution of possible decisions in human populations.
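
    The claimed stabilization to a diagonal density operator can be reproduced in the simplest GKSL setting, pure dephasing of a single qubit; a sketch with an illustrative Hamiltonian and rate (not the paper's "mental Hamiltonian"):

```python
import numpy as np

def gksl_step(rho, H, L, gamma, dt):
    """One explicit-Euler step of the GKSL master equation
    d(rho)/dt = -i[H, rho] + gamma * (L rho L+ - (1/2){L+L, rho})."""
    comm = H @ rho - rho @ H
    LdL = L.conj().T @ L
    diss = L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return rho + dt * (-1j * comm + gamma * diss)

sz = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
H = 0.5 * sz                                  # illustrative Hamiltonian
rho = 0.5 * np.ones((2, 2), dtype=complex)    # equal superposition of two "decisions"
for _ in range(2000):
    rho = gksl_step(rho, H, L=sz, gamma=0.5, dt=0.01)
# rho is now numerically diagonal: the coherences have been resolved,
# while the decision probabilities on the diagonal are preserved
```

    The off-diagonal coherence decays exponentially under the dephasing operator while the diagonal entries, which become the decision probabilities, are untouched.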

  12. A system model for ultrasonic NDT based on the Physical Theory of Diffraction (PTD).

    PubMed

    Darmon, M; Dorval, V; Kamta Djakou, A; Fradkin, L; Chatillon, S

    2016-01-01

    Simulation of ultrasonic Non Destructive Testing (NDT) is helpful for evaluating the performance of inspection techniques and requires modelling the waves scattered by defects. Two classical flaw-scattering models have typically been employed and evaluated for the inspection of planar defects: the Kirchhoff approximation (KA) for simulating reflection and the Geometrical Theory of Diffraction (GTD) for simulating diffraction. Combining them so as to retain the advantages of both, the Physical Theory of Diffraction (PTD), initially developed in electromagnetism, has recently been extended to elastodynamics. In this paper a PTD-based system model is proposed for simulating the ultrasonic response of crack-like defects. It is also extended to provide a good description of the regions surrounding critical rays, where the shear diffracted waves and head waves interfere. Both numerical and experimental validation of the PTD model is carried out in various practical NDT configurations, such as pulse echo and Time of Flight Diffraction (TOFD), involving both crack-tip and corner echoes. Numerical validation involves comparison of this model with KA and GTD as well as with the Finite-Element Method (FEM). PMID:26323548

  13. Where are family theories in family-based obesity treatment?: conceptualizing the study of families in pediatric weight management

    PubMed Central

    Skelton, JA; Buehler, C; Irby, MB; Grzywacz, JG

    2014-01-01

    Family-based approaches to pediatric obesity treatment are considered the ‘gold-standard,’ and are recommended for facilitating behavior change to improve child weight status and health. If family-based approaches are to be truly rooted in the family, clinicians and researchers must consider family process and function in designing effective interventions. To bring a better understanding of family complexities to family-based treatment, two relevant reviews were conducted and are presented: (1) a review of prominent and established theories of the family that may provide a more comprehensive and in-depth approach for addressing pediatric obesity; and (2) a systematic review of the literature to identify the use of prominent family theories in pediatric obesity research, which found little use of theories in intervention studies. Overlapping concepts across theories include: families are a system, with interdependence of units; the idea that families are goal-directed and seek balance; and the physical and social environment imposes demands on families. Family-focused theories provide valuable insight into the complexities of families. Increased use of these theories in both research and practice may identify key leverage points in family process and function to prevent the development of or more effectively treat obesity. The field of family studies provides an innovative approach to the difficult problem of pediatric obesity, building on the long-established approach of family-based treatment. PMID:22531090

  14. Semianalytical estimation of the four-wave mixing noise based on extreme value theory.

    PubMed

    Neokosmidis, Ioannis; Marakis, Stylianos; Varoutas, Dimitris

    2013-10-01

    Four-wave mixing (FWM) is one of the limiting factors for existing and future wavelength-division multiplexed (WDM) optical networks. A semianalytical method based on Monte Carlo simulation and extreme value theory is proposed and applied to study the influence of FWM noise on the performance of WDM systems. The statistical behavior of the FWM noise is investigated, and the bit-error rate (BER) is calculated for various combinations of the design parameters and for both single- and multiple-span WDM systems. The semianalytical method is also compared to the multicanonical Monte Carlo (MCMC) method, showing the same efficiency and accuracy while additionally providing closed-form approximations for the cumulative distribution functions of the photocurrents in the mark and space states and for the BER. PMID:24104223
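
    The Monte Carlo plus extreme-value idea can be sketched generically: simulate the noise, fit a generalized extreme value (GEV) distribution to block maxima, and extrapolate tail probabilities that brute-force counting cannot reach. A hedged sketch with a Gaussian stand-in for the FWM noise (the real photocurrent statistics differ):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Gaussian stand-in for the noisy photocurrent samples (illustrative only)
samples = rng.normal(0.0, 1.0, size=200_000)

block = 500
maxima = samples.reshape(-1, block).max(axis=1)   # one maximum per block
shape, loc, scale = genextreme.fit(maxima)        # fit the GEV to the maxima

# Extrapolated per-block exceedance probability far beyond the observed range
tail = genextreme.sf(6.0, shape, loc, scale)
```

    The fitted GEV gives a closed-form survival function, which is the kind of closed-form tail approximation the abstract credits the semianalytical method with.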

  15. Finite element analysis of CNTs based on nonlocal elasticity and Timoshenko beam theory including thermal effect

    NASA Astrophysics Data System (ADS)

    Pradhan, S. C.; Mandal, U.

    2013-09-01

    In the present paper, finite element computations are carried out to analyse the vibration, buckling and bending of carbon nanotubes. The analysis employs nonlocal Timoshenko beam theory and accounts for the thermal environment. Governing differential equations are derived for the structural response of carbon nanotubes, and the resulting linear algebraic equations are solved with a MATLAB code. A convergence study of the finite element formulation has been performed. Thermal effects on the response of carbon nanotubes have been investigated based on nonlocal and thermal elasticity. The influences of (i) geometrical properties of the carbon nanotubes, such as length, diameter and thickness, (ii) the nonlocal parameter and (iii) environmental temperature change on the structural response are studied.
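
    For the simpler, simply supported nonlocal Euler-Bernoulli beam (a close relative of the Timoshenko model used here), Eringen nonlocality reduces the natural frequencies in closed form, ω_n = k_n² √(EI/ρA) / √(1 + (e₀a)² k_n²) with k_n = nπ/L; a quick numeric check (values are illustrative, not from the paper):

```python
import math

def nonlocal_frequency(n, L, EI, rhoA, e0a):
    """Mode-n natural frequency (rad/s) of a simply supported nonlocal
    Euler-Bernoulli beam with Eringen nonlocal parameter e0a (m)."""
    k = n * math.pi / L
    return k ** 2 * math.sqrt(EI / rhoA) / math.sqrt(1.0 + (e0a * k) ** 2)

# Illustrative CNT-like values (not taken from the paper)
L, EI, rhoA = 10e-9, 1e-25, 1e-15
local = nonlocal_frequency(1, L, EI, rhoA, e0a=0.0)    # classical result
nl = nonlocal_frequency(1, L, EI, rhoA, e0a=1e-9)      # nonlocal result
# nl < local: nonlocality softens the structure and lowers the frequency
```

    The same softening trend is what the nonlocal parameter study in the abstract probes, here visible directly in the closed form.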

  16. Mechanisms of ultrasonic modulation of multiply scattered incoherent light based on diffusion theory

    NASA Astrophysics Data System (ADS)

    Zhu, Li-Li; Li, Hui

    2015-01-01

    An analytic equation for the intensity of ultrasound-modulated scattered light is derived, based on diffusion theory and previous explanations of the intensity modulation mechanism. Furthermore, an experiment on ultrasonic modulation of incoherent light in a scattering medium is presented. The analytical model agrees well with the experimental results, which confirms the validity of the proposed intensity modulation mechanism. The model supplements existing research on the ultrasonic modulation mechanism of scattered light. Project supported by the National Natural Science Foundation of China (Grant No. 61178089), the Key Program of Science and Technology of Fujian Province, China (Grant No. 2011Y0019), and the Educational Department of Fujian Province, China (Grant No. JA13074).

  17. Grey Situation Group Decision-Making Method Based on Prospect Theory

    PubMed Central

    Zhang, Na; Fang, Zhigeng; Liu, Xiaqing

    2014-01-01

    This paper puts forward a grey situation group decision-making method based on prospect theory, addressing grey situation group decision-making problems in which decisions are made by multiple experts who have risk preferences. The method takes the distances to the positive and negative ideal situations as reference points, defines positive and negative prospect value functions, and introduces the experts' risk preferences into grey situation decision making so that the final decision is more in line with their psychological behavior. Based on the TOPSIS method, this paper determines the weight of each decision expert, sets up a comprehensive prospect value matrix from the experts' evaluations, and finally determines the optimal situation. The effectiveness and feasibility of the method are verified by means of a specific example. PMID:25197706
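
    The positive and negative prospect value functions follow the standard Kahneman-Tversky form; a minimal sketch using the commonly cited parameters α = β = 0.88, λ = 2.25 (assumed here, not taken from the paper):

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky prospect value of an outcome x measured from a
    reference point: concave over gains, steeper and convex over losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

# Distances to the positive/negative ideal situations serve as reference
# points; losses loom larger than equal-sized gains:
gain = prospect_value(1.0)    #  1.0
loss = prospect_value(-1.0)   # -2.25
```

    Because λ > 1, a situation that falls short of the negative ideal is penalized more heavily than an equal gain is rewarded, which is how the experts' risk preferences enter the comprehensive prospect value matrix.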

  18. A High Precision Feature Based on LBP and Gabor Theory for Face Recognition

    PubMed Central

    Xia, Wei; Yin, Shouyi; Ouyang, Peng

    2013-01-01

    How to describe an image accurately, with the most useful information and the least redundant information, is a basic problem in the recognition field. In this paper, a novel, high-precision feature called BG2D2LRP is proposed, together with a corresponding face recognition system. The feature captures both static texture differences and dynamic contour trends. It is based on Gabor and LBP theory and is built through various transformations such as blocking, second derivatives, direct orientation, layering and, finally, fusion in a particular way. Seven well-known face databases, including FRGC, AR and FERET, are used to evaluate the accuracy and robustness of the proposed feature. A maximum improvement of 29.41% is achieved compared with other methods, and the ROC curve is satisfactory. These experimental results demonstrate the feasibility and superiority of the new feature and method. PMID:23552103
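
    The LBP half of such a feature encodes static texture: each pixel receives an 8-bit code from comparing its 3×3 neighbourhood to the centre. A minimal numpy sketch of basic LBP (the paper's block/derivative/fusion steps are not reproduced):

```python
import numpy as np

def lbp(image):
    """Basic 3x3 local binary pattern: one 8-bit code per interior pixel."""
    img = np.asarray(image, dtype=float)
    R, C = img.shape
    c = img[1:-1, 1:-1]                      # centre pixels
    # clockwise neighbour offsets, starting at the top-left corner
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offs):
        nb = img[1 + dy:R - 1 + dy, 1 + dx:C - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << np.uint8(bit)
    return code

tile = np.array([[9, 9, 9],
                 [0, 5, 9],
                 [0, 0, 0]])
codes = lbp(tile)    # bright upper arc -> bits 0-3 set -> code 15
```

    Histograms of these codes over image blocks form the texture descriptor that is then combined with Gabor responses in approaches like the one above.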

  19. A review of return-stroke models based on transmission line theory

    NASA Astrophysics Data System (ADS)

    De Conti, Alberto; Silveira, Fernando H.; Visacro, Silvério; Cardoso, Thiago C. M.

    2015-12-01

    This paper presents a review of lightning return-stroke models based on transmission line theory. The reviewed models are classified in three different categories, namely discharge-type, lumped-excitation, and parameter-estimation models. An attempt is made to address the difficulties that some models experience in reproducing directly or indirectly observable features of lightning, such as current characteristics and remote electromagnetic fields. It is argued that most of these difficulties are related to a poor discretization of the lightning channel, to inconsistencies in the calculation of per-unit-length channel parameters, to uncertainties in the representation of the upper end of the channel, and to assuming an ideal switch to connect the channel to ground in the transition from leader to return stroke. Applications of transmission line return-stroke models are also outlined.
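
    Two of the classical lumped-excitation engineering models in this literature are the transmission-line (TL) model and its exponentially attenuated variant (MTLE): the channel-base current travels up the channel at speed v, optionally decaying with height. A sketch (parameter values are typical textbook choices, not from the paper):

```python
import math

def channel_current(z, t, i_base, v=1.5e8, lam=2000.0, model="MTLE"):
    """Return-stroke current at height z (m) and time t (s).
    TL:   i(z, t) = i(0, t - z/v)
    MTLE: i(z, t) = exp(-z/lam) * i(0, t - z/v)"""
    if t < z / v:
        return 0.0                       # the front has not yet reached height z
    i = i_base(t - z / v)
    return i * math.exp(-z / lam) if model == "MTLE" else i

# Illustrative double-exponential channel-base current (30 kA class)
i0 = lambda t: 30e3 * (math.exp(-t / 50e-6) - math.exp(-t / 5e-6))
```

    The instantaneous switch-on at t = z/v is exactly the idealized leader-to-return-stroke transition that the review identifies as a source of difficulty in these models.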

  20. Detection and control of combustion instability based on the concept of dynamical system theory

    NASA Astrophysics Data System (ADS)

    Gotoda, Hiroshi; Shinoda, Yuta; Kobayashi, Masaki; Okuno, Yuta; Tachibana, Shigeru

    2014-02-01

    We propose an online method of detecting combustion instability based on the concept of dynamical system theory, including the characterization of the dynamic behavior of combustion instability. As an important case study relevant to combustion instability encountered in fundamental and practical combustion systems, we deal with the combustion dynamics close to lean blowout (LBO) in a premixed gas-turbine model combustor. The relatively regular pressure fluctuations generated by thermoacoustic oscillations transit to low-dimensional intermittent chaos owing to the intermittent appearance of bursts as the equivalence ratio decreases. The translation error, which quantifies the degree of parallelism of trajectories in the phase space, can be used as a control variable to prevent LBO.
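
    The translation error mentioned here measures how parallel neighbouring phase-space trajectories move: small for deterministic dynamics, large for stochastic fluctuations. A sketch in the spirit of the Wayland test, with illustrative embedding parameters (not the paper's settings):

```python
import numpy as np
from scipy.spatial import cKDTree

def translation_error(x, dim=3, tau=5, n_neighbors=4):
    """Wayland-style translation error of a scalar series after delay
    embedding: small when nearby trajectories translate in parallel
    (determinism), large for stochastic signals."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau - 1
    emb = np.column_stack([x[i * tau: i * tau + n + 1] for i in range(dim)])
    tree = cKDTree(emb[:-1])                 # successors exist for these points
    errs = []
    for idx in range(0, n, 50):              # subsample reference points
        _, nbrs = tree.query(emb[idx], k=n_neighbors + 1)
        v = emb[nbrs + 1] - emb[nbrs]        # one-step translation vectors
        vbar = v.mean(axis=0)
        errs.append(np.mean(np.sum((v - vbar) ** 2, axis=1)) / np.sum(vbar ** 2))
    return float(np.mean(errs))

rng = np.random.default_rng(1)
t = np.arange(4000) * 0.05
e_regular = translation_error(np.sin(t))               # near-deterministic
e_noise = translation_error(rng.standard_normal(4000))
# e_regular << e_noise
```

    Monitoring such a statistic online over sliding windows of the pressure signal is the kind of control variable the abstract proposes for anticipating lean blowout.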