Science.gov

Sample records for queuing theory based

  1. Queuing Theory and Reference Transactions.

    ERIC Educational Resources Information Center

    Terbille, Charles

    1995-01-01

    Examines the implications of applying the queuing theory to three different reference situations: (1) random patron arrivals; (2) random durations of transactions; and (3) use of two librarians. Tables and figures represent results from spreadsheet calculations of queues for each reference situation. (JMV)

  2. An application of queuing theory to waterfowl migration

    USGS Publications Warehouse

    Sojda, Richard S.; Cornely, John E.; Fredrickson, Leigh H.

    2002-01-01

    There has always been great interest in the migration of waterfowl and other birds. We have applied queuing theory to modelling waterfowl migration, beginning with a prototype system for the Rocky Mountain Population of trumpeter swans (Cygnus buccinator) in Western North America. The queuing model can be classified as a D/BB/28 system, and we describe the input sources, service mechanism, and network configuration of queues and servers. The intrinsic nature of queuing theory is to represent the spatial and temporal characteristics of entities and how they move, are placed in queues, and are serviced. The service mechanism in our system is an algorithm representing how swans move through the flyway based on seasonal life cycle events. The system uses an observed number of swans at each of 27 areas for a breeding season as input and simulates their distribution through four seasonal steps. The result is a simulated distribution of birds for the subsequent year's breeding season. The model was built as a multiagent system with one agent handling movement algorithms, with one facilitating user interface, and with one to seven agents representing specific geographic areas for which swan management interventions can be implemented. The many parallels in queuing model servers and service mechanisms with waterfowl management areas and annual life cycle events made the transfer of the theory to practical application straightforward.

  3. Queuing theory models for computer networks

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

A set of simple queuing theory models that can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them despite the lack of fine detail about the network traffic rates, traffic patterns, and the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. The Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
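As a concrete illustration of the kind of coarse average-response model described, the mean response time of one channel can be computed from the standard M/M/1 formula in a few lines. A minimal sketch (the M/M/1 assumption and the rates below are illustrative, not the paper's Ames network data):

```python
def mm1_response_time(arrival_rate, service_rate):
    """Average response time of an M/M/1 queue: R = 1 / (mu - lambda).

    Valid only when arrival_rate < service_rate (a stable queue).
    """
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# Illustrative only: a channel serving 100 messages/s offered 60 messages/s.
print(mm1_response_time(60.0, 100.0))  # 0.025 s average response
```

The same formula, entered once per channel, is exactly the sort of calculation a spreadsheet model of this kind performs cell by cell.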

  4. Queuing Theory: An Alternative Approach to Educational Research.

    ERIC Educational Resources Information Center

    Huyvaert, Sarah H.

    Queuing theory is examined in this paper in order to determine if the theory could be applied in educational settings. It is defined as a form of operations research that uses mathematical formulas and/or computer simulation to study wait and congestion in a system and, through the study of these visible phenomena, to discover malfunctions within…

  5. Application of queuing theory in production-inventory optimization

    NASA Astrophysics Data System (ADS)

    Rashid, Reza; Hoseini, Seyed Farzad; Gholamian, M. R.; Feizabadi, Mohammad

    2015-07-01

    This paper presents a mathematical model for an inventory control system in which customers' demands and suppliers' service time are considered as stochastic parameters. The proposed problem is solved through queuing theory for a single item. In this case, transitional probabilities are calculated in steady state. Afterward, the model is extended to the case of multi-item inventory systems. Then, to deal with the complexity of this problem, a new heuristic algorithm is developed. Finally, the presented bi-level inventory-queuing model is implemented as a case study in Electroestil Company.
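For intuition about the steady-state transition probabilities mentioned here, the simplest single-item case reduces to a plain M/M/1 queue, whose steady state is geometric. A minimal sketch (the paper's actual model is richer; the rates are illustrative):

```python
def mm1_steady_state(arrival_rate, service_rate, n_max):
    """Steady-state probabilities p_n = (1 - rho) * rho**n of an M/M/1 queue,
    truncated at n_max states for display."""
    rho = arrival_rate / service_rate
    assert rho < 1, "steady state exists only for rho < 1"
    return [(1 - rho) * rho**n for n in range(n_max + 1)]

# Illustrative: demand rate 2/day against supply service rate 5/day.
probs = mm1_steady_state(2.0, 5.0, 10)
# p_0 = 0.6, and the probabilities decay geometrically with ratio rho = 0.4.
```

Expected inventory-queue measures then follow by summing n * p_n over the distribution.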

  6. Application of queuing theory in inventory systems with substitution flexibility

    NASA Astrophysics Data System (ADS)

    Seyedhoseini, S. M.; Rashid, Reza; Kamalpour, Iman; Zangeneh, Erfan

    2015-01-01

Considering the competition in today's business environment, tactical planning of a supply chain has become more complex than before. In many multi-product inventory systems, substitution flexibility can improve profits. This paper aims to develop a comprehensive substitution inventory model: an inventory system with two substitutable products and negligible lead time is considered, and the effects of simultaneous ordering are examined. The demands of customers for both products are treated as stochastic parameters, and queuing theory is used to construct a mathematical model. The model was coded in C++ and analyzed on a real example, and the results indicate the efficiency of the proposed model.

  7. Using Queuing Theory and Simulation Model to Optimize Hospital Pharmacy Performance

    PubMed Central

    Bahadori, Mohammadkarim; Mohammadnejhad, Seyed Mohsen; Ravangard, Ramin; Teymourzadeh, Ehsan

    2014-01-01

Background: Hospital pharmacy is responsible for controlling and monitoring the medication use process and for ensuring timely access to safe, effective and economical drugs and medicines for patients and hospital staff. Objectives: This study aimed to optimize the management of the studied outpatient pharmacy by applying suitable queuing theory and simulation techniques. Patients and Methods: A descriptive-analytical study was conducted in a military hospital in Tehran, Iran, in 2013. A sample of 220 patients referred to the outpatient pharmacy of the hospital in two shifts, morning and evening, was selected to collect the data needed to determine the arrival rate, service rate, and the other inputs required to calculate patient flow and queuing network performance variables. After initial analysis of the collected data using SPSS 18, the pharmacy queuing network performance indicators were calculated for both shifts. Then, based on the collected data, the queuing system in its current state was modeled and simulated for both shifts using ARENA 12, and 4 scenarios were explored. Results: The queue characteristics of the studied pharmacy were very undesirable in both morning and evening shifts. The average numbers of patients in the pharmacy were 19.21 in the morning and 14.66 in the evening. The average times spent in the system by clients were 39 minutes in the morning and 35 minutes in the evening. System utilization was 25% in the morning and 21% in the evening. The simulation results showed that reducing the staff at the prescription-receiving stage in the morning from 2 to 1 did not change the queue performance indicators, whereas adding one staff member to the prescription-filling stage decreased the average queue length by 10 persons and the average waiting time by 18 minutes and 14 seconds. In the evening, reducing the staff at the prescription-delivery stage from 2 to 1 changed the queue performance indicators very little, whereas adding one staff member to the prescription-filling stage decreased the average queue length by 5 persons and the average waiting time by 8 minutes and 44 seconds. Conclusions: The patients' waiting times and the number of patients waiting for service in both shifts could be reduced by using multitasking staff and reallocating them to the time-consuming prescription-filling stage, guided by queuing theory and simulation techniques. PMID:24829791
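The queue indicators reported here (mean number in system, mean time in system, utilization) are the standard M/M/c performance measures, which can also be obtained analytically. A minimal sketch using the textbook Erlang B/C formulas (generic formulas with illustrative rates, not the study's ARENA model):

```python
def erlang_b(load, c):
    """Erlang B blocking probability via the standard one-pass recursion."""
    b = 1.0
    for k in range(1, c + 1):
        b = load * b / (k + load * b)
    return b

def mmc_metrics(arrival_rate, service_rate, c):
    """Mean wait in queue Wq and mean number in system L for a stable M/M/c queue."""
    load = arrival_rate / service_rate       # offered load a = lambda / mu
    rho = load / c                           # per-server utilization
    assert rho < 1, "unstable system"
    b = erlang_b(load, c)
    p_wait = b / (1 - rho + rho * b)         # Erlang C from Erlang B
    wq = p_wait / (c * service_rate - arrival_rate)
    l = arrival_rate * wq + load             # Little's law: Lq plus customers in service
    return wq, l

# Illustrative: 30 patients/hour, each server fills 20 prescriptions/hour, 2 servers.
wq, l = mmc_metrics(30.0, 20.0, 2)
```

Re-running the function with a different `c` is the analytic counterpart of the staffing scenarios simulated in the study.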

  8. An Integrated Model of Patient and Staff Satisfaction Using Queuing Theory

    PubMed Central

    Mousavi, Ali; Clarkson, P. John; Young, Terry

    2015-01-01

    This paper investigates the connection between patient satisfaction, waiting time, staff satisfaction, and service time. It uses a variety of models to enable improvement against experiential and operational health service goals. Patient satisfaction levels are estimated using a model based on waiting (waiting times). Staff satisfaction levels are estimated using a model based on the time spent with patients (service time). An integrated model of patient and staff satisfaction, the effective satisfaction level model, is then proposed (using queuing theory). This links patient satisfaction, waiting time, staff satisfaction, and service time, connecting two important concepts, namely, experience and efficiency in care delivery and leading to a more holistic approach in designing and managing health services. The proposed model will enable healthcare systems analysts to objectively and directly relate elements of service quality to capacity planning. Moreover, as an instrument used jointly by healthcare commissioners and providers, it affords the prospect of better resource allocation. PMID:27170899

  9. Allocation of computer ports within a terminal switching network: an application of queuing theory to gandalf port contenders

    SciTech Connect

    Vahle, M.O.

    1982-03-01

    Queuing theory is applied to the problem of assigning computer ports within a terminal switching network to maximize the likelihood of instant connect. A brief background of the network is included to focus on the statement of the problem.

  10. Application of queuing theory to patient satisfaction at a tertiary hospital in Nigeria

    PubMed Central

    Ameh, Nkeiruka; Sabo, B.; Oyefabi, M. O.

    2013-01-01

Background: Queuing theory is the mathematical approach to the analysis of waiting lines in any setting where the arrival rate of subjects is faster than the system can handle. It is applicable to healthcare settings where the systems have excess capacity to accommodate random variations. Materials and Methods: A cross-sectional descriptive survey was done. Questionnaires were administered to patients who attended the general outpatient department. Observations were also made on the queuing model and the service discipline at the clinic. The questions were meant to obtain demographic characteristics, the time spent on the queue by patients before being seen by a doctor, the time spent with the doctor, patients' views about the time spent on the queue, and suggestions on how to reduce it. A total of 210 patients were surveyed. Results: The majority of the patients (164, 78.1%) spent 2 h or less on the queue before being seen by a doctor and less than 1 h with the doctor. Most patients (144, 68.5%) were satisfied with the time they spent on the queue before being seen by a doctor. Suggestions proffered by the patients to decrease the time spent on the queue before seeing a doctor included: that more doctors be employed (46, 21.9%), that doctors should come to work on time (25, 11.9%), and that first-come, first-served be observed strictly (32, 15.2%); others suggested that the records staff should desist from collecting bribes from patients in order to place their cards before others. The queuing method employed at the clinic is the multiple single-channel type and the service discipline is priority service. The patients who spent less time on the queue (<1 h) before seeing the doctor were more satisfied than those who spent more time (P < 0.05). Conclusion: The study revealed that the majority of the patients were satisfied with the practice at the general outpatient department. However, there is a need to employ measures that respond to the suggestions given by the patients, who are the beneficiaries of the hospital services. PMID:23661902

  11. Spreadsheet Analysis Of Queuing In A Computer Network

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1992-01-01

Method of analyzing responses of computer network based on simple queuing-theory mathematical models via spreadsheet program. Effects of variations in traffic, capacities of channels, and message protocols assessed.

  12. Effects of diversity and procrastination in priority queuing theory: The different power law regimes

    NASA Astrophysics Data System (ADS)

    Saichev, A.; Sornette, D.

    2010-01-01

Empirical analyses show that after the update of a browser, or the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law ~1/t^α with 0 < α < 1. We present a simple model, based on priority queuing theory, of a target task which has the lowest priority compared to all other tasks that flow on the computer of an individual. We identify a time deficit control parameter β and a bifurcation to a regime where there is a nonzero probability for the target task to never be completed. The distribution of waiting time T until the completion of the target task has the power law tail ~1/t^{1/2}, resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into 1/t^α, with α ∈ (0.5, ∞), including the well-known case 1/t. We also study the effect of procrastination, defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This regime provides an explanation for even slower apparent decay and longer persistence.
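The 1/t^{1/2} tail invoked here is a first-passage result, and its mechanism can be illustrated (not derived) with a symmetric random walk, whose first-passage time to the origin has exactly that tail. A self-contained sketch, not the paper's model:

```python
import random

def first_passage_time(rng, max_steps=10**5):
    """Steps until a symmetric +/-1 random walk started at 1 first hits 0
    (capped at max_steps, since the true mean is infinite)."""
    x, t = 1, 0
    while x > 0 and t < max_steps:
        x += rng.choice((1, -1))
        t += 1
    return t

rng = random.Random(42)
times = sorted(first_passage_time(rng) for _ in range(500))
# Heavy tail in action: the median passage time is tiny, while the largest
# observed values are orders of magnitude larger.
```

In the paper's terms, the walk plays the role of the backlog of higher-priority tasks; the target task completes when the backlog first empties.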

  13. Effects of diversity and procrastination in priority queuing theory: the different power law regimes.

    PubMed

    Saichev, A; Sornette, D

    2010-01-01

Empirical analyses show that after the update of a browser, or the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law approximately 1/t(alpha) with 0 < alpha < 1. We present a simple model, based on priority queuing theory, of a target task which has the lowest priority compared to all other tasks that flow on the computer of an individual. We identify a "time deficit" control parameter beta and a bifurcation to a regime where there is a nonzero probability for the target task to never be completed. The distribution of waiting time T until the completion of the target task has the power law tail approximately 1/t(1/2), resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into 1/t(alpha), with alpha is an element of (0.5,infinity), including the well-known case 1/t. We also study the effect of "procrastination," defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This regime provides an explanation for even slower apparent decay and longer persistence. PMID:20365433

  14. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    SciTech Connect

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S.

    2015-03-10

Cloud computing is a promising technology for managing and improving the utilization of computing center resources to deliver various computing and IT services. For energy saving, there is no need to operate many servers unnecessarily under light loads, and they are switched off. On the other hand, some servers should be switched on in heavy load cases to prevent very long delays. Thus, waiting times and system operating cost can be kept at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is the significant server setup cost and activation time. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows estimation of a number of performance measures.
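The hysteresis idea is a two-threshold rule: a server is added only when the queue exceeds an upper threshold and removed only below a lower one, so brief load spikes do not trigger costly setups. A toy controller sketch (the thresholds, delay, and class names are illustrative inventions; the paper's model is an analytic queuing system, not this simulation):

```python
class HysteresisScaler:
    """Toy two-threshold scaling rule with a fixed server setup delay."""

    def __init__(self, up, down, setup_delay, min_servers=1, max_servers=8):
        assert down < up
        self.up, self.down = up, down
        self.setup_delay = setup_delay
        self.min_servers, self.max_servers = min_servers, max_servers
        self.servers = min_servers
        self.pending_until = None   # time a server being set up becomes active

    def step(self, now, queue_length):
        # A server ordered earlier becomes active once its setup delay elapses.
        if self.pending_until is not None and now >= self.pending_until:
            self.servers = min(self.servers + 1, self.max_servers)
            self.pending_until = None
        if queue_length > self.up and self.pending_until is None and self.servers < self.max_servers:
            self.pending_until = now + self.setup_delay   # activation is slow...
        elif queue_length < self.down and self.servers > self.min_servers:
            self.servers -= 1                             # ...switch-off is instant
        return self.servers
```

Feeding the controller a load trace shows the intended behavior: a spike shorter than the setup delay never brings a new server online, while sustained overload does.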

  15. Improving queuing service at McDonald's

    NASA Astrophysics Data System (ADS)

    Koh, Hock Lye; Teh, Su Yean; Wong, Chin Keat; Lim, Hooi Kie; Migin, Melissa W.

    2014-07-01

Fast food restaurants are popular among price-sensitive youths and working adults who value the conducive environment and convenient services. McDonald's chains of restaurants promote their sales during lunch hours by offering package meals which are perceived to be inexpensive. These promotional lunch meals attract a good response, resulting in occasional long queues and inconvenient waiting times. A study is conducted to monitor the distribution of waiting time, queue length, and customer arrival and departure patterns at a McDonald's restaurant located in Kuala Lumpur. A customer survey is conducted to gauge customers' satisfaction regarding waiting time and queue length. An Android app named Que is developed to perform onsite queuing analysis and report key performance indices. The queuing theory in Que is based upon the Poisson distribution. In this paper, Que is utilized to perform queuing analysis at this McDonald's restaurant with the aim of improving customer service, in particular reducing queuing time and shortening queue length. Some results will be presented.

  16. Using Queuing Theory and Simulation Modelling to Reduce Waiting Times in An Iranian Emergency Department

    PubMed Central

    Haghighinejad, Hourvash Akbari; Kharazmi, Erfan; Hatam, Nahid; Yousefi, Sedigheh; Hesami, Seyed Ali; Danaei, Mina; Askarian, Mehrdad

    2016-01-01

Background: Hospital emergencies have an essential role in health care systems. In the last decade, developed countries have paid great attention to the overcrowding crisis in emergency departments. Simulation analysis of complex models whose conditions change over time is much more effective than analytical solutions, and the emergency department (ED) is one of the most complex models to analyse. This study aimed to determine the number of waiting patients and the waiting time for emergency department services in an Iranian hospital ED, and to propose scenarios to reduce the queue and waiting time. Methods: This is a cross-sectional study in which simulation software (Arena, version 14) was used. The input information was extracted from the hospital database as well as through sampling. The objective was to evaluate the response variables of waiting time, number waiting and utilization of each server, and to test three scenarios to improve them. Results: Running the models for 30 days revealed that a total of 4088 patients left the ED after being served and 1238 patients were waiting in the queue for admission to the ED bed area at the end of the run (in practice, these patients received services beyond the defined capacity). In the first scenario, the number of beds had to be increased from 81 to 179 for the number waiting at the "bed area" server to become almost zero. The second scenario, which limited hospitalization time in the ED bed area to the third quartile of the service time distribution, decreased the number waiting to 586 patients. Conclusion: Doubling the bed capacity in the emergency department, with a corresponding increase in other resources and capacity, can solve the problem. This includes bed capacity for both critically ill and less critically ill patients. Classification of ED internal sections based on severity of illness instead of medical specialty is another solution. PMID:26793727
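The what-if analysis described (add servers, re-run, compare waits) can be sketched without commercial software using a small event-driven simulation. A minimal M/M/c sketch with made-up rates (the study's Arena model is far more detailed, with multiple stations and empirical distributions):

```python
import heapq
import random

def simulate_mmc(arrival_rate, service_rate, servers, n_customers, seed=1):
    """Event-driven M/M/c FIFO simulation; returns the mean wait in queue."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_customers):
        t += rng.expovariate(arrival_rate)   # exponential interarrival times
        arrivals.append(t)
    free_at = [0.0] * servers                # time each server next becomes free
    heapq.heapify(free_at)
    total_wait = 0.0
    for a in arrivals:                       # FIFO: serve in arrival order
        earliest = heapq.heappop(free_at)    # soonest-free server
        start = max(a, earliest)
        total_wait += start - a
        heapq.heappush(free_at, start + rng.expovariate(service_rate))
    return total_wait / n_customers

# Illustrative scenario comparison: same load, one extra server.
w2 = simulate_mmc(1.5, 1.0, servers=2, n_customers=20000)
w3 = simulate_mmc(1.5, 1.0, servers=3, n_customers=20000)
```

Comparing `w2` and `w3` reproduces in miniature the scenario-testing workflow of the study: the extra server cuts the mean queue wait sharply at this utilization.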

  17. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    PubMed Central

    He, Xinhua

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  18. Design and Implementation of High-Speed Input-Queued Switches Based on a Fair Scheduling Algorithm

    NASA Astrophysics Data System (ADS)

    Hu, Qingsheng; Zhao, Hua-An

To increase both the capacity and the processing speed of input-queued (IQ) switches, we proposed a fair scalable scheduling architecture (FSSA). By employing FSSA, comprised of several cascaded sub-schedulers, large-scale high-performance switches or routers can be realized without the capacity limitation of a monolithic device. In this paper, we present a fair scheduling algorithm named FSSA_DI based on an improved FSSA in which a distributed iteration scheme is employed; the scheduler performance can be improved and the processing time reduced as well. Simulation results show that FSSA_DI achieves better performance on average delay and throughput under heavy loads compared to other existing algorithms. Moreover, a practical 64 × 64 FSSA using the FSSA_DI algorithm is implemented on four Xilinx Virtex-4 FPGAs. Measurement results show that the data rate of our solution can reach 800 Mbps, and the tradeoff between performance and hardware complexity has been addressed effectively.

  19. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

    PubMed

    He, Xinhua; Hu, Wenfa

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  20. Queuing Up

    SciTech Connect

    Miller, Nicholas; Zavadil, Robert; Ellis, Abraham; Muljadi, Eduard; Camm, Ernst; Kirby, Brendan J

    2007-01-01

    The knowledge base of the electric power system engineering community continues to grow with installed capacity of wind generation in North America. While this process has certainly occurred at other times in the industry with other technologies, the relatively explosive growth, the compressed time frames from project conception to commissioning, and the unconventional characteristics of wind generation make this period in the industry somewhat unique. Large wind generation facilities are necessarily evolving to look more and more like conventional generating plants in terms of their ability to interact with the transmission network in a way that does not compromise performance or system reliability. Such an evolution has only been possible through the cumulative contributions of an ever-growing number of power system engineers who have delved into the unique technologies and technical challenges presented by wind generation. The industry is still only part of the way up the learning curve, however. Numerous technical challenges remain, and as has been found, each new wind generation facility has the potential to generate some new questions. With the IEEE PES expanding its presence and activities in this increasingly significant commercial arena, the prospects for staying "ahead of the curve" are brightened.

  1. Human Factors of Queuing: A Library Circulation Model.

    ERIC Educational Resources Information Center

    Mansfield, Jerry W.

    1981-01-01

    Classical queuing theories and their accompanying service facilities totally disregard the human factors in the name of efficiency. As library managers we need to be more responsive to human needs in the design of service points and make every effort to minimize queuing and queue frustration. Five references are listed. (Author/RAA)

  2. A queuing model for road traffic simulation

    SciTech Connect

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-03-10

We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
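In state-dependent road models of this family, a section holds at most `c` vehicles and the service rate in state n is n times a speed that falls as the section fills, giving a birth-death chain whose steady state is computable in closed form. A minimal sketch (the Greenshields-style linear speed function and all numbers are illustrative assumptions, not the article's calibrated model):

```python
def road_section_probs(arrival_rate, capacity, speed):
    """Steady-state distribution of a birth-death queue with state-dependent
    service rates mu_n = n * speed(n), truncated at `capacity` (arrivals
    finding the section full are blocked)."""
    weights = [1.0]
    for n in range(1, capacity + 1):
        weights.append(weights[-1] * arrival_rate / (n * speed(n)))
    total = sum(weights)
    return [w / total for w in weights]

# Illustrative linear speed-density relation: speed drops from free flow
# toward (almost) zero as the section fills up.
probs = road_section_probs(
    arrival_rate=2.0,
    capacity=20,
    speed=lambda n: max(1.0 * (1 - (n - 1) / 20), 0.05),
)
blocking_prob = probs[-1]  # chance an arriving vehicle finds the section full
```

Chaining such sections, with each section's blocking acting as the downstream supply constraint of its neighbor, mirrors the concatenation idea described in the abstract.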

  3. Development of Markov Chain-Based Queuing Model and Wireless Infrastructure for EV to Smart Meter Communication in V2G

    NASA Astrophysics Data System (ADS)

    Santoshkumar; Udaykumar, R. Y.

    2015-04-01

Electric vehicles (EVs) can be connected to the grid for power transaction. Vehicle-to-grid (V2G) supports the grid requirements and helps in maintaining load demands. The grid control center (GCC), the aggregator and the EV are the three key entities in V2G communication. The GCC sends information about power requirements to the aggregator, which in turn forwards it to the EVs. Based on this information, interested EV owners participate in power transactions with the grid. The aggregator facilitates the EVs by providing parking and charging slots. In this paper, a queuing model for EVs connected to the grid and a wireless infrastructure for EV to Smart Meter communication are proposed. The queuing model is developed and simulated. The path loss models for WiMAX are analyzed and compared. Also, the physical layer of the WiMAX protocol is modeled and simulated for EV to Smart Meter communication in V2G.

  4. Capacity utilization study for aviation security cargo inspection queuing system

    NASA Astrophysics Data System (ADS)

    Allgood, Glenn O.; Olama, Mohammed M.; Lake, Joe E.; Brumback, Daryl

    2010-04-01

    In this paper, we conduct performance evaluation study for an aviation security cargo inspection queuing system for material flow and accountability. The queuing model employed in our study is based on discrete-event simulation and processes various types of cargo simultaneously. Onsite measurements are collected in an airport facility to validate the queuing model. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered such as system capacity, residual capacity, throughput, capacity utilization, subscribed capacity utilization, resources capacity utilization, subscribed resources capacity utilization, and number of cargo pieces (or pallets) in the different queues. These metrics are performance indicators of the system's ability to service current needs and response capacity to additional requests. We studied and analyzed different scenarios by changing various model parameters such as number of pieces per pallet, number of TSA inspectors and ATS personnel, number of forklifts, number of explosives trace detection (ETD) and explosives detection system (EDS) inspection machines, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures should reduce the overall cost and shipping delays associated with new inspection requirements.

  5. Capacity Utilization Study for Aviation Security Cargo Inspection Queuing System

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E; Brumback, Daryl L

    2010-01-01

In this paper, we conduct performance evaluation study for an aviation security cargo inspection queuing system for material flow and accountability. The queuing model employed in our study is based on discrete-event simulation and processes various types of cargo simultaneously. Onsite measurements are collected in an airport facility to validate the queuing model. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered such as system capacity, residual capacity, throughput, capacity utilization, subscribed capacity utilization, resources capacity utilization, subscribed resources capacity utilization, and number of cargo pieces (or pallets) in the different queues. These metrics are performance indicators of the system's ability to service current needs and response capacity to additional requests. We studied and analyzed different scenarios by changing various model parameters such as number of pieces per pallet, number of TSA inspectors and ATS personnel, number of forklifts, number of explosives trace detection (ETD) and explosives detection system (EDS) inspection machines, inspection modality distribution, alarm rate, and cargo closeout time. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures should reduce the overall cost and shipping delays associated with new inspection requirements.

  6. Network Queuing System, Version 2.0

    NASA Technical Reports Server (NTRS)

    Walter, Howard; Bridges, Mike; Carver, Terrie; Kingsbury, Brent

    1993-01-01

    Network Queuing System (NQS) computer program is versatile batch- and device-queuing facility for single UNIX computer or group of computers in network. User invokes NQS collection of user-space programs to move batch and device jobs freely among different computers in network. Provides facilities for remote queuing, request routing, remote status, queue-status controls, batch-request resource quota limits, and remote output return. Revision of NQS provides for creation, deletion, addition, and setting of complexes aiding in limiting number of requests handled at one time. Also has improved device-oriented queues along with some revision of displays. Written in C language.

  7. Modeling and simulation of M/M/c queuing pharmacy system with adjustable parameters

    NASA Astrophysics Data System (ADS)

    Rashida, A. R.; Fadzli, Mohammad; Ibrahim, Safwati; Goh, Siti Rohana

    2016-02-01

    This paper studies discrete event simulation (DES) as a computer-based modelling approach that imitates a real pharmacy unit. M/M/c queuing theory is used to model and analyse the characteristics of the queuing system at the pharmacy unit of Hospital Tuanku Fauziah, Kangar in Perlis, Malaysia. The input of this model is based on statistical data collected over 20 working days in June 2014. Currently, patient waiting time at the pharmacy unit is more than 15 minutes. The actual operation of the pharmacy unit is a mixed queuing system following an M/M/2 model, with the pharmacists as the servers. The DES approach and ProModel simulation software are used to simulate the queuing model and to propose improvements to the queuing system at this pharmacy. Waiting time for each server is analysed; Counters 3 and 4 have the highest waiting times, 16.98 and 16.73 minutes respectively. Three scenarios (M/M/3, M/M/4 and M/M/5) are simulated, and the waiting times of the actual and experimental queuing models are compared. The simulation results show that adding a server (pharmacist) reduces patient waiting time substantially: average patient waiting time falls by almost 50% when one pharmacist is added to the counters. However, it is not necessary to fully utilize all counters: although M/M/4 and M/M/5 reduce patient waiting time further, they are ineffective since Counter 5 is rarely used.
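
    The effect of adding a server can also be checked analytically with the Erlang C formula for M/M/c queues. The sketch below uses hypothetical rates (not the hospital's measured data) to illustrate the roughly 50% style of reduction the study reports:

```python
from math import factorial

def erlang_c(c, a):
    """Probability that an arriving customer must wait in an M/M/c queue
    (Erlang C), where a = lambda/mu is the offered load (requires a < c)."""
    top = a**c / factorial(c) * c / (c - a)
    bottom = sum(a**n / factorial(n) for n in range(c)) + top
    return top / bottom

def mean_wait(lam, mu, c):
    """Mean time in queue Wq for an M/M/c system (requires lam < c*mu)."""
    a = lam / mu
    return erlang_c(c, a) / (c * mu - lam)
```

    For example, with an assumed arrival rate of 0.5 patients/min and service rate of 0.3 patients/min per pharmacist, `mean_wait` drops from about 7.6 minutes with 2 servers to under 1 minute with 3, showing why one extra pharmacist yields a large improvement while a fourth or fifth adds little.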

  8. Application of queuing model in Dubai's busiest megaplex

    NASA Astrophysics Data System (ADS)

    Bhagchandani, Maneesha; Bajpai, Priti

    2013-09-01

    This paper provides a study and analysis of the extremely busy booking counters at a megaplex in Dubai using a queuing model and simulation. Dubai is an emirate in the UAE with a multicultural, majority foreign-born population, and cinema is one of its major forms of entertainment. There are more than 13 megaplexes, each with between 3 and 22 screens, showing movies in English, Arabic, Hindi and other languages. It has been observed that during the weekends megaplexes attract large crowds, resulting in long queues at the booking counters. One of the busiest megaplexes was selected for the study, and the queuing model fits well when tested against the real situation. The concepts of arrival rate, service rate, utilization rate, waiting time in the system, and average number of people in the queue, using Little's theorem and the M/M/s queuing model along with simulation software, have been used to suggest an empirical solution. The aim of the paper is twofold: to assess the present situation at the megaplex, and to give recommendations to optimize the use of booking counters.
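
    Two of the quantities named above are one-liners. Little's theorem, L = lambda * W, holds regardless of the arrival or service distributions, and server utilization in an M/M/s system is rho = lambda / (s * mu). The numbers below are hypothetical, purely to show the arithmetic:

```python
def utilization(lam, mu, s):
    """Fraction of time each of s servers is busy: rho = lambda / (s * mu)."""
    return lam / (s * mu)

def avg_in_system(lam, W):
    """Little's theorem: average number in system L = lambda * W,
    independent of the arrival and service distributions."""
    return lam * W
```

    For instance, if 3 patrons/min arrive and each spends 4 minutes queuing and buying a ticket, Little's theorem implies an average of 12 patrons at the counters; with 4 counters each serving 1 patron/min, utilization is 75%.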

  9. Queuing register uses fluid logic elements

    NASA Technical Reports Server (NTRS)

    1966-01-01

    Queuing register (a multistage bit-shifting device) uses a series of pure fluid elements to perform the required logic operations. The register has several stages of three-state pure fluid elements combined with two-input NOR gates.

  10. Queuing Models of Tertiary Storage

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1996-01-01

    Large scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observation System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSL's) for near-line access. A characteristic of RSL's is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSL's can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well defined interface. The source code, accompanying documentation, and also sample Java applets are available at: http://www.cis.ufl.edu/ted/

  11. Modeling Patient Flows Using a Queuing Network with Blocking

    PubMed Central

    KUNO, ERI; SMITH, TONY E.

    2015-01-01

    The downsizing and closing of state mental health institutions in Philadelphia in the 1990s led to the development of a continuum care network of residential-based services. Although the diversity of care settings increased, congestion in facilities caused many patients to unnecessarily spend extra days in intensive facilities. This study applies a queuing network system with blocking to analyze such congestion processes. “Blocking” denotes situations where patients are turned away from accommodations to which they are referred, and are thus forced to remain in their present facilities until space becomes available. Both mathematical and simulation results are presented and compared. Although queuing models have been used in numerous healthcare studies, the inclusion of blocking is still rare. We found that, in Philadelphia, the shortage of a particular type of facility may have created “upstream blocking”. Thus, removal of such facility-specific bottlenecks may be the most efficient way to reduce congestion in the system as a whole. PMID:15782512
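
    The upstream-blocking mechanism can be illustrated with a toy two-stage tandem queue (all parameters hypothetical, not taken from the study): patients finish intensive care but hold their intensive-care bed whenever every downstream residential bed is full, so a downstream shortage backs up into the upstream facility.

```python
import heapq
import random

def tandem_with_blocking(lam, mu1, mu2, c1, c2, horizon, seed=7):
    """Tandem queue with blocking after service: patients who complete
    stage 1 (c1 intensive beds, rate mu1 each) hold their bed while all
    c2 stage-2 residential beds (rate mu2 each) are occupied.
    Returns the time-average number of stage-1 beds held by blocked patients."""
    rng = random.Random(seed)
    ev = [(rng.expovariate(lam), 0, "arr")]
    seq = 1                                   # heap tie-breaker
    in1 = blocked = in2 = waiting = 0
    area = last = 0.0
    while ev:
        t, _, kind = heapq.heappop(ev)
        if t > horizon:
            break
        area += blocked * (t - last)          # integrate blocked-bed count
        last = t
        if kind == "arr":
            heapq.heappush(ev, (t + rng.expovariate(lam), seq, "arr")); seq += 1
            if in1 + blocked < c1:            # a stage-1 bed is free
                in1 += 1
                heapq.heappush(ev, (t + rng.expovariate(mu1), seq, "d1")); seq += 1
            else:
                waiting += 1
        elif kind == "d1":                    # stage-1 treatment complete
            in1 -= 1
            if in2 < c2:                      # residential bed free: move on
                in2 += 1
                heapq.heappush(ev, (t + rng.expovariate(mu2), seq, "d2")); seq += 1
                if waiting > 0 and in1 + blocked < c1:
                    waiting -= 1; in1 += 1
                    heapq.heappush(ev, (t + rng.expovariate(mu1), seq, "d1")); seq += 1
            else:                             # blocked: keep the stage-1 bed
                blocked += 1
        else:                                 # "d2": a residential bed frees
            in2 -= 1
            if blocked > 0:                   # a blocked patient takes it
                blocked -= 1; in2 += 1
                heapq.heappush(ev, (t + rng.expovariate(mu2), seq, "d2")); seq += 1
                if waiting > 0 and in1 + blocked < c1:
                    waiting -= 1; in1 += 1
                    heapq.heappush(ev, (t + rng.expovariate(mu1), seq, "d1")); seq += 1
    return area / last
```

    With a deliberately undersized residential stage, the average number of intensive-care beds tied up by already-treated patients becomes substantial, which is exactly the congestion the study measures.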

  12. Some queuing network models of computer systems

    NASA Technical Reports Server (NTRS)

    Herndon, E. S.

    1980-01-01

    Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.
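
    The row-by-row G-matrix processing mentioned above is characteristic of Buzen's convolution algorithm for closed queuing networks. As background (a generic sketch, not the SR-52 program itself), the single-class, load-independent version fits in a few lines:

```python
def buzen_throughput(demands, N):
    """Buzen's convolution algorithm for a closed, single-class queuing
    network of load-independent servers.  demands[k] = V_k * S_k is the
    total service demand at device k; N is the job population.
    Returns the system throughput X(N) = G(N-1) / G(N)."""
    G = [1.0] + [0.0] * N          # normalization constants G(0..N); G(0) = 1
    for d in demands:              # fold in one device at a time
        for n in range(1, N + 1):
            G[n] += d * G[n - 1]
    return G[N - 1] / G[N]
```

    For two devices with unit demand and two jobs this gives X = 2/3, matching the exact MVA recursion for the same network; a single device with demand D gives X = 1/D, since that device is never idle.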

  13. Theory-Based Stakeholder Evaluation

    ERIC Educational Resources Information Center

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  15. Queuing network approach for building evacuation planning

    NASA Astrophysics Data System (ADS)

    Ishak, Nurhanis; Khalid, Ruzelan; Baten, Md. Azizul; Nawawi, Mohd. Kamal Mohd.

    2014-12-01

    The complex behavior of pedestrians in a limited space layout can be explicitly modeled using an M/G/C/C state-dependent queuing network. This paper implements the approach to study pedestrian flows through various corridors in a topological network. The best arrival rates and their impacts on the corridors' performance, in terms of throughput, blocking probability, expected number of occupants in the system and expected travel time, were first measured using the M/G/C/C analytical model. These arrival rates were then fed into a Network Flow Programming model to find the best arrival rates to source corridors and the routes that optimize the network's total throughput. The analytical results were then validated using a simulation model. The results of this study can be used to support the current Standard Operating Procedures (SOP) to evacuate people efficiently and safely in emergencies.
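
    As a sketch of the M/G/C/C state-dependent model (with an illustrative linear speed-decay function, not the calibrated congestion model from the paper): the stationary probability of n occupants is proportional to (lam*t1)^n / (n! * f(1)*...*f(n)), where f(n) is walking speed with n occupants relative to a lone pedestrian.

```python
def mgcc_performance(lam, t1, C, f):
    """State-dependent M/G/C/C corridor model.  lam: pedestrian arrival
    rate, t1: traversal time of a lone pedestrian, C: corridor capacity,
    f(n): relative walking speed with n occupants (f(1) = 1).
    Returns (blocking probability, throughput)."""
    w, weights = 1.0, [1.0]
    for n in range(1, C + 1):
        w *= lam * t1 / (n * f(n))    # unnormalized stationary probability of n
        weights.append(w)
    total = sum(weights)
    p_block = weights[C] / total      # probability an arrival finds the corridor full
    return p_block, lam * (1 - p_block)
```

    With constant f(n) = 1 this reduces to the Erlang loss formula; with a congestion-dependent f(n), blocking rises and throughput falls, which is the trade-off the evacuation planning model optimizes over.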

  16. Agent-Based Literacy Theory

    ERIC Educational Resources Information Center

    McEneaney, John E.

    2006-01-01

    The purpose of this theoretical essay is to explore the limits of traditional conceptualizations of reader and text and to propose a more general theory based on the concept of a literacy agent. The proposed theoretical perspective subsumes concepts from traditional theory and aims to account for literacy online. The agent-based literacy theory

  17. The Queued Service Observing Project at CFHT

    NASA Astrophysics Data System (ADS)

    Martin, Pierre; Savalle, Renaud; Vermeulen, Tom; Shapiro, Joshua N.

    2002-12-01

    In order to maximize the scientific productivity of the CFH12K mosaic wide-field imager (and soon MegaCam), the Queued Service Observing (QSO) mode was implemented at the Canada-France-Hawaii Telescope at the beginning of 2001. The QSO system consists of an ensemble of software components allowing for the submission of programs, the preparation of queues, and finally the execution and evaluation of observations. The QSO project is part of a broader system known as the New Observing Process (NOP). This system includes data acquisition, data reduction and analysis through a pipeline named Elixir, and a data archiving and distribution component (DADS). In this paper, we review several technical and operational aspects of the QSO project. In particular, we present our strategy, technical architecture, program submission system, and the tools developed for the preparation and execution of the queues. Our successful experience of over 150 nights of QSO operations is also discussed along with the future plans for queue observing with MegaCam and other instruments at CFHT.

  18. NQS - NETWORK QUEUING SYSTEM, VERSION 2.0 (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Walter, H.

    1994-01-01

    The Network Queuing System, NQS, is a versatile batch and device queuing facility for a single Unix computer or a group of networked computers. With the Unix operating system as a common interface, the user can invoke the NQS collection of user-space programs to move batch and device jobs freely around the different computer hardware tied into the network. NQS provides facilities for remote queuing, request routing, remote status, queue status controls, batch request resource quota limits, and remote output return. This program was developed as part of an effort aimed at tying together diverse UNIX based machines into NASA's Numerical Aerodynamic Simulator Processing System Network. This revision of NQS allows for creating, deleting, adding and setting of complexes that aid in limiting the number of requests to be handled at one time. It also has improved device-oriented queues along with some revision of the displays. NQS was designed to meet the following goals: 1) Provide for the full support of both batch and device requests. 2) Support all of the resource quotas enforceable by the underlying UNIX kernel implementation that are relevant to any particular batch request and its corresponding batch queue. 3) Support remote queuing and routing of batch and device requests throughout the NQS network. 4) Support queue access restrictions through user and group access lists for all queues. 5) Enable networked output return of both output and error files to possibly remote machines. 6) Allow mapping of accounts across machine boundaries. 7) Provide friendly configuration and modification mechanisms for each installation. 8) Support status operations across the network, without requiring a user to log in on remote target machines. 9) Provide for file staging or copying of files for movement to the actual execution machine. To support batch and device requests, NQS v.2 implements three queue types--batch, device and pipe. 
Batch queues hold and prioritize batch requests; device queues hold and prioritize device requests; pipe queues transport both batch and device requests to other batch, device, or pipe queues at local or remote machines. Unique to batch queues are resource quota limits that restrict the amounts of different resources that a batch request can consume during execution. Unique to each device queue is a set of one or more devices, such as a line printer, to which requests can be sent for execution. Pipe queues have associated destinations to which they route and deliver requests. If the proper destination machine is down or unreachable, pipe queues are able to requeue the request and deliver it later when the destination is available. All NQS network conversations are performed using the Berkeley socket mechanism as ported into the respective vendor kernels. NQS is written in C language. The generic UNIX version (ARC-13179) has been successfully implemented on a variety of UNIX platforms, including Sun3 and Sun4 series computers, SGI IRIS computers running IRIX 3.3, DEC computers running ULTRIX 4.1, AMDAHL computers running UTS 1.3 and 2.1, platforms running BSD 4.3 UNIX. The IBM RS/6000 AIX version (COS-10042) is a vendor port. NQS 2.0 will also communicate with the Cray Research, Inc. and Convex, Inc. versions of NQS. The standard distribution medium for either machine version of NQS 2.0 is a 60Mb, QIC-24, .25 inch streaming magnetic tape cartridge in UNIX tar format. Upon request the generic UNIX version (ARC-13179) can be provided in UNIX tar format on alternate media. Please contact COSMIC to discuss the availability and cost of media to meet your specific needs. An electronic copy of the NQS 2.0 documentation is included on the program media. NQS 2.0 was released in 1991. The IBM RS/6000 port of NQS was developed in 1992. IRIX is a trademark of Silicon Graphics Inc. IRIS is a registered trademark of Silicon Graphics Inc. 
UNIX is a registered trademark of UNIX System Laboratories Inc. Sun3 and Sun4 are trademarks of Sun Microsystems Inc. DEC and ULTRIX are trademarks of Digital Equipment Corporation.

  19. Time-varying priority queuing models for human dynamics

    NASA Astrophysics Data System (ADS)

    Jo, Hang-Hyun; Pan, Raj Kumar; Kaski, Kimmo

    2012-06-01

    Queuing models provide insight into the temporal inhomogeneity of human dynamics, characterized by the broad distribution of waiting times of individuals performing tasks. We theoretically study the queuing model of an agent trying to execute a task of interest, the priority of which may vary with time due to the agent's “state of mind.” However, its execution is disrupted by other tasks of random priorities. By considering the priority of the task of interest either decreasing or increasing algebraically in time, we analytically obtain and numerically confirm the bimodal and unimodal waiting time distributions with power-law decaying tails, respectively. These results are also compared to the updating time distribution of papers in arXiv.org and the processing time distribution of papers in Physical Review journals. Our analysis helps to understand human task execution in a more realistic scenario.

  20. Time-varying priority queuing models for human dynamics.

    PubMed

    Jo, Hang-Hyun; Pan, Raj Kumar; Kaski, Kimmo

    2012-06-01

    Queuing models provide insight into the temporal inhomogeneity of human dynamics, characterized by the broad distribution of waiting times of individuals performing tasks. We theoretically study the queuing model of an agent trying to execute a task of interest, the priority of which may vary with time due to the agent's "state of mind." However, its execution is disrupted by other tasks of random priorities. By considering the priority of the task of interest either decreasing or increasing algebraically in time, we analytically obtain and numerically confirm the bimodal and unimodal waiting time distributions with power-law decaying tails, respectively. These results are also compared to the updating time distribution of papers in arXiv.org and the processing time distribution of papers in Physical Review journals. Our analysis helps to understand human task execution in a more realistic scenario. PMID:23005156
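
    A toy discrete-time caricature of the model in the two entries above (purely illustrative: it uses linear rather than algebraic priority growth, and a single competing task per step): the task of interest executes only when its growing priority beats a fresh random competitor, so slowly growing priorities produce long, heavy-tailed waits.

```python
import random

def sample_waits(alpha, n=5000, seed=3):
    """Toy variant of a time-varying priority queue: a task of interest
    starts at a uniform-random priority b and gains alpha per step; each
    step it competes with a fresh uniform-random disturbing task and runs
    only when its priority is higher.  Returns n sampled waiting times."""
    rng = random.Random(seed)
    waits = []
    for _ in range(n):
        b = rng.random()
        t = 1
        while min(1.0, b + alpha * t) <= rng.random():
            t += 1                 # lost the competition; wait another step
        waits.append(t)
    return waits
```

    Slower priority growth (smaller alpha) yields a longer mean wait and a broader waiting-time distribution, the qualitative signature the papers analyze for algebraically varying priorities.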

  1. An application of a queuing model for sea states

    NASA Astrophysics Data System (ADS)

    Loffredo, L.; Monbaliu, J.; Anderson, C.

    2012-04-01

    Unimodal approaches in design practice have shown inconsistencies in terms of directionality and limitations for accurate sea state description. Spectral multimodality needs to be included in the description of the wave climate: it can provide information about the coexistence of different wave systems originating from different meteorological events, such as locally generated wind waves and swell systems from distant storms. A 20-year dataset (1989-2008) for a location in the North Sea (K13, 53.2°N 3.2°E) has been retrieved from the ECMWF ERA-Interim re-analysis data archive, providing a consistent and homogeneous dataset. The work focuses on the joint and conditional probability distributions of wind sea and swell systems. For marine operations and design applications, critical combinations of wave systems may exist. We define a critical sea state on the basis of a set of thresholds, which are not necessarily extreme; the emphasis is on combinations of wave systems that are dangerous for certain operations (e.g. small-vessel navigation, dredging). The distribution of non-operability windows is described by a point process model with random and independent events, whose occurrences and lengths can be described only probabilistically. These characteristics allow us to treat the emerging patterns as part of a queuing system. According to this theory, generally adopted for applications including traffic flows and waiting lines, the input process describes the sequence of requests for a service, and the service mechanism the length of time that these requests will occupy the facilities. For weather-driven processes at sea, an alternating renewal process appears to be a suitable model. It consists of a sequence of critical events (periods of inoperability), each of random duration, separated by calms, also of random duration. Inoperability periods and calms are assumed independent, and the model does not allow more than one critical event to occur at the same time. The analysis is carried out taking into account threshold selection and seasonality.

  2. Modified weighted fair queuing for packet scheduling in mobile WiMAX networks

    NASA Astrophysics Data System (ADS)

    Satrya, Gandeva B.; Brotoharsono, Tri

    2013-03-01

    Increasing user mobility and the need for data access anytime have increased interest in broadband wireless access (BWA). IEEE 802.16e aims to assure the best available quality of experience for mobile data service users. The main problem in assuring high QoS is how to allocate available resources among users in order to meet requirements for criteria such as delay, throughput, packet loss and fairness. The IEEE standards specify no particular scheduling mechanism, leaving it open for implementer differentiation. IEEE 802.16 defines five QoS service classes: Unsolicited Grant Scheme (UGS), Extended Real Time Polling Service (ertPS), Real Time Polling Service (rtPS), Non Real Time Polling Service (nrtPS) and Best Effort Service (BE). Each class has different QoS parameter requirements for throughput and delay/jitter constraints. This paper proposes a Modified Weighted Fair Queuing (MWFQ) scheduling scenario based on Weighted Round Robin (WRR) and Weighted Fair Queuing (WFQ). The performance of MWFQ was assessed using the five QoS criteria above. The simulation shows that using the concept of total packet size calculation improves the network's performance.
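
    As background for the WFQ half of the proposal (a minimal sketch, not the paper's MWFQ; for brevity it assumes every packet is already queued at time zero), classic WFQ assigns each packet a virtual finish time and serves packets in finish-time order, so a flow's share of the link is proportional to its weight:

```python
import heapq

def wfq_schedule(packets, weights):
    """Weighted Fair Queuing sketch using virtual finish times.
    packets: list of (flow_id, size) in arrival order, all assumed queued
    at time 0; weights: flow_id -> weight.  Each packet's finish time is
    the flow's previous finish plus size/weight; service follows
    finish-time order.  Returns packet indices in service order."""
    last_finish = {}
    heap = []
    for i, (flow, size) in enumerate(packets):
        finish = last_finish.get(flow, 0.0) + size / weights[flow]
        last_finish[flow] = finish
        heapq.heappush(heap, (finish, i))     # index breaks finish-time ties
    return [i for _, i in (heapq.heappop(heap) for _ in range(len(packets)))]
```

    With equal-size packets and flow A weighted twice flow B, A's packets accumulate finish times half as fast, so A is served roughly twice as often; a WRR scheduler approximates the same shares by serving fixed per-round quotas instead.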

  3. Using multi-class queuing network to solve performance models of e-business sites.

    PubMed

    Zheng, Xiao-ying; Chen, De-ren

    2004-01-01

    Because e-business serves a variety of customers with different navigational patterns and demands, a multi-class queuing network is a natural performance model for it. Open multi-class queuing network (QN) models are based on the assumption that no service center is saturated as a result of the combined loads of all the classes. Several formulas are used to calculate performance measures, including throughput, residence time, queue length, response time and the average number of requests. The solution technique for closed multi-class QN models is an approximate mean value analysis (MVA) algorithm based on three key equations, because the exact algorithm has enormous time and space requirements. Since mixed multi-class QN models include some open and some closed classes, the open classes should be eliminated to create a closed multi-class QN so that the closed-model algorithm can be applied. Corresponding examples show how to apply the algorithms mentioned in this article; they indicate that the multi-class QN is a reasonably accurate model of e-business and can be solved efficiently. PMID:14663849
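
    The three key MVA equations (residence time, throughput, and queue length via Little's law) are easiest to see in the exact single-class recursion below; the paper's approximate multi-class algorithm is built from the same three equations, so this sketch shows the core idea rather than the paper's exact method:

```python
def mva(demands, N):
    """Exact Mean Value Analysis for a closed, single-class queuing network
    of load-independent queuing centers.  demands[k] is the total service
    demand at center k; N >= 1 is the job population.
    Returns (throughput, per-center residence times) at population N."""
    Q = [0.0] * len(demands)                           # queue lengths at n = 0
    for n in range(1, N + 1):
        R = [d * (1 + q) for d, q in zip(demands, Q)]  # residence times
        X = n / sum(R)                                 # system throughput
        Q = [X * r for r in R]                         # Little's law per center
    return X, R
```

    For two centers with unit demand and N = 2 jobs this yields X = 2/3 and residence time 1.5 at each center, agreeing with the convolution-algorithm solution of the same network.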

  4. Priority queuing models for hospital intensive care units and impacts to severe case patients.

    PubMed

    Hagen, Matthew S; Jopling, Jeffrey K; Buchman, Timothy G; Lee, Eva K

    2013-01-01

    This paper examines several different queuing models for intensive care units (ICUs) and their effects on wait times, utilization, return rates, mortalities, and number of patients served. Five separate intensive care units at an urban hospital are analyzed, and distributions are fitted for arrivals and service durations. A system-based simulation model is built to capture all possible cases of patient flow after ICU admission, including mortalities and returns before and after hospital exit. Patients are grouped into 9 classes categorized by severity and length of stay (LOS). Each queuing model varies by the policies that are permitted and by the order in which patients are admitted. The first set of models does not prioritize patients, but examines the advantages of smoothing the operating schedule for elective surgeries. The second set analyzes the differences between prioritizing admissions by expected LOS or by patient severity. The last set permits early ICU discharges, contrasting conservative and aggressive bumping policies. It was found that prioritizing patients by severity considerably reduced delays for critical cases, but also increased the average waiting time for all patients. Aggressive bumping significantly raised the return and mortality rates, but more conservative methods balance quality and efficiency, lowering wait times without serious consequences. PMID:24551379
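
    The trade-off between critical-case delay and everyone else's delay shows up even in the simplest textbook priority model. The sketch below (a generic non-preemptive priority M/G/1 formula with exponential service, not the paper's simulation) computes mean queue waits by class:

```python
def priority_waits(lams, mus):
    """Mean queue waits by class in a non-preemptive priority M/G/1 queue
    with exponential service (class 0 = highest priority):
        W_k = W0 / ((1 - sigma_{k-1}) * (1 - sigma_k)),
    where sigma_k is the cumulative utilization of classes 0..k and
    W0 = sum(lam_i * E[S_i^2]) / 2 is the mean residual service time."""
    W0 = sum(l * 2.0 / m**2 for l, m in zip(lams, mus)) / 2.0
    waits, sigma = [], 0.0
    for l, m in zip(lams, mus):
        prev = sigma
        sigma += l / m                       # add this class's utilization
        waits.append(W0 / ((1 - prev) * (1 - sigma)))
    return waits
```

    With two equal classes at utilization 0.2 each, the high-priority wait drops to 0.5 while the low-priority wait rises to 5/6, illustrating how prioritizing severe cases shifts delay onto the remaining patients.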

  5. NAS Requirements Checklist for Job Queuing/Scheduling Software

    NASA Technical Reports Server (NTRS)

    Jones, James Patton

    1996-01-01

    The increasing reliability of parallel systems and clusters of computers has resulted in these systems becoming more attractive for true production workloads. Today, the primary obstacle to production use of clusters of computers is the lack of a functional and robust Job Management System for parallel applications. This document provides a checklist of NAS requirements for job queuing and scheduling in order to make most efficient use of parallel systems and clusters for parallel applications. Future requirements are also identified to assist software vendors with design planning.

  6. A message-queuing framework for STAR's online monitoring and metadata collection

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.; Betts, W.

    2011-12-01

    We report our experience migrating STAR's Online Services (Run Control System, Data Acquisition System, Slow Control System and Subsystem Monitoring) from direct read/write database accesses to a modern non-blocking message-oriented infrastructure. Based on the Advanced Message Queuing Protocol (AMQP) and related standards, this novel approach does not specify the message data structure, allowing great flexibility in its use. After careful consideration, we chose Google Protocol Buffers as our primary (de)serialization format for structured data exchange. This migration allows us to reduce the overall system complexity and greatly improve the reliability of the metadata collection and the performance of our online services in general. We present this new framework through an overview of its software architecture, providing details about our staged and non-disruptive migration process as well as details of the implementation of pluggable components to provide future improvements without compromising the stability and availability of services.

  7. Based on Regular Expression Matching of Evaluation of the Task Performance in WSN: A Queue Theory Approach

    PubMed Central

    Cui, Kai; Zhou, Kuanjiu; Yu, Yanshuo

    2014-01-01

    To address the limited resources of wireless sensor networks, the low efficiency of real-time communication scheduling, poor safety, and related defects, a queuing performance evaluation approach based on regular expression matching is proposed, consisting of a matching preprocessing phase, a validation phase, and a queuing-model performance evaluation phase. Firstly, the subset of related sequences is generated in the preprocessing phase, guiding distributed matching in the validation phase. Secondly, in the validation phase, the feature subsets are clustered, and the compressed matching table makes distributed parallel matching more convenient. Finally, based on the queuing model, the dynamic task-scheduling performance of the sensor network is evaluated. Experiments show that our approach ensures accurate matching with computational efficiency above 70%; it not only effectively detects data packets and access control, but also uses the queuing method to determine the task-scheduling parameters in wireless sensor networks. The method has good applicability for medium- or large-scale distributed wireless nodes. PMID:25401151

  8. Based on regular expression matching of evaluation of the task performance in WSN: a queue theory approach.

    PubMed

    Wang, Jie; Cui, Kai; Zhou, Kuanjiu; Yu, Yanshuo

    2014-01-01

    To address the limited resources of wireless sensor networks, the low efficiency of real-time communication scheduling, poor safety, and related defects, a queuing performance evaluation approach based on regular expression matching is proposed, consisting of a matching preprocessing phase, a validation phase, and a queuing-model performance evaluation phase. Firstly, the subset of related sequences is generated in the preprocessing phase, guiding distributed matching in the validation phase. Secondly, in the validation phase, the feature subsets are clustered, and the compressed matching table makes distributed parallel matching more convenient. Finally, based on the queuing model, the dynamic task-scheduling performance of the sensor network is evaluated. Experiments show that our approach ensures accurate matching with computational efficiency above 70%; it not only effectively detects data packets and access control, but also uses the queuing method to determine the task-scheduling parameters in wireless sensor networks. The method has good applicability for medium- or large-scale distributed wireless nodes. PMID:25401151

  9. Evaluation of Job Queuing/Scheduling Software: Phase I Report

    NASA Technical Reports Server (NTRS)

    Jones, James Patton

    1996-01-01

    The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, the Numerical Aerodynamic Simulation (NAS) supercomputer facility compiled a requirements checklist for job queuing/scheduling software. Next, NAS began an evaluation of the leading job management system (JMS) software packages against the checklist. This report describes the three-phase evaluation process and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still insufficient, even in the leading JMSs. However, by ranking each JMS evaluated against the requirements, we provide data that will be useful to other sites in selecting a JMS.

  10. Second Evaluation of Job Queuing/Scheduling Software. Phase 1

    NASA Technical Reports Server (NTRS)

    Jones, James Patton; Brickell, Cristy; Chancellor, Marisa (Technical Monitor)

    1997-01-01

    The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, NAS compiled a requirements checklist for job queuing/scheduling software. Next, NAS evaluated the leading job management system (JMS) software packages against the checklist. A year has now elapsed since the first comparison was published, and NAS has repeated the evaluation. This report describes this second evaluation, and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still lacking, however, definite progress has been made by the vendors to correct the deficiencies. This report is supplemented by a WWW interface to the data collected, to aid other sites in extracting the evaluation information on specific requirements of interest.

  11. Theory-Based Evaluation: Reflections Ten Years On. Theory-Based Evaluation: Past, Present, and Future

    ERIC Educational Resources Information Center

    Rogers, Patricia J.; Weiss, Carol H.

    2007-01-01

    This chapter begins with a brief introduction by Rogers, in which she highlights the continued salience of Carol Weiss's decade-old questions about theory-based evaluation. Theory-based evaluation has developed significantly since Carol Weiss's chapter was first published ten years ago. In 1997 Weiss pointed to theory-based evaluation being mostly

  12. Spectrally queued feature selection for robotic visual odometry

    NASA Astrophysics Data System (ADS)

    Pirozzo, David M.; Frederick, Philip A.; Hunt, Shawn; Theisen, Bernard; Del Rose, Mike

    2011-01-01

    Over the last two decades, research in Unmanned Vehicles (UV) has rapidly progressed and become more influenced by the biological sciences. Researchers have been investigating mechanical aspects of various species to improve the intrinsic air and ground mobility of UVs, exploring the computational aspects of the brain for the development of pattern recognition and decision algorithms, and exploring the perception capabilities of numerous animals and insects. This paper describes a 3-month exploratory applied research effort performed at the US Army Research, Development and Engineering Command's (RDECOM) Tank Automotive Research, Development and Engineering Center (TARDEC) in the area of biologically inspired, spectrally augmented feature selection for robotic visual odometry. The motivation for this applied research was to develop a feasibility analysis of multi-spectrally queued feature selection, with improved temporal stability, for the purposes of visual odometry. The intended application is future semi-autonomous Unmanned Ground Vehicle (UGV) control, as the richness of the data sets required to enable human-like behavior in these systems has yet to be defined.

  13. Basing quantum theory on information processing

    NASA Astrophysics Data System (ADS)

    Barnum, Howard

    2008-03-01

    I consider information-based derivations of the quantum formalism, in a framework encompassing quantum and classical theory and a broad spectrum of theories serving as foils to them. The most ambitious hope for such a derivation is a role analogous to Einstein's development of the dynamics and kinetics of macroscopic bodies, and later of their gravitational interactions, on the basis of simple principles with clear operational meanings and experimental consequences. Short of this, it could still provide a principled understanding of the features of quantum mechanics that account for its greater-than-classical information-processing power, helping guide the search for new quantum algorithms and protocols. I summarize the convex operational framework for theories, and discuss information-processing in theories therein. Results include the fact that information that can be obtained without disturbance is inherently classical, generalized no-cloning and no-broadcasting theorems, exponentially secure bit commitment in all non-classical theories without entanglement, properties of theories that allow teleportation, and properties of theories that allow ``remote steering'' of ensembles using entanglement. Joint work with collaborators including Jonathan Barrett, Matthew Leifer, Alexander Wilce, Oscar Dahlsten, and Ben Toner.

  14. Modelling Pedestrian Travel Time and the Design of Facilities: A Queuing Approach

    PubMed Central

    Rahman, Khalidur; Abdul Ghani, Noraida; Abdulbasah Kamil, Anton; Mustafa, Adli; Kabir Chowdhury, Md. Ahmed

    2013-01-01

    Pedestrian movements are the consequence of several complex and stochastic factors. The modelling of pedestrian movements and the ability to predict the travel time are useful for evaluating the performance of a pedestrian facility. However, only a few studies can be found that incorporate the design of the facility, local pedestrian body dimensions, the delay experienced by the pedestrians, and level of service into the modelling of pedestrian movements. In this paper, a queuing-based analytical model is developed as a function of relevant determinants and functional factors to predict the travel time on pedestrian facilities. The model can be used to assess the overall serving rate or performance of a facility layout and correlate it to the level of service that it is possible to provide the pedestrians. It also has the ability to provide clear suggestions on the design and sizing of pedestrian facilities. The model is empirically validated and is found to be a robust tool for understanding how well a particular walking facility enables comfortable and convenient pedestrian movement. A sensitivity analysis is also performed to see the impact of some crucial parameters of the developed model on the performance of pedestrian facilities. PMID:23691055
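
    A minimal sketch of the kind of queuing-based travel-time calculation the abstract describes, assuming (hypothetically) that the facility reduces to free-flow walking plus a single M/M/1 bottleneck; the function name and all parameter values are illustrative, not taken from the paper:

```python
def pedestrian_travel_time(length_m, free_speed_ms, service_rate, arrival_rate):
    """Travel time on a walkway modeled as free-flow walking time plus the
    mean sojourn time W = 1/(mu - lambda) at an M/M/1 bottleneck.
    Rates are in pedestrians per second; speed in metres per second."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    walking_time = length_m / free_speed_ms                  # free-flow component
    bottleneck_delay = 1.0 / (service_rate - arrival_rate)   # M/M/1 sojourn time
    return walking_time + bottleneck_delay
```

    For example, a 100 m corridor walked at 1.25 m/s with a bottleneck serving 2 pedestrians/s against 1.5 arrivals/s gives 80 s of walking plus 2 s of queuing delay.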

  15. Evacuation time estimate for total pedestrian evacuation using a queuing network model and volunteered geographic information

    NASA Astrophysics Data System (ADS)

    Kunwar, Bharat; Simini, Filippo; Johansson, Anders

    2016-02-01

    Estimating city evacuation time is a nontrivial problem due to the interaction between thousands of individual agents, giving rise to various collective phenomena, such as bottleneck formation, intermittent flow, and stop-and-go waves. We present a mean field approach to draw relationships between road network spatial attributes, the number of evacuees, and the resultant evacuation time estimate (ETE). Using volunteered geographic information, we divide 50 United Kingdom cities into a total of 704 catchment areas (CAs), each defined as an area whose agents all share the same nearest exit node. 90% of the agents are within ≈6,847 m of CA exit nodes, with ≈13,778 agents/CA. We establish a characteristic flow rate from catchment area attributes (population, distance to exit node, and exit node width) and a mean flow rate in a free-flow regime by simulating total evacuations using an agent-based "queuing network" model. We use these variables to determine a relationship between catchment area attributes and resultant ETEs. This relationship could enable emergency planners to make a rapid appraisal of evacuation strategies and help support decisions in the run-up to a crisis.
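
    A crude first-order version of such an ETE relationship can be sketched as walking time to the exit plus the time for the whole population to drain through the exit node; the specific-flow value and the function itself are hypothetical illustrations, not the paper's fitted model:

```python
def evacuation_time_estimate(population, dist_to_exit_m, walking_speed_ms,
                             exit_width_m, specific_flow):
    """First-order ETE for one catchment area: time to walk to the exit node
    plus time for the whole population to pass through it at capacity.
    specific_flow is in persons per metre of exit width per second."""
    walking_time = dist_to_exit_m / walking_speed_ms
    drain_time = population / (exit_width_m * specific_flow)
    return walking_time + drain_time
```

    With the catchment-area figures quoted above (≈13,778 agents, ≈6,847 m) and assumed values of 1.25 m/s walking speed, a 5 m exit, and 1.5 persons/m/s, the two terms are of the same order, which is why both network geometry and exit capacity matter for the estimate.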

  16. Modelling pedestrian travel time and the design of facilities: a queuing approach.

    PubMed

    Rahman, Khalidur; Ghani, Noraida Abdul; Kamil, Anton Abdulbasah; Mustafa, Adli; Kabir Chowdhury, Md Ahmed

    2013-01-01

    Pedestrian movements are the consequence of several complex and stochastic factors. The modelling of pedestrian movements and the ability to predict the travel time are useful for evaluating the performance of a pedestrian facility. However, only a few studies can be found that incorporate the design of the facility, local pedestrian body dimensions, the delay experienced by the pedestrians, and level of service into the modelling of pedestrian movements. In this paper, a queuing-based analytical model is developed as a function of relevant determinants and functional factors to predict the travel time on pedestrian facilities. The model can be used to assess the overall serving rate or performance of a facility layout and correlate it to the level of service that it is possible to provide the pedestrians. It also has the ability to provide clear suggestions on the design and sizing of pedestrian facilities. The model is empirically validated and is found to be a robust tool for understanding how well a particular walking facility enables comfortable and convenient pedestrian movement. A sensitivity analysis is also performed to see the impact of some crucial parameters of the developed model on the performance of pedestrian facilities. PMID:23691055

  17. Jigsaw Cooperative Learning: Acid-Base Theories

    ERIC Educational Resources Information Center

    Tarhan, Leman; Sesen, Burcin Acar

    2012-01-01

    This study focused on investigating the effectiveness of jigsaw cooperative learning instruction on first-year undergraduates' understanding of acid-base theories. Undergraduates' opinions about jigsaw cooperative learning instruction were also investigated. The participants of this study were 38 first-year undergraduates in chemistry education…

  18. The Scope of Usage-Based Theory

    PubMed Central

    Ibbotson, Paul

    2013-01-01

    Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the “cognitive commitment” of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure vs. cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works. PMID:23658552

  19. Flocculation control study based on fractal theory

    PubMed Central

    Chang, Ying; Liu, Qian-jun; Zhang, Jin-song

    2005-01-01

    A study on flocculation control based on fractal theory was carried out. An optimization test of chemical coagulant dosage confirmed that the fractal dimension can reflect the degree of flocculation and the settling characteristics of aggregates, and that it correlates well with the turbidity of settled effluent. The fractal dimension can therefore be used as the major parameter for flocculation system control, enabling self-acting adjustment of the chemical coagulant dosage. The fractal-dimension flocculation control system was then used for a further study of the effects of various flocculation parameters, among them the dependency relationships among aggregate fractal dimension, chemical coagulant dosage, and turbidity of settled effluent under conditions of variable water quality and quantity. Basic experimental data were obtained for establishing a chemical coagulant dosage control model based mainly on aggregate fractal dimension. PMID:16187420

  20. Discrete-time Queuing Analysis of Opportunistic Spectrum Access: Single User Case

    NASA Astrophysics Data System (ADS)

    Wang, Jin-long; Xu, Yu-hua; Gao, Zhan; Wu, Qi-hui

    2011-11-01

    This article studies the discrete-time queuing dynamics of opportunistic spectrum access (OSA) systems, in which the secondary user seeks spectrum vacancies between bursty transmissions of the primary user to communicate. Since spectrum sensing and data transmission cannot be performed simultaneously, the secondary user employs a sensing-then-transmission strategy to detect the presence of the primary user before accessing the licensed channel. Consequently, the transmission of the secondary user is periodically suspended for spectrum sensing. To capture the discontinuous transmission nature of the secondary user, we introduce a discrete-time queuing model subject to bursty preemption to describe the behavior of the secondary user. Specifically, we derive some important metrics of the secondary user, including the secondary spectrum utilization ratio, buffer length, packet delay, and packet dropping ratio. Finally, simulation results validate the proposed theoretical model and reveal that the theoretical results fit the simulated results well.
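
    The sensing-then-transmission dynamics can be illustrated with a toy discrete-time simulation: each frame begins with sensing slots in which no data is sent, and the remaining slots are usable only when the primary user is absent. This is a sketch under assumed Bernoulli arrivals and a per-frame primary-user state; all names and parameter values are hypothetical, not the article's model:

```python
import random

def simulate_osa_queue(frames=100_000, arrival_p=0.3, busy_p=0.4,
                       sense_slots=1, frame_slots=10, buffer_cap=50, seed=1):
    """Toy discrete-time queue for a sensing-then-transmit secondary user.
    Arrivals are Bernoulli(arrival_p) per slot; the primary user occupies
    the channel for a whole frame with probability busy_p; one packet is
    served per usable (non-sensing, idle-channel) slot."""
    rng = random.Random(seed)
    queue = dropped = served = arrived = 0
    used_slots = total_slots = 0
    for _ in range(frames):
        primary_busy = rng.random() < busy_p      # primary state this frame
        for slot in range(frame_slots):
            total_slots += 1
            if rng.random() < arrival_p:          # Bernoulli packet arrival
                arrived += 1
                if queue < buffer_cap:
                    queue += 1
                else:
                    dropped += 1                  # finite buffer overflow
            transmitting = slot >= sense_slots and not primary_busy
            if transmitting and queue > 0:
                queue -= 1
                served += 1
                used_slots += 1
    return {"utilization": used_slots / total_slots,
            "drop_ratio": dropped / max(arrived, 1),
            "final_queue": queue}
```

    Sweeping `sense_slots` or `busy_p` in such a sketch reproduces the qualitative trade-off the article analyzes: longer sensing suspensions and busier primaries inflate buffer length, delay, and dropping ratio.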

  1. MODELING AND PERFORMANCE EVALUATION FOR AVIATION SECURITY CARGO INSPECTION QUEUING SYSTEM

    SciTech Connect

    Allgood, Glenn O; Olama, Mohammed M; Rose, Terri A; Brumback, Daryl L

    2009-01-01

    Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we conduct a performance evaluation study of an aviation security cargo inspection queuing system for material flow and accountability. The overall performance of the aviation security cargo inspection system is computed, analyzed, and optimized for the different system dynamics. Various performance measures are considered, such as system capacity, residual capacity, and throughput. These metrics are performance indicators of the system's ability to service current needs and its response capacity to additional requests. The increased physical understanding resulting from execution of the queuing model utilizing these vetted performance measures will reduce the overall cost and shipping delays associated with the new inspection requirements.
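
    As a rough illustration of the capacity metrics named above, the aggregate figures for a serial inspection line can be sketched as follows; the stage rates and the function itself are hypothetical, and the paper's actual model is far more detailed:

```python
def inspection_line_metrics(stage_rates, arrival_rate):
    """Capacity, residual capacity, and throughput of a serial inspection
    line: capacity is set by the slowest stage; throughput is demand capped
    at capacity; residual capacity is what remains for additional requests.
    Rates are in cargo units per hour."""
    capacity = min(stage_rates)                      # bottleneck stage
    throughput = min(arrival_rate, capacity)         # served flow
    residual = capacity - throughput                 # spare response capacity
    utilization = [throughput / r for r in stage_rates]
    return {"capacity": capacity, "throughput": throughput,
            "residual": residual, "utilization": utilization}
```

    For stages rated at 10, 6, and 8 units/hour under a demand of 5 units/hour, the line's capacity is 6, throughput is 5, and residual capacity is 1.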

  2. Towards a Faith-Based Program Theory: A Reconceptualization of Program Theory

    ERIC Educational Resources Information Center

    Harden, Mark G.

    2006-01-01

    A meta-program theory is proposed to overcome the limitations and improve the use of program theory as an approach to faith-based program evaluation. The essentials for understanding religious organizations, their various programs, and faith and spirituality are discussed to support a rationale for developing a faith-based program theory that…

  3. A Multiple Constraint Queuing Model for Predicting Current and Future Terminal Area Capacities

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2004-01-01

    A new queuing model is being developed to evaluate the capacity benefits of several new concepts for terminal airspace operations. The major innovation is the ability to support a wide variety of multiple constraints for modeling the scheduling logic of several concepts. Among the constraints modeled are in-trail separation, separation between aircraft landing on parallel runways, in-trail separation at terminal area entry points, and permissible terminal area flight times.

  4. An Improved Call Admission Control Mechanism with Prioritized Handoff Queuing Scheme for BWA Networks

    NASA Astrophysics Data System (ADS)

    Chowdhury, Prasun; Saha Misra, Iti

    2014-10-01

    Nowadays, due to the increased demand on Broadband Wireless Access (BWA) networks, a promised Quality of Service (QoS) is required to manage the seamless transmission of heterogeneous handoff calls. To this end, this paper proposes an improved Call Admission Control (CAC) mechanism with a prioritized handoff queuing scheme that aims to reduce the dropping probability of handoff calls. Handoff calls are queued when no bandwidth is available, even after the allowable bandwidth degradation of the ongoing calls, and are admitted into the network when an ongoing call terminates, with a higher priority than newly originated calls. An analytical Markov model for the proposed CAC mechanism is developed to analyze various performance parameters. Analytical results show that the proposed CAC with handoff queuing prioritizes handoff calls effectively and reduces the dropping probability of the system by 78.57% for real-time traffic without degrading the number of failed new-call attempts. This results in increased bandwidth utilization of the network.
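
    The Markov analysis of handoff prioritization can be illustrated with the classic guard-channel birth-death chain, a simpler cousin of the scheme above (it omits bandwidth degradation and the queuing of handoff calls); all names and values here are hypothetical:

```python
def guard_channel_probs(channels, guard, lam_new, lam_handoff, mu):
    """Stationary analysis of the guard-channel CAC chain: new calls are
    admitted only while fewer than (channels - guard) channels are busy;
    handoff calls are admitted until all channels are busy. Returns the
    new-call blocking and handoff-dropping probabilities."""
    threshold = channels - guard
    # Unnormalized stationary probabilities from birth-death balance:
    # p(n+1) = p(n) * arrival_rate(n) / ((n+1) * mu)
    p = [1.0]
    for n in range(channels):
        rate_in = (lam_new + lam_handoff) if n < threshold else lam_handoff
        p.append(p[-1] * rate_in / ((n + 1) * mu))
    z = sum(p)
    p = [x / z for x in p]
    new_block = sum(p[threshold:])   # new call rejected at or above threshold
    handoff_drop = p[channels]       # handoff dropped only when system is full
    return new_block, handoff_drop
```

    With guard channels reserved, handoff dropping falls below new-call blocking, which is the prioritization effect the paper's fuller model quantifies.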

  5. Assessing the Queuing Process Using Data Envelopment Analysis: an Application in Health Centres.

    PubMed

    Safdar, Komal A; Emrouznejad, Ali; Dey, Prasanta K

    2016-01-01

    Queuing is one of the most important criteria for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most studies of patient flow systems have had the objective of improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at an outpatients' department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor and pharmacy. The DEA results indicated that the waiting times and other related queuing variables included in the study need considerable minimisation at both stages. PMID:26558394

  6. Teaching a Theory-Based Sociology of Gender Course.

    ERIC Educational Resources Information Center

    Blee, Kathleen M.

    1986-01-01

    Presents the advantages of structuring a course on gender around the classical and contemporary sociological theories from which feminist theories are derived. Contrasts this approach against the more prevalent research-based approach. Provides suggestions on selecting and using theory and presents practical considerations in teaching such a…

  7. Theory Based Approaches to Learning. Implications for Adult Educators.

    ERIC Educational Resources Information Center

    Bolton, Elizabeth B.; Jones, Edward V.

    This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…

  8. A Theory-Based Approach to Public Relations Ethics.

    ERIC Educational Resources Information Center

    Bivens, Thomas H.

    1991-01-01

    Discusses the three areas that need to be addressed when considering the most beneficial context for teaching public relations ethics: core concepts and theories; relevant ethical theories; and the context in which the theory-based approach should be taught. (MG)

  9. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

  10. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

  11. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

  12. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

  13. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 23 Highways 1 2011-04-01 2011-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt...

  14. Maximum entropy principle based estimation of performance distribution in queueing theory.

    PubMed

    He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

    2014-01-01

    In related research on queuing systems, in order to determine the system state, it is widespread practice to assume that the system is stable and that the distributions of the customer arrival ratio and service ratio are known. In this study, the queuing system is treated as a black box, without any assumptions on the distributions of the arrival and service ratios, keeping only the assumption that the queuing system is stable. By applying the principle of maximum entropy, the performance distribution of queuing systems is derived from some easily accessible indexes, such as the capacity of the system, the mean number of customers in the system, and the mean utilization of the servers. Some special cases are modeled and their performance distributions are derived. Using the chi-square goodness-of-fit test, the accuracy and practical generality of the maximum entropy approach are demonstrated. PMID:25207992
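
    The flavor of the approach can be sketched numerically: on a finite state space {0, …, K}, the entropy-maximizing distribution subject only to a mean-number-in-system constraint is truncated-geometric, and its parameter can be recovered by bisection. This is a sketch under those assumptions, not the paper's full derivation:

```python
from math import isclose

def max_entropy_queue_dist(capacity, mean_n, tol=1e-12):
    """Maximum-entropy distribution of the number in system on {0..capacity}
    given only the mean (0 < mean_n < capacity). The maximizer is
    truncated-geometric, p(n) proportional to x**n; x is found by bisection
    on the mean it implies."""
    def mean_of(x):
        if isclose(x, 1.0):
            return capacity / 2.0                 # uniform limit
        weights = [x**n for n in range(capacity + 1)]
        z = sum(weights)
        return sum(n * w for n, w in enumerate(weights)) / z
    lo, hi = 1e-9, 1e9                            # mean_of is increasing in x
    while hi - lo > tol * max(1.0, lo):
        mid = (lo + hi) / 2
        if mean_of(mid) < mean_n:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    weights = [x**n for n in range(capacity + 1)]
    z = sum(weights)
    return [w / z for w in weights]
```

    The returned distribution can then be compared against observed queue-length frequencies with a chi-square goodness-of-fit test, as the paper does for its cases.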

  15. Aviation security cargo inspection queuing simulation model for material flow and accountability

    SciTech Connect

    Olama, Mohammed M; Allgood, Glenn O; Rose, Terri A; Brumback, Daryl L

    2009-01-01

    Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we develop an aviation security cargo inspection queuing simulation model for material flow and accountability that will allow cargo managers to conduct impact studies of current and proposed business practices as they relate to inspection procedures, material flow, and accountability.

  16. Aviation security cargo inspection queuing simulation model for material flow and accountability

    NASA Astrophysics Data System (ADS)

    Allgood, Glenn O.; Olama, Mohammed M.; Rose, Terri A.; Brumback, Daryl

    2009-05-01

    Beginning in 2010, the U.S. will require that all cargo loaded in passenger aircraft be inspected. This will require more efficient processing of cargo and will have a significant impact on the inspection protocols and business practices of government agencies and the airlines. In this paper, we develop an aviation security cargo inspection queuing simulation model for material flow and accountability that will allow cargo managers to conduct impact studies of current and proposed business practices as they relate to inspection procedures, material flow, and accountability.

  17. Theory-Based University Admissions Testing for a New Millennium

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2004-01-01

    This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…

  18. A Theory-Based Computer Tutorial Model.

    ERIC Educational Resources Information Center

    Dixon, Robert C.; Clapp, Elizabeth J.

    Because of the need for models to illustrate some possible answers to practical courseware development questions, a specific, three-section model incorporating the Corrective Feedback Paradigm (PCP) is advanced for applying theory to courseware. The model is reconstructed feature-by-feature against a framework of a hypothetical, one-to-one,…

  19. Theory of fracture mechanics based upon plasticity

    NASA Technical Reports Server (NTRS)

    Lee, J. D.

    1976-01-01

    A theory of fracture mechanics is formulated on the foundation of continuum mechanics. The fracture surface is introduced as an unknown quantity and is incorporated into the boundary and initial conditions. Surface energy is included in the global form of the energy conservation law, and the dissipative mechanism is formulated into constitutive equations which indicate the thermodynamic irreversibility as well as the irreversibility of the fracture process.

  20. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    PubMed

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that Attachment theory-based and Social Learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms. PMID:24283669

  1. Theory of friction based on brittle fracture

    USGS Publications Warehouse

    Byerlee, J.D.

    1967-01-01

    A theory of friction is presented that may be more applicable to geologic materials than the classic Bowden and Tabor theory. In the model, surfaces touch at the peaks of asperities and sliding occurs when the asperities fail by brittle fracture. The coefficient of friction, μ, was calculated from the strength of asperities of certain ideal shapes; for cone-shaped asperities, μ is about 0.1 and for wedge-shaped asperities, μ is about 0.15. For actual situations which seem close to the ideal model, observed μ was found to be very close to 0.1, even for materials such as quartz and calcite with widely differing strengths. If surface forces are present, the theory predicts that μ should decrease with load and that it should be higher in a vacuum than in air. In the presence of a fluid film between sliding surfaces, μ should depend on the area of the surfaces in contact. Both effects are observed. The character of wear particles produced during sliding and the way in which μ depends on normal load, roughness, and environment lend further support to the model of friction presented here. © 1967 The American Institute of Physics.

  2. Task-Based Language Teaching and Expansive Learning Theory

    ERIC Educational Resources Information Center

    Robertson, Margaret

    2014-01-01

    Task-Based Language Teaching (TBLT) has become increasingly recognized as an effective pedagogy, but its location in generalized sociocultural theories of learning has led to misunderstandings and criticism. The purpose of this article is to explain the congruence between TBLT and Expansive Learning Theory and the benefits of doing so. The merit…

  3. Evaluation Theory in Problem-Based Learning Approach.

    ERIC Educational Resources Information Center

    Hsu, Yu-chen

    The purpose of this paper is to review evaluation theories and techniques in both the medical and educational fields and to propose an evaluation theory to explain the condition variables, the method variables, and the outcome variables of student assessment in a problem-based learning (PBL) approach. The PBL definition and process are presented,…

  4. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  5. Modeling the emergency cardiac in-patient flow: an application of queuing theory.

    PubMed

    de Bruin, Arnoud M; van Rossum, A C; Visser, M C; Koole, G M

    2007-06-01

    This study investigates the bottlenecks in the emergency care chain of cardiac in-patient flow. The primary goal is to determine the optimal bed allocation over the care chain given a maximum number of refused admissions. Another objective is to provide deeper insight into the relation between natural variation in arrivals and length of stay (LOS) and occupancy rates. Hospital management's strong focus on raising occupancy rates is unrealistic and counterproductive; economies of scale cannot be neglected. An important result is that refused admissions at the First Cardiac Aid (FCA) are primarily caused by unavailability of beds downstream in the care chain. Both variability in LOS and fluctuations in arrivals result in large workload variations. Techniques from operations research were successfully used to describe the complexity and dynamics of emergency in-patient flow. PMID:17608054
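
    The relation between occupancy, refused admissions, and scale can be illustrated with the Erlang loss (M/M/c/c) model commonly applied to such bed-allocation questions. This is a generic sketch, not a reproduction of the study's model or numbers:

```python
def erlang_b(servers, offered_load):
    """Blocking (refused-admission) probability of an M/M/c/c loss system,
    via the numerically stable Erlang-B recursion:
    B(0) = 1,  B(c) = a*B(c-1) / (c + a*B(c-1))."""
    b = 1.0
    for c in range(1, servers + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

def beds_needed(arrival_rate, mean_los_days, max_refusal_prob):
    """Smallest bed count keeping the refusal probability at or below target.
    Offered load a = arrival_rate * mean LOS (in erlangs, i.e. mean beds busy)."""
    a = arrival_rate * mean_los_days
    c = 1
    while erlang_b(c, a) > max_refusal_prob:
        c += 1
    return c
```

    The model exhibits the economies of scale the abstract stresses: doubling the offered load requires noticeably fewer than double the beds for the same refusal probability, so pooling wards outperforms managing each at a fixed occupancy target.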

  6. Research on the Optimization Method of Maintenance Support Unit Configuration with Queuing Theory

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Xu, Ying; Dong, Yue; Hou, Na; Yu, Yongli

    Beginning with the concept of a maintenance support unit, the maintenance support flow is analyzed so as to establish the relation between damaged-equipment mean time to repair and the number of maintenance support units. On that basis, a maintenance support unit configuration optimization model is formulated, aiming at the minimum cost of maintenance support resources, and its solution is given. Finally, the process is illustrated with an example.

  7. School-Based Management: Theory and Practice.

    ERIC Educational Resources Information Center

    George, Patricia, Ed.; Potter, Eugenia Cooper, Ed.

    School-based management (SBM), sometimes called site-based management, is fast becoming the hottest restructuring item in the arsenal of reformers, teachers' unions, governors, and legislators who want to change the traditional ways in which schools and school districts do business. This document comprises three main sections with contributions…

  8. Unifying ecology and macroevolution with individual-based theory.

    PubMed

    Rosindell, James; Harmon, Luke J; Etienne, Rampal S

    2015-05-01

    A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology. PMID:25818618

  9. Measurement Theory in Deutsch's Algorithm Based on the Truth Values

    NASA Astrophysics Data System (ADS)

    Nagata, Koji; Nakamura, Tadao

    2016-03-01

    We propose a new measurement theory for qubit handling, based on the truth values, i.e., the truth T (1) for true and the falsity F (0) for false. The results of measurement are either 0 or 1. To implement Deutsch's algorithm, we need both observability and controllability of a quantum state. The new measurement theory can satisfy both. In particular, we systematically describe our assertion based on mathematical analysis using raw data from a thought experiment.

  10. Modeling Air Traffic Management Technologies with a Queuing Network Model of the National Airspace System

    NASA Technical Reports Server (NTRS)

    Long, Dou; Lee, David; Johnson, Jesse; Gaier, Eric; Kostiuk, Peter

    1999-01-01

    This report describes an integrated model of air traffic management (ATM) tools under development in two National Aeronautics and Space Administration (NASA) programs: Terminal Area Productivity (TAP) and Advanced Air Transport Technologies (AATT). The model is made by adjusting parameters of LMINET, a queuing network model of the National Airspace System (NAS), which the Logistics Management Institute (LMI) developed for NASA. Operating LMINET with models of various combinations of TAP and AATT tools will give quantitative information about the effects of the tools on operations of the NAS. The costs of delays under different scenarios are calculated. An extension of the Air Carrier Investment Model (ACIM), which the Institute developed for NASA under ASAC, maps the technologies' impacts on NAS operations into cross-comparable benefits estimates for individual technologies and sets of technologies.
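    LMINET models airports as queuing servers. As a minimal stand-in for one such server (the actual LMINET queue models are considerably more detailed), the textbook M/M/1 formulas already show how delay grows sharply as utilization approaches 1:

```python
def mm1_stats(arrival_rate, service_rate):
    """Steady-state metrics of an M/M/1 queue, a textbook stand-in for a
    single airport server in a queuing network model of the NAS."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: utilization >= 1")
    rho = arrival_rate / service_rate      # server utilization
    L = rho / (1 - rho)                    # mean number in system
    W = 1 / (service_rate - arrival_rate)  # mean time in system (Little's law: L = rho * W * mu)
    return {"utilization": rho, "mean_in_system": L, "mean_delay": W}
```

    For example, at 54 arrivals and 60 services per hour, utilization is 0.9 and the mean time in system is already 1/6 hour (10 minutes); raising the service rate slightly, as an ATM tool might, cuts delay disproportionately.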

  11. Improved virtual queuing and dynamic EPD techniques for TCP over ATM

    SciTech Connect

    Wu, Y.; Siu, K.Y.; Ren, W.

    1998-11-01

    It is known that TCP throughput can degrade significantly over UBR service in a congested ATM network, and the early packet discard (EPD) technique has been proposed to improve the performance. However, recent studies show that EPD cannot ensure fairness among competing VCs in a congested network, though the degree of fairness can be improved using various forms of fair buffer allocation. The authors propose an improved scheme that uses only a single shared FIFO queue for all VCs and admits a simple implementation in high-speed ATM networks. The scheme achieves nearly perfect fairness and throughput among multiple TCP connections, comparable to the expensive per-VC queuing technique. Analytical and simulation results are presented to show the validity of this new scheme and its significant performance improvement over existing fair buffer allocation techniques for TCP over ATM.
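    A generic flavor of the fair-buffer-allocation EPD techniques discussed here can be sketched as follows. This is a baseline illustration of the class of schemes being compared, not the authors' improved algorithm, and the threshold logic is an assumption:

```python
def should_accept_packet(per_vc_count, vc, buffer_len, capacity, epd_threshold):
    """Fair-buffer-allocation flavored EPD check for a shared FIFO queue.

    A new packet from virtual circuit `vc` is accepted unless the buffer is
    past the EPD threshold AND the VC already holds more than its fair share
    of the buffered cells. per_vc_count maps VC id -> buffered cell count.
    """
    if buffer_len >= capacity:
        return False               # hard overflow: the packet must be dropped
    if buffer_len < epd_threshold:
        return True                # below the EPD threshold: accept everyone
    active = sum(1 for c in per_vc_count.values() if c > 0) or 1
    fair_share = buffer_len / active   # equal split among currently active VCs
    return per_vc_count.get(vc, 0) <= fair_share
```

    Dropping whole packets at the threshold (rather than individual cells) is what distinguishes EPD from plain tail drop; the per-VC comparison against the fair share is what restores fairness among competing connections.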

  12. Elastic theory of origami-based metamaterials

    NASA Astrophysics Data System (ADS)

    Brunck, V.; Lechenault, F.; Reid, A.; Adda-Bedia, M.

    2016-03-01

    Origami offers the possibility for new metamaterials whose overall mechanical properties can be programmed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would like to determine the shape taken by the structure at rest and its mechanical response. In this article, we introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n-creases meeting at a single vertex. Computations of the equilibrium states are then carried out explicitly in two special cases: the generalized waterbomb base and the Miura-Ori. For the waterbomb, we show a generic bistability for any number of creases. For the Miura folding, however, we uncover a phase transition from monostable to bistable states that explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. Moreover, the analysis shows that geometric frustration induces residual stresses in origami structures that should be taken into account in determining their mechanical response. This formalism can be extended to a general crease network, ordered or otherwise, and so opens new perspectives for the mechanics and the physics of origami-based metamaterials.

  13. Elastic theory of origami-based metamaterials

    NASA Astrophysics Data System (ADS)

    Lechenault, Frederic; Brunck, V.; Reid, A.; Adda-Bedia, M.

    Origami offers the possibility for new metamaterials whose overall mechanical properties can be programmed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would aim to determine the shape taken by the structure at rest and its mechanical response. We introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n-creases meeting at a single vertex, and is then extended to origami tessellations. The generalized waterbomb base and the Miura-Ori are treated within this formalism. For the Miura folding, we uncover a phase transition from a monostable state to two metastable states, which explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. This research was supported by ANR Grant 14-CE07-0031 METAMAT.

  14. On the Complexity of Constraint-Based Theory Extraction

    NASA Astrophysics Data System (ADS)

    Boley, Mario; Gärtner, Thomas

    In this paper we rule out output polynomial listing algorithms for the general problem of discovering theories for a conjunction of monotone and anti-monotone constraints as well as for the particular subproblem in which all constraints are frequency-based. For the general problem we prove a concrete exponential lower time bound that holds for any correct algorithm and even in cases in which the size of the theory as well as the only previous bound are constant. For the case of frequency-based constraints our result holds unless P = NP. These findings motivate further research to identify tractable subproblems and justify approaches with exponential worst case complexity.

  15. A Natural Teaching Method Based on Learning Theory.

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    1991-01-01

    The natural teaching method is active and student-centered, based on schema and constructivist theories, and informed by research in neuroplasticity. A schema is a mental picture or understanding of something we have learned. Humans can have knowledge only to the degree to which they have constructed schemas from learning experiences and practice.…

  16. A Memory-Based Theory of Verbal Cognition

    ERIC Educational Resources Information Center

    Dennis, Simon

    2005-01-01

    The syntagmatic paradigmatic model is a distributed, memory-based account of verbal processing. Built on a Bayesian interpretation of string edit theory, it characterizes the control of verbal cognition as the retrieval of sets of syntagmatic and paradigmatic constraints from sequential and relational long-term memory and the resolution of these…

  18. Toward an Instructionally Oriented Theory of Example-Based Learning

    ERIC Educational Resources Information Center

    Renkl, Alexander

    2014-01-01

    Learning from examples is a very effective means of initial cognitive skill acquisition. There is an enormous body of research on the specifics of this learning method. This article presents an instructionally oriented theory of example-based learning that integrates theoretical assumptions and findings from three research areas: learning from…

  19. Project-Based Language Learning: An Activity Theory Analysis

    ERIC Educational Resources Information Center

    Gibbes, Marina; Carson, Lorna

    2014-01-01

    This paper reports on an investigation of project-based language learning (PBLL) in a university language programme. Learner reflections of project work were analysed through Activity Theory, where tool-mediated activity is understood as the central unit of analysis for human interaction. Data were categorised according to the components of human…

  20. Theory-Based Considerations Influence the Interpretation of Generic Sentences

    ERIC Educational Resources Information Center

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., "Swans are beautiful") is true? We hypothesised that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on…

  1. A Model of Statistics Performance Based on Achievement Goal Theory.

    ERIC Educational Resources Information Center

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  2. A Discourse-Based Theory of Interdisciplinary Connections

    ERIC Educational Resources Information Center

    Nowacek, Rebecca S.

    2005-01-01

    Working inductively from classroom research and guided by a Bakhtinian view of language, the author proposes a discourse-based theory of interdisciplinary connections. The article includes examples of four discursive resources individuals draw on to make interdisciplinary connections--content, propositions, ways of knowing, and classroom…

  3. Reasserting Theory in Professionally Based Initial Teacher Education

    ERIC Educational Resources Information Center

    Hodson, Elaine; Smith, Kim; Brown, Tony

    2012-01-01

    Conceptions of theory within initial teacher education in England are adjusting to new conditions where most learning how to teach is school-based. Student teachers on a programme situated primarily in an employing school were monitored within a practitioner enquiry by their university programme tutors according to how they progressively…

  4. A Pedagogy of Blending Theory with Community-Based Research

    ERIC Educational Resources Information Center

    Brown, Kathleen Taylor

    2011-01-01

    Blending activity theory and community-based research educational applications describes the praxis achieved through the initial design, development, implementation, and assessment of one research methods course as a pedagogy to enhance and improve the outcomes of civic and community engagement for the university, its students, and the community.…

  5. Field-Based Concerns about Fourth-Generation Evaluation Theory.

    ERIC Educational Resources Information Center

    Lai, Morris K.

    Some aspects of fourth generation evaluation procedures that have been advocated by E. G. Guba and Y. S. Lincoln were examined empirically, with emphasis on areas where there have been discrepancies between theory and field-based experience. In fourth generation evaluation, the product of an evaluation is not a set of conclusions, recommendations,…

  6. Nano-resonator frequency response based on strain gradient theory

    NASA Astrophysics Data System (ADS)

    Maani Miandoab, Ehsan; Yousefi-Koma, Aghil; Nejat Pishkenari, Hossein; Fathi, Mohammad

    2014-09-01

    This paper aims to explore the dynamic behaviour of a nano-resonator under ac and dc excitation using strain gradient theory. To achieve this goal, the partial differential equation of nano-beam vibration is first converted to an ordinary differential equation by the Galerkin projection method and the lumped model is derived. Lumped parameters of the nano-resonator, such as linear and nonlinear springs and damper coefficients, are compared with those of classical theory and it is demonstrated that beams with smaller thickness display greater deviation from classical parameters. Stable and unstable equilibrium points based on classic and non-classical theories are also compared. The results show that, regarding the applied dc voltage, the dynamic behaviours expected by classical and non-classical theories are significantly different, such that one theory predicts the un-deformed shape as the stable condition, while the other theory predicts that the beam will experience bi-stability. To obtain the frequency response of the nano-resonator, a general equation including cubic and quadratic nonlinearities in addition to parametric electrostatic excitation terms is derived, and the analytical solution is determined using a second-order multiple scales method. Based on frequency response analysis, the softening and hardening effects given by two theories are investigated and compared, and it is observed that neglecting the size effect can lead to two completely different predictions in the dynamic behaviour of the resonators. The findings of this article can be helpful in the design and characterization of the size-dependent dynamic behaviour of resonators on small scales.
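    The softening and hardening behavior analyzed above can be illustrated with the first-order multiple-scales backbone curve of a pure Duffing oscillator; the paper's lumped nano-resonator equation additionally carries quadratic nonlinearities and parametric electrostatic excitation, which are omitted in this sketch:

```python
def backbone_frequency(omega0, alpha, amplitude):
    """First-order multiple-scales backbone curve for the Duffing oscillator
    x'' + omega0**2 * x + alpha * x**3 = 0:

        omega(a) = omega0 + 3 * alpha * a**2 / (8 * omega0)

    alpha > 0 gives hardening (resonance frequency rises with amplitude),
    alpha < 0 gives softening (it falls). Classical vs. strain-gradient
    theory effectively yield different omega0 and alpha for the same beam,
    hence the different predicted frequency responses."""
    return omega0 + 3.0 * alpha * amplitude**2 / (8.0 * omega0)
```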

  7. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  8. Innovating Method of Existing Mechanical Product Based on TRIZ Theory

    NASA Astrophysics Data System (ADS)

    Zhao, Cunyou; Shi, Dongyan; Wu, Han

    The main ways of developing products are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for product innovation are put forward by combining conceptual design methods with TRIZ theory. A process model of innovative design is constructed, comprising requirement analysis, total function analysis and decomposition, engineering problem analysis, solution finding, and preliminary design; this establishes the basis for the innovative redesign of existing products.

  9. Theory-Based Programme Development and Evaluation in Physiotherapy

    PubMed Central

    Kay, Theresa; Klinck, Beth

    2008-01-01

    ABSTRACT Purpose: Programme evaluation has been defined as “the systematic process of collecting credible information for timely decision making about a particular program.” Where possible, findings are used to develop, revise, and improve programmes. Theory-based programme development and evaluation provides a comprehensive approach to programme evaluation. Summary of key points: In order to obtain meaningful information from evaluation activities, relevant programme components need to be understood. Theory-based programme development and evaluation starts with a comprehensive description of the programme. A useful tool to describe a programme is the Sidani and Braden Model of Program Theory, consisting of six programme components: problem definition, critical inputs, mediating factors, expected outcomes, extraneous factors, and implementation issues. Articulation of these key components may guide physiotherapy programme implementation and delivery and assist in the development of key evaluation questions and methodologies. Using this approach leads to a better understanding of client needs, programme processes, and programme outcomes and can help to identify barriers to and enablers of successful implementation. Two specific examples, representing public and private sectors, will illustrate the application of this approach to clinical practice. Conclusions: Theory-based programme development helps clinicians, administrators, and researchers develop an understanding of who benefits the most from which types of programmes and facilitates the implementation of processes to improve programmes. PMID:20145741

  10. Generalizations of Gravitational Theory Based on Group Covariance

    NASA Astrophysics Data System (ADS)

    Halpern, Leopold

    1982-10-01

    The mathematical structure, the field equations, and fundamentals of the kinematics of generalizations of general relativity based on semisimple invariance groups are presented. The structure is that of a generalized Kaluza-Klein theory with a subgroup as the gauge group. The group manifold with its Cartan-Killing metric forms the source-free solution. The gauge fields do not vanish even in this case and give rise to additional modes of free motion. The case of the de Sitter groups is presented as an example where the gauge field is tentatively assumed to mediate a spin interaction and give rise to spin motion. Generalization to the conformal group and a theory yielding features of Dirac's large-number hypothesis are discussed. The possibility of further generalizations to include fermions is pointed out. The Kaluza-Klein theory is formulated in terms of principal fibre bundles, which need not be trivial.

  11. Two-scale mechanism-based theory of nonlinear viscoelasticity

    NASA Astrophysics Data System (ADS)

    Tang, Shan; Steven Greene, M.; Liu, Wing Kam

    2012-02-01

    The paper presents a mechanism-based two-scale theory for a generalized nonlinear viscoelastic continuum. The continuum is labeled as generalized since it contains extra degrees of freedom typical of past high-order continuum theories, though a new formulation is presented here tailored to meet the needs of the physical description of the viscoelastic solid. The microstress that appears in the equations, often criticized for a lack of physical meaning, is assigned in this work to viscous free chains superimposed on a nonlinear elastic backbone composed of crosslinks and reinforcement. Mathematically, hyperelasticity is used to describe the equilibrium backbone (macroscale), and an improvement of tube models for reptation dynamics describes the free chain motion at the microscale. Inhomogeneous deformation is described by inclusion of a microstrain gradient into the formulation. Thus, the theory is nicely suited for materials with microstructure where localization of strains and inhomogeneous deformation occur in addition to viscoelastic damping mechanisms due to free chains. Besides the microstress, physical meaning of the additional boundary conditions arising in the general theory is also presented. Since the proposed material model is mechanism-based, macroscopic performances are functions of microstructural variables describing the polymer chemistry so that parametric material design concepts may be gleaned from the model. Several physical phenomena are captured through numerical simulation of the class of materials of interest: size effects, strain localization, and the fracture process. Results agree qualitatively with both experimental data and direct numerical simulation for filled elastomeric solids.

  12. Ensemble method: Community detection based on game theory

    NASA Astrophysics Data System (ADS)

    Zhang, Xia; Xia, Zhengyou; Xu, Shengwu; Wang, J. D.

    2014-08-01

    Timely and cost-effective analytics over social networks has emerged as a key ingredient for success in many businesses and government endeavors. Community detection is an active research area for analyzing online social networks. The choice of a particular community detection algorithm is crucial if the aim is to unveil the community structure of a network, and it can affect the outcome of experiments because different algorithms have different advantages and depend on tuning specific parameters. In this paper, we propose a community division model based on game theory, which combines the advantages of previous algorithms to obtain a better community classification. Experiments on standard datasets verify that our game-theory-based community detection model is valid and performs better.

  13. Infrared small target detection based on Danger Theory

    NASA Astrophysics Data System (ADS)

    Lan, Jinhui; Yang, Xiao

    2009-11-01

    To overcome the inability of traditional methods to detect small objects whose local SNR is less than 2 in IR images, a Danger Theory-based model for infrared small-target detection is presented in this paper. First, by analogy with immunology, definitions are given for terms such as danger signal, antigen, APC, and antibody, and the matching rule between antigen and antibody is improved. Prior to training the detection model and detecting targets, the IR images are processed with an adaptive smoothing filter to reduce stochastic noise. During training, the deletion, generation, crossover, and mutation rules are established after a large number of experiments in order to achieve fast convergence and obtain good antibodies. The Danger Theory-based model is built after the training process, and it can detect targets whose local SNR is as low as 1.5.

  14. Research on Capturing of Customer Requirements Based on Innovation Theory

    NASA Astrophysics Data System (ADS)

    junwu, Ding; dongtao, Yang; zhenqiang, Bao

    To capture customer requirements information exactly and effectively, a new method for modeling the capture of customer requirements is proposed. Based on analysis of the function requirement models of previous products and application of the technology system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements can be evolved from existing product designs by modifying the functional requirement units and confirming the direction of the design's evolution. Finally, a case study illustrates the feasibility of the proposed approach.

  15. Theory-based considerations influence the interpretation of generic sentences

    PubMed Central

    Cimpian, Andrei; Gelman, Susan A.; Brandone, Amanda C.

    2010-01-01

    Under what circumstances do people agree that a kind-referring generic sentence (e.g., “Swans are beautiful”) is true? We hypothesized that theory-based considerations are sufficient, independently of prevalence/frequency information, to lead to acceptance of a generic statement. To provide evidence for this general point, we focused on demonstrating the impact of a specific theory-based, essentialist expectation–that the physical features characteristic of a biological kind emerge as a natural product of development–on participants’ reasoning about generics. Across 3 studies, adult participants (N = 99) confirmed our hypothesis, preferring to map generic sentences (e.g., “Dontrets have long tails”) onto novel categories for which the key feature (e.g., long tails) was absent in all the young but present in all the adults rather than onto novel categories for which the key feature was at least as prevalent but present in some of the young and in some of the adults. Control conditions using “some”- and “most”-quantified sentences demonstrated that this mapping is specific to generic meaning. These results suggest that generic meaning does not reduce to quantification and is sensitive to theory-based expectations. PMID:20352078

  16. What Communication Theories Can Teach the Designer of Computer-Based Training.

    ERIC Educational Resources Information Center

    Larsen, Ronald E.

    1985-01-01

    Reviews characteristics of computer-based training (CBT) that make application of communication theories appropriate and presents principles from communication theory (e.g., general systems theory, symbolic interactionism, rule theories, and interpersonal communication theories) to illustrate how CBT developers can profitably apply them to…

  17. Uncertainty analysis of groundwater modeling based on information entropy theory

    NASA Astrophysics Data System (ADS)

    Zeng, X.; Wu, J.

    2013-12-01

    Because of uncertainty in groundwater conceptualization, multi-model methods are usually used, and the corresponding uncertainties are estimated by integrating the generalized likelihood uncertainty estimation (GLUE) and Bayesian model averaging (BMA) methods. For uncertainty assessment, it is crucial to select an appropriate theory to define and measure uncertainty. Traditionally, the variance method is applied to measure the uncertainties of BMA prediction: the total variance of the ensemble prediction is decomposed into within-model and between-model variances, which represent the uncertainties from parameters and from the conceptual model, respectively. However, variance is not a perfect measure of the uncertainty of a probability distribution. Furthermore, the overlapping parameter uncertainty that arises when multiple models' predictions are combined cannot be appropriately represented by the variance method. A new measuring method based on information entropy theory is developed in this study. Information entropy is a general method for measuring the uncertainty of a predictive distribution, and the predictive uncertainty of the BMA ensemble prediction is appropriately partitioned by this method. Based on a synthetic groundwater model, the variance and information entropy methods are used to assess groundwater modeling uncertainties. The comparison indicates that the information entropy method is more informative and authentic for measuring groundwater modeling uncertainty than the variance method. The variance method is characterized by a clear mechanism, easy computation, and easily understandable assessment results; the strengths of the information entropy method lie in its reliable theoretical foundation and its rational derivation of the partition of BMA predictive uncertainty.
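    The entropy-based measure advocated here can be illustrated on a small discrete BMA mixture; the model weights and predictive distributions below are made up for illustration:

```python
import math

def bma_mixture(model_probs, model_dists):
    """BMA predictive distribution p(y) = sum_k P(M_k) * p(y | M_k),
    with each per-model distribution given as a dict y -> probability."""
    support = sorted({y for d in model_dists for y in d})
    return {y: sum(w * d.get(y, 0.0) for w, d in zip(model_probs, model_dists))
            for y in support}

def entropy(dist):
    """Shannon entropy H = -sum p*log(p) of a discrete distribution (nats)."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

def variance(dist):
    """Variance of a discrete distribution, for comparison with entropy."""
    mean = sum(y * p for y, p in dist.items())
    return sum(p * (y - mean) ** 2 for y, p in dist.items())
```

    Applying both measures to the same BMA mixture makes the abstract's point concrete: entropy summarizes the spread of the whole predictive distribution, while variance compresses it into a single second moment.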

  18. Experimental energy consumption of Frame Slotted ALOHA and Distributed Queuing for data collection scenarios.

    PubMed

    Tuset-Peiro, Pere; Vazquez-Gallego, Francisco; Alonso-Zarate, Jesus; Alonso, Luis; Vilajosana, Xavier

    2014-01-01

    Data collection is a key scenario for the Internet of Things because it enables gathering sensor data from distributed nodes that use low-power and long-range wireless technologies to communicate in a single-hop approach. In this kind of scenario, the network is composed of one coordinator that covers a particular area and a large number of nodes, typically hundreds or thousands, that transmit data to the coordinator upon request. Considering this scenario, in this paper we experimentally validate the energy consumption of two Medium Access Control (MAC) protocols, Frame Slotted ALOHA (FSA) and Distributed Queuing (DQ). We model both protocols as a state machine and conduct experiments to measure the average energy consumption in each state and the average number of times that a node has to be in each state in order to transmit a data packet to the coordinator. The results show that FSA is more energy efficient than DQ if the number of nodes is known a priori because the number of slots per frame can be adjusted accordingly. However, in such scenarios the number of nodes cannot be easily anticipated, leading to additional packet collisions and a higher energy consumption due to retransmissions. In contrast, DQ does not require knowledge of the number of nodes in advance because it can efficiently construct an ad hoc network schedule for each collection round. Such a schedule ensures that there are no packet collisions during data transmission, thus leading to an energy consumption reduction above 10% compared to FSA. PMID:25061839
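    The paper's accounting reduces to a dot product: the average energy spent per visit to each state times the average number of visits to that state per delivered packet. A sketch with hypothetical state names and numbers:

```python
def expected_energy_per_packet(avg_energy, avg_visits):
    """Average energy to deliver one packet, given the per-state average
    energy per visit (joules) and the average number of visits to each
    state, following the state-machine methodology of the paper. The state
    names used by callers are illustrative, not the paper's."""
    assert avg_energy.keys() == avg_visits.keys(), "state sets must match"
    return sum(avg_energy[s] * avg_visits[s] for s in avg_energy)
```

    Comparing two protocols then amounts to plugging in their measured per-state averages; a protocol with more retransmission-state visits (as FSA suffers when the node count is misestimated) pays proportionally more energy.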

  20. A Danger-Theory-Based Immune Network Optimization Algorithm

    PubMed Central

    Li, Tao; Xiao, Xin; Shi, Yuanquan

    2013-01-01

    Existing artificial immune optimization algorithms suffer from a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated from changes of environments will guide different levels of immune responses, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibodies' concentrations through its own danger signals and then triggers immune responses of self-regulation. So the population diversity can be maintained. Experimental results show that the algorithm has more advantages in the solution quality and diversity of the population. Compared with influential optimization algorithms, CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions that meet the required accuracies within the specified number of function evaluations. PMID:23483853

  1. Quantum game theory based on the Schmidt decomposition

    NASA Astrophysics Data System (ADS)

    Ichikawa, Tsubasa; Tsutsui, Izumi; Cheon, Taksu

    2008-04-01

    We present a novel formulation of quantum game theory based on the Schmidt decomposition, which has the merit that the entanglement of quantum strategies is manifestly quantified. We apply this formulation to 2-player, 2-strategy symmetric games and obtain a complete set of quantum Nash equilibria. Apart from those available with the maximal entanglement, these quantum Nash equilibria are extensions of the Nash equilibria in classical game theory. The phase structure of the equilibria is determined for all values of entanglement, and thereby the possibility of resolving the dilemmas by entanglement in the game of Chicken, the Battle of the Sexes, the Prisoners' Dilemma, and the Stag Hunt, is examined. We find that entanglement transforms these dilemmas with each other but cannot resolve them, except in the Stag Hunt game where the dilemma can be alleviated to a certain degree.
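    For two-qubit pure states like the quantum strategies above, the Schmidt coefficients can be computed by hand, since the reduced density matrix is only 2x2. This is standard textbook linear algebra, not the authors' specific formulation:

```python
import math

def schmidt_coefficients(c00, c01, c10, c11):
    """Schmidt coefficients of a normalized two-qubit pure state
    |psi> = c00|00> + c01|01> + c10|10> + c11|11>.
    They are the singular values of the 2x2 amplitude matrix C, computed
    here from the eigenvalues of the reduced density matrix C C^dagger."""
    c00, c01, c10, c11 = (complex(x) for x in (c00, c01, c10, c11))
    a = abs(c00) ** 2 + abs(c01) ** 2                       # (C C†)_00
    d = abs(c10) ** 2 + abs(c11) ** 2                       # (C C†)_11
    b = c00 * c10.conjugate() + c01 * c11.conjugate()       # (C C†)_01
    disc = math.sqrt((a - d) ** 2 + 4 * abs(b) ** 2)
    lam1 = (a + d + disc) / 2
    lam2 = max((a + d - disc) / 2, 0.0)
    return math.sqrt(lam1), math.sqrt(lam2)

def entanglement_entropy(c00, c01, c10, c11):
    """Entropy of entanglement in bits: 0 for product states, 1 for Bell states."""
    return -sum(k ** 2 * math.log2(k ** 2)
                for k in schmidt_coefficients(c00, c01, c10, c11) if k > 1e-12)
```

    A maximally entangled Bell state gives two equal Schmidt coefficients and one bit of entanglement, which is exactly the regime where, per the abstract, quantum Nash equilibria beyond the classical ones become available.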

  2. A Kendama Learning Robot Based on Bi-directional Theory.

    PubMed

    Kawato, Mitsuo; Wada, Yasuhiro; Nakano, Eri; Osu, Rieko; Koike, Yasuharu; Gomi, Hiroaki; Gandolfo, Francesca; Schaal, Stefan; Miyamoto, Hiroyuki

    1996-11-01

    A general theory of movement-pattern perception based on bi-directional theory for sensory-motor integration can be used for motion capture and learning by watching in robotics. We demonstrate our methods using the game of Kendama, executed by the SARCOS Dextrous Slave Arm, which has a very similar kinematic structure to the human arm. Three ingredients have to be integrated for the successful execution of this task. The ingredients are (1) to extract via-points from a human movement trajectory using a forward-inverse relaxation model, (2) to treat via-points as a control variable while reconstructing the desired trajectory from all the via-points, and (3) to modify the via-points for successful execution. In order to test the validity of the via-point representation, we utilized a numerical model of the SARCOS arm, and examined the behavior of the system under several conditions. Copyright 1996 Elsevier Science Ltd. PMID:12662536

  3. Control theory based airfoil design using the Euler equations

    NASA Technical Reports Server (NTRS)

    Jameson, Antony; Reuther, James

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two dimensions by procedures that can readily be generalized to treat complex shapes in three dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and the drag minimization problem.

  4. Theory of networked minority games based on strategy pattern dynamics.

    PubMed

    Lo, T S; Chan, H Y; Hui, P M; Johnson, N F

    2004-11-01

    We formulate a theory of agent-based models in which agents compete to be in a winning group. The agents may be part of a network or not, and the winning group may be a minority group or not. An important feature of the present formalism is its focus on the dynamical pattern of strategy rankings, and its careful treatment of the strategy ties which arise during the system's temporal evolution. We apply it to the minority game with connected populations. Expressions for the mean success rate among the agents and for the mean success rate for agents with k neighbors are derived. We also use the theory to estimate the value of connectivity p above which the binary-agent-resource system with high resource levels makes the transition into the high-connectivity state. PMID:15600687

  5. Effects of gauge theory based number scaling on geometry

    NASA Astrophysics Data System (ADS)

    Benioff, Paul

    2013-05-01

    Effects of local availability of mathematics (LAM) and space time dependent number scaling on physics and, especially, geometry are described. LAM assumes separate mathematical systems as structures at each space time point. Extension of gauge theories to include freedom of choice of scaling for number structures, and other structures based on numbers, results in a space time dependent scaling factor based on a scalar boson field. Scaling has no effect on comparison of experimental results with one another or with theory computations. With LAM all theory expressions are elements of mathematics at some reference point. Changing the reference point introduces (external) scaling. Theory expressions with integrals or derivatives over space or time include scaling factors (internal scaling) that cannot be removed by reference point change. Line elements and path lengths, as integrals over space and/or time, show the effect of scaling on geometry. In one example, the scaling factor goes to 0 as the time goes to 0, the big bang time. All path lengths, and values of physical quantities, are crushed to 0 as t goes to 0. Other examples have spherically symmetric scaling factors about some point, x. In one type, a black scaling hole, the scaling factor goes to infinity as the distance, d, between any point y and x goes to 0. For scaling white holes, the scaling factor goes to 0 as d goes to 0. For black scaling holes, path lengths from a reference point, z, to y become infinite as y approaches x. For white holes, path lengths approach a value much less than the unscaled distance from z to x.

  6. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. Principal components analysis is used to screen the index system, and a hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the population. The forewarning level is judged by the maximum-probability rule, and management strategies are then proposed for local conditions with the aim of downgrading heavy warnings. This study takes Taihu Basin as an example. After application and verification of the forewarning model against actual and simulated data on water pollution risk from 2000 to 2009, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. The model is shown to be theoretically rigorous yet methodologically flexible, with reasonable results and a simple structure, and its logical soundness and regional adaptability provide a new way to warn of water pollution risk. PMID:24194413
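
    The Bayes updating and maximum-probability decision described above can be sketched as follows (the warning levels, prior, and likelihoods are invented for illustration; the paper's actual index system is far richer):

```python
# Toy Bayes forewarning: posterior over warning levels, then argmax.
# Levels, prior, and likelihood values are hypothetical illustrations.
levels = ["light", "moderate", "severe"]
prior = [0.5, 0.3, 0.2]        # prior P(level)
likelihood = [0.1, 0.3, 0.9]   # P(observed index value | level)

joint = [p * l for p, l in zip(prior, likelihood)]
posterior = [j / sum(joint) for j in joint]      # Bayes rule
warning = levels[posterior.index(max(posterior))]  # maximum-probability rule
print(warning)  # the maximum-probability rule picks "severe" here
```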

  7. A model of resurgence based on behavioral momentum theory.

    PubMed

    Shahan, Timothy A; Sweeney, Mary M

    2011-01-01

    Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforcement during extinction of a target response acts as both an additional source of disruption during extinction and as a source of reinforcement in the context that increases the future strength of the target response. The model does a good job accounting for existing data in the resurgence literature and makes novel and testable predictions. Thus, the model appears to provide a framework for understanding resurgence and serves to integrate the phenomenon into the existing theoretical account of persistence provided by behavioral momentum theory. In addition, we discuss some potential implications of the model for further development of behavioral momentum theory. PMID:21541118

  8. Similarity theory based on the Dougherty-Ozmidov length scale

    NASA Astrophysics Data System (ADS)

    Grachev, Andrey A.; Andreas, Edgar L.; Fairall, Christopher W.; Guest, Peter S.; Persson, P. Ola G.

    2015-07-01

    A local similarity theory is suggested based on the Brunt-Vaisala frequency and the dissipation rate of turbulent kinetic energy instead of the turbulent fluxes used in traditional Monin-Obukhov similarity theory. Based on dimensional analysis (the Pi theorem), it is shown that any properly scaled statistics of the small-scale turbulence are universal functions of a stability parameter defined as the ratio of a reference height z and the Dougherty-Ozmidov length scale, which in the limit of z-less stratification is linearly proportional to the Obukhov length scale. Measurements of atmospheric turbulence made at five levels on a 20-m tower over the Arctic pack ice during the Surface Heat Budget of the Arctic Ocean (SHEBA) experiment are used to examine the behaviour of different similarity functions in the stable boundary layer. It is found that in the framework of this approach the non-dimensional turbulent viscosity is equal to the gradient Richardson number, whereas the non-dimensional turbulent thermal diffusivity is equal to the flux Richardson number. These results are a consequence of the approximate local balance between production of turbulence by the mean flow shear and viscous dissipation. The turbulence framework based on the Brunt-Vaisala frequency and the dissipation rate of turbulent kinetic energy may have practical advantages for estimating turbulence when the fluxes are not directly available.
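
    The Dougherty-Ozmidov length scale combines the two governing quantities named above; a minimal computation of the scale and the stability parameter z/L_O might look like this (the sample values of the dissipation rate and buoyancy frequency are arbitrary):

```python
import math

def ozmidov_scale(epsilon, N):
    """Dougherty-Ozmidov length scale L_O = (epsilon / N**3) ** 0.5,
    with epsilon the TKE dissipation rate [m^2 s^-3] and N the
    Brunt-Vaisala frequency [s^-1]."""
    return math.sqrt(epsilon / N**3)

epsilon = 1e-4   # illustrative dissipation rate
N = 0.02         # illustrative buoyancy frequency
z = 10.0         # reference height [m]
L_O = ozmidov_scale(epsilon, N)
print(z / L_O)   # the stability parameter used in place of z/L
```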

  9. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, landowners on each river bank tend to optimize their levees independently with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the landowners on each river bank develop their design strategies using risk-based economic optimization. For each landowner, the annual expected total cost includes the expected annual damage cost and the annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.

  10. Enhancement of infrared image based on the Retinex theory.

    PubMed

    Li, Ying; Hou, Changzhi; Tian, Fu; Yu, Hongli; Guo, Lei; Xu, Guizhi; Shen, Xueqin; Yan, Weili

    2007-01-01

    The infrared imaging technique can be used to image the temperature distribution of the body, and it holds promise for the diagnosis and prediction of many diseases. Image processing is necessary to enhance the original infrared images because of their blurring. In this paper, image enhancement techniques based on the Retinex theory are studied. Algorithms such as the Frankle-McCann algorithm, the McCann99 algorithm, the single-scale Retinex algorithm, and the multi-scale Retinex algorithm are applied to the enhancement of grayscale infrared images. Acceptable results are obtained and compared. PMID:18002705
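
    Single-scale Retinex, one of the algorithms mentioned, subtracts a log-domain Gaussian-smoothed illumination estimate from the log image. A minimal sketch (the scale sigma and the epsilon guard are illustrative choices, not taken from the paper) is:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(image, sigma=30.0):
    """Single-scale Retinex: R = log(I) - log(G_sigma * I), where
    G_sigma * I is a Gaussian-blurred illumination estimate.
    A small epsilon avoids log(0); sigma is an illustrative choice."""
    eps = 1e-6
    img = image.astype(float) + eps
    blurred = gaussian_filter(img, sigma) + eps
    return np.log(img) - np.log(blurred)

gray = np.random.default_rng(0).uniform(0, 255, size=(64, 64))
enhanced = single_scale_retinex(gray, sigma=10.0)
print(enhanced.shape)
```

    Multi-scale Retinex would average such outputs over several sigma values.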

  11. Transportation optimization with fuzzy trapezoidal numbers based on possibility theory.

    PubMed

    He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

    2014-01-01

    In this paper, a parametric method is introduced to solve the fuzzy transportation problem. Considering that the parameters of the transportation problem are uncertain, this paper develops a generalized fuzzy transportation problem with fuzzy supply, demand, and cost. For simplicity, these parameters are assumed to be fuzzy trapezoidal numbers. Based on possibility theory and consistent with decision-makers' subjectiveness and practical requirements, the fuzzy transportation problem is transformed into a crisp linear transportation problem by defuzzifying the fuzzy constraints and objectives with the application of the fractile and modality approach. Finally, a numerical example is provided to exemplify the application of fuzzy transportation programming and to verify the validity of the proposed methods. PMID:25137239
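
    A common way to reduce a fuzzy trapezoidal number (a, b, c, d) to a crisp value, as required when transforming the fuzzy problem into a crisp linear one, is a simple centroid-style average. Whether the paper uses exactly this defuzzifier is not stated here, so treat it as an illustrative stand-in for the fractile/modality approach:

```python
def defuzzify_trapezoid(a, b, c, d):
    """Crisp representative of a trapezoidal fuzzy number with
    support [a, d] and core [b, c]. This simple average is one
    common defuzzifier; the paper's fractile/modality approach
    may differ."""
    return (a + b + c + d) / 4.0

# Fuzzy supply "about 100 units" (hypothetical numbers):
print(defuzzify_trapezoid(90, 95, 105, 110))  # 100.0
```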

  12. Identifying influential nodes in weighted networks based on evidence theory

    NASA Astrophysics Data System (ADS)

    Wei, Daijun; Deng, Xinyang; Zhang, Xiaoge; Deng, Yong; Mahadevan, Sankaran

    2013-05-01

    The design of an effective ranking method to identify influential nodes is an important problem in the study of complex networks. In this paper, a new centrality measure is proposed based on the Dempster-Shafer evidence theory. The proposed measure trades off between the degree and strength of every node in a weighted network. The influences of both the degree and the strength of each node are represented by basic probability assignment (BPA). The proposed centrality measure is determined by the combination of these BPAs. Numerical examples are used to illustrate the effectiveness of the proposed method.
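
    Dempster's rule of combination, which merges the degree-based and strength-based BPAs in the method above, can be sketched for two mass functions over a small frame (the frame and mass values below are hypothetical, not from the paper):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for mass functions whose focal elements are
    frozensets. Conflicting mass (empty intersections) is discarded
    and the remaining mass renormalised."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

H, L = frozenset({"high"}), frozenset({"low"})
HL = H | L
m_degree = {H: 0.6, L: 0.1, HL: 0.3}     # BPA from node degree (made up)
m_strength = {H: 0.5, L: 0.2, HL: 0.3}   # BPA from node strength (made up)
print(dempster_combine(m_degree, m_strength))
```

    The combined mass on "high" would then serve as the node's centrality score.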

  13. Resource based view: a promising new theory for healthcare organizations

    PubMed Central

    Ferlie, Ewan

    2014-01-01

    This commentary reviews a recent piece by Burton and Rycroft-Malone on the use of the Resource Based View (RBV) in healthcare organizations. It first outlines the core content of their piece. It then discusses their attempt to extend RBV to the analysis of large-scale quality improvement efforts in healthcare, and elaborates some critiques. The broader question of why RBV seems to be migrating into healthcare management research is considered. The commentary concludes that RBV is a promising new theory for healthcare organizations. PMID:25396211

  14. Developing a Theory-Based Simulation Educator Resource.

    PubMed

    Thomas, Christine M; Sievers, Lisa D; Kellgren, Molly; Manning, Sara J; Rojas, Deborah E; Gamblian, Vivian C

    2015-01-01

    The NLN Leadership Development Program for Simulation Educators 2014 faculty development group identified a lack of a common language/terminology to outline the progression of expertise of simulation educators. The group analyzed Benner's novice-to-expert model and applied its levels of experience to simulation educator growth. It established common operational categories of faculty development and used them to organize resources that support progression toward expertise. The resulting theory-based Simulator Educator Toolkit outlines levels of ability and provides quality resources to meet the diverse needs of simulation educators and team members. PMID:26521508

  15. Transportation Optimization with Fuzzy Trapezoidal Numbers Based on Possibility Theory

    PubMed Central

    He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

    2014-01-01

    In this paper, a parametric method is introduced to solve the fuzzy transportation problem. Considering that the parameters of the transportation problem are uncertain, this paper develops a generalized fuzzy transportation problem with fuzzy supply, demand, and cost. For simplicity, these parameters are assumed to be fuzzy trapezoidal numbers. Based on possibility theory and consistent with decision-makers' subjectiveness and practical requirements, the fuzzy transportation problem is transformed into a crisp linear transportation problem by defuzzifying the fuzzy constraints and objectives with the application of the fractile and modality approach. Finally, a numerical example is provided to exemplify the application of fuzzy transportation programming and to verify the validity of the proposed methods. PMID:25137239

  16. Theory of metascreen-based acoustic passive phased array

    NASA Astrophysics Data System (ADS)

    Li, Yong; Qi, Shuibao; Badreddine Assouar, M.

    2016-04-01

    The metascreen-based acoustic passive phased array provides a new degree of freedom for manipulating acoustic waves owing to its attractive properties: full phase shifting, preserved impedance matching, and subwavelength spatial resolution. We develop acoustic theories to analyze the transmission/reflection spectra and the refracted pressure fields of a metascreen composed of elements with four Helmholtz resonators (HRs) in series and a straight pipe. We find that these properties remain valid under oblique incidence at large angles, with the underlying physics stemming from the hybrid resonances between the HRs and the straight pipe. By imposing the desired phase profiles, the refracted fields can be tailored in an anomalous yet controllable manner. In particular, two types of negative refraction are exhibited, based on two distinct mechanisms: one follows from classical diffraction theory and the other is dominated by the periodicity of the metascreen. Positive (normal) and negative refraction can be converted into each other by simply changing the incident angle, with the two types of refraction coexisting in a certain range of incident angles.

  17. Design 2000: Theory-Based Design Models of the Future.

    ERIC Educational Resources Information Center

    Richey, Rita C.

    The influence of theory on instructional-design models of the future is explored on the basis of the theoretical developments of today. Anticipated model changes are expected to result from disparate theoretical thinking in areas such as chaos theory, constructivism, situated learning, cognitive-learning theory, and general systems theory.…

  18. Optimisation of a honeybee-colony's energetics via social learning based on queuing delays

    NASA Astrophysics Data System (ADS)

    Thenius, Ronald; Schmickl, Thomas; Crailsheim, Karl

    2008-06-01

    Natural selection shaped the foraging-related processes of honeybees in such a way that a colony can react optimally to changing environmental conditions. To investigate this complex dynamic social system, we developed a multi-agent model of the nectar flow inside and outside of a honeybee colony. In a honeybee colony, a temporal caste collects nectar in the environment. These foragers bring their harvest into the colony, where they unload their nectar loads to one or more storer bees. Our model predicts that a cohort of foragers collecting nectar from a single nectar source is able to detect quality changes in other food sources they have never visited, via the nectar processing system of the colony. We identified two novel pathways of forager-to-forager communication: foragers can gain information about changes in the nectar flow in the environment via changes in their mean waiting time for unloadings and via the number of multiple unloadings they experience. In this way, two distinct groups of foragers that forage on different nectar sources and never communicate directly can share information via a third cohort of worker bees. We show that this noisy and loosely knotted social network allows a colony to perform collective information processing, so that a single forager has all the information it needs to 'tune' its social behaviour, such as dancing or dance-following. In this way, the net nectar gain of the colony is increased.

  19. Stochastic extension of cellular manufacturing systems: a queuing-based analysis

    NASA Astrophysics Data System (ADS)

    Fardis, Fatemeh; Zandi, Afagh; Ghezavati, Vahidreza

    2013-07-01

    Clustering parts and machines into part families and machine cells is a major decision in the design of cellular manufacturing systems, known as cell formation. This paper presents a non-linear mixed integer programming model for designing cellular manufacturing systems which assumes that the arrival rates of parts into cells and the machine service rates are stochastic parameters described by exponential distributions. Uncertain conditions may create a queue behind each machine; we therefore consider the average waiting time of parts behind each machine in order to obtain an efficient system. The objective function minimizes the sum of the idleness cost of machines, the sub-contracting cost for exceptional parts, the cost of non-utilized machines, and the holding cost of parts in the cells. Finally, the linearized model is solved with the Cplex solver of GAMS, and a sensitivity analysis is performed to illustrate the effectiveness of the parameters.
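
    With exponentially distributed arrivals and service as assumed above, the average wait of a part behind a single machine follows the standard M/M/1 queueing result; a minimal check (the rates are made up for illustration) is:

```python
def mm1_waiting_time(lam, mu):
    """Mean time a part waits in queue (excluding service) behind a
    single machine under M/M/1 assumptions: Wq = lam / (mu * (mu - lam)).
    Requires lam < mu for a stable queue."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    return lam / (mu * (mu - lam))

print(mm1_waiting_time(lam=4.0, mu=5.0))  # 0.8 time units of expected waiting
```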

  20. Feature Selection with Neighborhood Entropy-Based Cooperative Game Theory

    PubMed Central

    Zeng, Kai; She, Kun; Niu, Xinzheng

    2014-01-01

    Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods ignore features that have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features using neighborhood entropy. The neighborhood entropy-based feature contribution is then proposed under the framework of cooperative game theory. The evaluative criteria of features can be formalized as the product of this contribution and other classical feature measures. Finally, the proposed method is tested on several UCI datasets. The results show that the neighborhood entropy-based cooperative game theory model (NECGT) yields better performance than classical ones. PMID:25276120
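
    In a cooperative-game framing like the one above, a feature's contribution is typically its Shapley value: its marginal contribution averaged over all orderings. A minimal sketch over a toy characteristic function (the feature set and payoffs are invented, and the paper's entropy-based payoff is replaced by a lookup table) is:

```python
from itertools import permutations

def shapley_values(players, value):
    """Shapley value of each player: average marginal contribution
    over all orderings. `value` maps a frozenset coalition to payoff."""
    shapley = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            shapley[p] += value(coalition | {p}) - value(coalition)
            coalition = coalition | {p}
    return {p: s / len(orders) for p, s in shapley.items()}

# Toy game: features f1 and f2 are weak alone but strong together,
# the exact situation traditional per-feature scores would miss.
payoffs = {frozenset(): 0.0, frozenset({"f1"}): 0.1,
           frozenset({"f2"}): 0.1, frozenset({"f1", "f2"}): 0.9}
print(shapley_values(["f1", "f2"], payoffs.__getitem__))
```

    Both features receive a large Shapley value here even though each is weak individually, illustrating why a game-theoretic contribution can rescue interdependent features.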

  1. A method of threaten ordering based on cloud model theory

    NASA Astrophysics Data System (ADS)

    Yu, Xin; Wang, Hua; Jiao, Licheng

    2009-10-01

    Situation assessment (SA) is a complex decision-making process in modern aerial defense systems. The threat ordering (TO) problem is one of the most difficult steps in SA. There are many uncertain factors in TO, and traditional evaluation methods have algorithmic disadvantages. To overcome these disadvantages, a TO method based on cloud model theory is proposed. First, the factors of air-target threat evaluation are analyzed. Then a reasoning mechanism based on the cloud model is designed: the cloud model is used to describe the attributes of a target, the model parameters are defined, and the reasoning rule library is established using the method of Multiple Attribute Decision Making (MADM). The target data are input to the reasoning mechanism, and the threat values are obtained. Finally, simulation experiments are given to verify the validity of the method.

  2. TDDFT-based local control theory for chemical reactions

    NASA Astrophysics Data System (ADS)

    Tavernelli, Ivano; Curchod, Basile F. E.; Penfold, Thomas J.

    In this talk I will describe the implementation of local control theory for laser pulse shaping within the framework of TDDFT-based nonadiabatic dynamics. The method is based on a set of modified Tully's surface hopping equations and provides an efficient way to control the population of a selected reactive state of interest through the coupling with an external time-dependent electric field generated on-the-fly during the dynamics. This approach is applied to the investigation of the photoinduced intramolecular proton transfer reaction in 4-hydroxyacridine in gas phase and in solution. The generated pulses reveal important information about the underlying excited-state nuclear dynamics highlighting the involvement of collective vibrational modes that would be neglected in studies performed on model systems. Finally, this approach can help to shed new light on the photophysics and photochemistry of complex molecular systems and guide the design of novel reaction paths.

  3. Theory based design and optimization of materials for spintronics applications

    NASA Astrophysics Data System (ADS)

    Xu, Tianyi

    The spintronics industry has developed rapidly in the past decade. Finding the right material is very important for spintronics applications, which requires a good understanding of the physics behind specific phenomena. In this dissertation, we focus on two types of perpendicular transport phenomena: the current-perpendicular-to-plane giant magnetoresistance (CPP-GMR) phenomenon and the tunneling phenomenon in magnetic tunnel junctions. The Valet-Fert model is a very useful semi-classical approach for understanding the transport and spin-flip processes in CPP-GMR. We present a finite-element-based implementation of the Valet-Fert model which enables a practical way to calculate the electron transport in real CPP-GMR spin valves. It is very important to find highly spin-polarized materials for CPP-GMR spin valves. The half-metal, due to its full spin polarization, is of particular interest. We propose a rational way to find half-metals based on the gap theorem. We then turn to the high-MR TMR phenomenon. The tunneling theory of electron transport in mesoscopic systems is covered, and the transport properties of certain junctions are calculated with the help of Green's functions under the Landauer-Buttiker formalism, also known as the scattering formalism. The damping constant determines the switching rate of a device; we calculate it using a method based on the extended Huckel tight-binding theory (EHTB). The symmetry filtering effect is very helpful for finding materials for TMR junctions, and based upon it we find a good candidate material, MnAl, for TMR applications.

  4. Plato: A localised orbital based density functional theory code

    NASA Astrophysics Data System (ADS)

    Kenny, S. D.; Horsfield, A. P.

    2009-12-01

    The Plato package allows both orthogonal and non-orthogonal tight-binding as well as density functional theory (DFT) calculations to be performed within a single framework. The package also provides extensive tools for analysing the results of simulations as well as a number of tools for creating input files. The code is based upon the ideas first discussed in Sankey and Niklewski (1989) [1] with extensions to allow high-quality DFT calculations to be performed. DFT calculations can utilise either the local density approximation or the generalised gradient approximation. Basis sets from minimal basis through to ones containing multiple radial functions per angular momenta and polarisation functions can be used. Illustrations of how the package has been employed are given along with instructions for its utilisation.

    Program summary
    Program title: Plato
    Catalogue identifier: AEFC_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFC_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 219 974
    No. of bytes in distributed program, including test data, etc.: 1 821 493
    Distribution format: tar.gz
    Programming language: C/MPI and PERL
    Computer: Apple Macintosh, PC, Unix machines
    Operating system: Unix, Linux and Mac OS X
    Has the code been vectorised or parallelised?: Yes, up to 256 processors tested
    RAM: Up to 2 Gbytes per processor
    Classification: 7.3
    External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW
    Nature of problem: Density functional theory study of electronic structure and total energies of molecules, crystals and surfaces.
    Solution method: Localised orbital based density functional theory.
    Restrictions: Tight-binding and density functional theory only, no exact exchange.
    Unusual features: Both atom centred and uniform meshes available. Can deal with arbitrary angular momenta for orbitals, whilst still retaining Slater-Koster tables for accuracy.
    Running time: Test cases will run in a few minutes, large calculations may run for several days.

  5. Classification of topological crystalline insulators based on representation theory

    NASA Astrophysics Data System (ADS)

    Dong, Xiao-Yu; Liu, Chao-Xing

    2016-01-01

    Topological crystalline insulators define a new class of topological insulator phases with gapless surface states protected by crystalline symmetries. In this work, we present a general theory to classify topological crystalline insulator phases based on the representation theory of space groups. Our approach is to directly identify possible nontrivial surface states in a semi-infinite system with a specific surface, of which the symmetry property can be described by the 17 two-dimensional space groups. We reproduce the existing results of topological crystalline insulators, such as mirror Chern insulators in the pm or pmm groups, Cnv topological insulators in the p4m, p31m, and p6m groups, and topological nonsymmorphic crystalline insulators in the pg and pmg groups. Aside from these existing results, we also obtain the following: (1) there are two integer mirror Chern numbers (Z^2) in the pm group but only one (Z) in the cm or p3m1 group for both the spinless and spinful cases; (2) for the pmm (cmm) groups, there is no topological classification in the spinless case but Z4 (Z2) classifications in the spinful case; (3) we show how the topological crystalline insulator phase in the pg group is related to that in the pm group; (4) we identify the topological classification of p4m, p31m, and p6m for the spinful case; (5) we find that topological nonsymmorphic crystalline insulators also exist in the pgg and p4g groups, exhibiting new features compared to those in the pg and pmg groups. We emphasize the importance of the irreducible representations of the states at certain high-symmetry momenta in the classification of topological crystalline phases. Our theory can serve as a guide for the search for topological crystalline insulator phases in realistic materials.

  6. Investigating the Learning-Theory Foundations of Game-Based Learning: A Meta-Analysis

    ERIC Educational Resources Information Center

    Wu, W-H.; Hsiao, H-C.; Wu, P-L.; Lin, C-H.; Huang, S-H.

    2012-01-01

    Past studies on the issue of learning-theory foundations in game-based learning stressed the importance of establishing learning-theory foundation and provided an exploratory examination of established learning theories. However, we found research seldom addressed the development of the use or failure to use learning-theory foundations and…

  7. Treatment of adolescent sexual offenders: theory-based practice.

    PubMed

    Sermabeikian, P; Martinez, D

    1994-11-01

    The treatment of adolescent sexual offenders (ASO) has its theoretical underpinnings in social learning theory. Although social learning theory has been frequently cited in the literature, a comprehensive application of this theory to practice has not been mapped out. The social learning and social cognitive theories of Bandura appear to be particularly relevant to the group treatment of this population. The application of these theories to practice, as demonstrated in a program model, is discussed as a means of showing how theory-driven practice methods can be developed. PMID:7850605

  8. The theory and phenomenology of perturbative QCD based jet quenching

    NASA Astrophysics Data System (ADS)

    Majumder, A.; van Leeuwen, M.

    2011-01-01

    The study of the structure of strongly interacting dense matter via hard jets is reviewed. High momentum partons produced in hard collisions produce a shower of gluons prior to undergoing the non-perturbative process of hadronization. In the presence of a dense medium this shower is modified due to scattering of the various partons off the constituents in the medium. The modified pattern of the final detected hadrons is then a probe of the structure of the medium as perceived by the jet. Starting from the factorization paradigm developed for the case of particle collisions, we review the basic underlying theory of medium-induced gluon radiation based on perturbative Quantum Chromodynamics (pQCD) and current experimental results from Deep Inelastic Scattering on large nuclei and high energy heavy-ion collisions, emphasizing how these results constrain our understanding of energy loss. This review contains introductions to the theory of radiative energy loss, elastic energy loss, and the corresponding experimental observables and issues. We close with a discussion of important calculations and measurements that need to be carried out to complete the description of jet modification at high energies at future high energy colliders.

  9. A molecularly based theory for electron transfer reorganization energy

    NASA Astrophysics Data System (ADS)

    Zhuang, Bilin; Wang, Zhen-Gang

    2015-12-01

    Using field-theoretic techniques, we develop a molecularly based dipolar self-consistent-field theory (DSCFT) for charge solvation in pure solvents under equilibrium and nonequilibrium conditions and apply it to the reorganization energy of electron transfer reactions. The DSCFT uses a set of molecular parameters, such as the solvent molecule's permanent dipole moment and polarizability, thus avoiding approximations that are inherent in treating the solvent as a linear dielectric medium. A simple, analytical expression for the free energy is obtained in terms of the equilibrium and nonequilibrium electrostatic potential profiles and electric susceptibilities, which are obtained by solving a set of self-consistent equations. With no adjustable parameters, the DSCFT predicts activation energies and reorganization energies in good agreement with previous experiments and calculations for the electron transfer between metallic ions. Because the DSCFT is able to describe the properties of the solvent in the immediate vicinity of the charges, it is unnecessary to distinguish between the inner-sphere and outer-sphere solvent molecules in the calculation of the reorganization energy as in previous work. Furthermore, examining the nonequilibrium free energy surfaces of electron transfer, we find that the nonequilibrium free energy is well approximated by a double parabola for self-exchange reactions, but the curvature of the nonequilibrium free energy surface depends on the charges of the electron-transferring species, contrary to the prediction by the linear dielectric theory.

  10. A molecularly based theory for electron transfer reorganization energy.

    PubMed

    Zhuang, Bilin; Wang, Zhen-Gang

    2015-12-14

    Using field-theoretic techniques, we develop a molecularly based dipolar self-consistent-field theory (DSCFT) for charge solvation in pure solvents under equilibrium and nonequilibrium conditions and apply it to the reorganization energy of electron transfer reactions. The DSCFT uses a set of molecular parameters, such as the solvent molecule's permanent dipole moment and polarizability, thus avoiding approximations that are inherent in treating the solvent as a linear dielectric medium. A simple, analytical expression for the free energy is obtained in terms of the equilibrium and nonequilibrium electrostatic potential profiles and electric susceptibilities, which are obtained by solving a set of self-consistent equations. With no adjustable parameters, the DSCFT predicts activation energies and reorganization energies in good agreement with previous experiments and calculations for the electron transfer between metallic ions. Because the DSCFT is able to describe the properties of the solvent in the immediate vicinity of the charges, it is unnecessary to distinguish between the inner-sphere and outer-sphere solvent molecules in the calculation of the reorganization energy as in previous work. Furthermore, examining the nonequilibrium free energy surfaces of electron transfer, we find that the nonequilibrium free energy is well approximated by a double parabola for self-exchange reactions, but the curvature of the nonequilibrium free energy surface depends on the charges of the electron-transferring species, contrary to the prediction by the linear dielectric theory. PMID:26671385

  11. Quantum Hall transitions: An exact theory based on conformal restriction

    NASA Astrophysics Data System (ADS)

    Bettelheim, E.; Gruzberg, I. A.; Ludwig, A. W. W.

    2012-10-01

    We revisit the problem of the plateau transition in the integer quantum Hall effect. Here we develop an analytical approach for this transition, and for other two-dimensional disordered systems, based on the theory of “conformal restriction.” This is a mathematical theory that was recently developed within the context of the Schramm-Loewner evolution which describes the “stochastic geometry” of fractal curves and other stochastic geometrical fractal objects in two-dimensional space. Observables elucidating the connection with the plateau transition include the so-called point-contact conductances (PCCs) between points on the boundary of the sample, described within the language of the Chalker-Coddington network model for the transition. We show that the disorder-averaged PCCs are characterized by a classical probability distribution for certain geometric objects in the plane (which we call pictures), occurring with positive statistical weights, that satisfy the crucial so-called restriction property with respect to changes in the shape of the sample with absorbing boundaries; physically, these are boundaries connected to ideal leads. At the transition point, these geometrical objects (pictures) become fractals. Upon combining this restriction property with the expected conformal invariance at the transition point, we employ the mathematical theory of “conformal restriction measures” to relate the disorder-averaged PCCs to correlation functions of (Virasoro) primary operators in a conformal field theory (of central charge c=0). We show how this can be used to calculate these functions in a number of geometries with various boundary conditions. 
Since our results employ only the conformal restriction property, they are equally applicable to a number of other critical disordered electronic systems in two spatial dimensions, including for example the spin quantum Hall effect, the thermal metal phase in symmetry class D, and classical diffusion in two dimensions in a perpendicular magnetic field. For most of these systems, we also predict exact values of critical exponents related to the spatial behavior of various disorder-averaged PCCs.

  12. Intelligent control based on fuzzy logic and neural net theory

    NASA Technical Reports Server (NTRS)

    Lee, Chuen-Chien

    1991-01-01

    In the conception and design of intelligent systems, one promising direction involves the use of fuzzy logic and neural network theory to enhance such systems' capability to learn from experience and adapt to changes in an environment of uncertainty and imprecision. Here, an intelligent control scheme is explored by integrating these multidisciplinary techniques. A self-learning system is proposed as an intelligent controller for dynamical processes, employing a control policy which evolves and improves automatically. One key component of the intelligent system is a fuzzy logic-based system which emulates human decision making behavior. It is shown that the system can solve a fairly difficult control learning problem. Simulation results demonstrate that improved learning performance can be achieved in relation to previously described systems employing bang-bang control. The proposed system is relatively insensitive to variations in the parameters of the system environment.

  13. Theory for a gas composition sensor based on acoustic properties

    NASA Technical Reports Server (NTRS)

    Phillips, Scott; Dain, Yefim; Lueptow, Richard M.

    2003-01-01

    Sound travelling through a gas propagates at different speeds and its intensity attenuates to different degrees depending upon the composition of the gas. Theoretically, a real-time gaseous composition sensor could be based on measuring the sound speed and the acoustic attenuation. To this end, the speed of sound was modelled using standard relations, and the acoustic attenuation was modelled using the theory for vibrational relaxation of gas molecules. The concept for a gas composition sensor is demonstrated theoretically for nitrogen-methane-water and hydrogen-oxygen-water mixtures. For a three-component gas mixture, the measured sound speed and acoustic attenuation each define separate lines in the composition plane of two of the gases. The intersection of the two lines defines the gas composition. It should also be possible to use the concept for mixtures of more than three components, if the nature of the gas composition is known to some extent.
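The line-intersection idea described in this abstract can be sketched numerically. Below is an illustrative Python sketch, not taken from the paper: the linearized calibration coefficients (`c0`, `a1`, `a2`, `d0`, `b1`, `b2`) and the `forward` model are hypothetical stand-ins for the sound-speed and attenuation models.

```python
import numpy as np

# Illustrative sketch (all calibration numbers hypothetical): near an
# operating point, sound speed c and attenuation alpha are linearized as
#   c     = c0 + a1*x1 + a2*x2
#   alpha = d0 + b1*x1 + b2*x2
# where x1, x2 are mole fractions of two components (x3 = 1 - x1 - x2).
c0, a1, a2 = 350.0, 120.0, -40.0   # m/s
d0, b1, b2 = 0.10, 0.80, 2.50      # arbitrary attenuation units

def forward(x1, x2):
    """Forward model: composition -> (sound speed, attenuation)."""
    return c0 + a1 * x1 + a2 * x2, d0 + b1 * x1 + b2 * x2

# Simulate a measurement for a known composition.
x_true = np.array([0.30, 0.05])
c_meas, alpha_meas = forward(*x_true)

# Each measurement defines a line in the (x1, x2) composition plane; the
# sensed composition is the intersection of the two lines (a 2x2 solve).
A = np.array([[a1, a2], [b1, b2]])
x_est = np.linalg.solve(A, [c_meas - c0, alpha_meas - d0])

print(x_est)               # recovers approximately [0.30, 0.05]
print(1.0 - x_est.sum())   # mole fraction of the third component (about 0.65)
```

The intersection is unique as long as the two lines are not parallel, i.e., the determinant a1*b2 - a2*b1 is nonzero; a near-zero determinant would correspond to the two acoustic measurements carrying redundant composition information.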

  14. The Theory of Individual Based Discrete-Time Processes

    NASA Astrophysics Data System (ADS)

    Challenger, Joseph D.; Fanelli, Duccio; McKane, Alan J.

    2014-07-01

    A general theory is developed to study individual based models which are discrete in time. We begin by constructing a Markov chain model that converges to a one-dimensional map in the infinite population limit. Stochastic fluctuations are hence intrinsic to the system and can induce qualitative changes to the dynamics predicted from the deterministic map. From the Chapman-Kolmogorov equation for the discrete-time Markov process, we derive the analogues of the Fokker-Planck equation and the Langevin equation, which are routinely employed for continuous time processes. In particular, a stochastic difference equation is derived which accurately reproduces the results found from the Markov chain model. Stochastic corrections to the deterministic map can be quantified by linearizing the fluctuations around the attractor of the map. The proposed scheme is tested on stochastic models which have the logistic and Ricker maps as their deterministic limits.
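The construction described above can be illustrated with a minimal simulation. The binomial update below is one common way to build a Markov chain whose infinite-population limit is a one-dimensional map (here the logistic map); the paper's exact model may differ, so treat this as a sketch of the idea rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
LAM = 3.2  # logistic parameter: the deterministic map has a stable 2-cycle

def logistic(x):
    """Deterministic one-dimensional map reached in the infinite-N limit."""
    return LAM * x * (1.0 - x)

def stochastic_step(n, N):
    """One step of a finite-population Markov chain whose mean field is the
    logistic map: each of N 'slots' is occupied in the next generation with
    probability f(n/N).  (An assumed construction for illustration.)"""
    p = min(max(logistic(n / N), 0.0), 1.0)
    return rng.binomial(N, p)

N = 10_000
n, x = N // 2, 0.5
for _ in range(200):
    n = stochastic_step(n, N)
    x = logistic(x)

# The stochastic trajectory fluctuates around the attractor of the
# deterministic map (the 2-cycle {0.513, 0.799}) with O(1/sqrt(N)) noise.
print(x, n / N)
```

Shrinking N makes the intrinsic fluctuations larger, which is exactly the regime where the stochastic difference equation derived in the paper corrects the predictions of the deterministic map.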

  15. Theory for a gas composition sensor based on acoustic properties.

    PubMed

    Phillips, Scott; Dain, Yefim; Lueptow, Richard M

    2003-01-01

    Sound travelling through a gas propagates at different speeds and its intensity attenuates to different degrees depending upon the composition of the gas. Theoretically, a real-time gaseous composition sensor could be based on measuring the sound speed and the acoustic attenuation. To this end, the speed of sound was modelled using standard relations, and the acoustic attenuation was modelled using the theory for vibrational relaxation of gas molecules. The concept for a gas composition sensor is demonstrated theoretically for nitrogen-methane-water and hydrogen-oxygen-water mixtures. For a three-component gas mixture, the measured sound speed and acoustic attenuation each define separate lines in the composition plane of two of the gases. The intersection of the two lines defines the gas composition. It should also be possible to use the concept for mixtures of more than three components, if the nature of the gas composition is known to some extent. PMID:14552356

  16. Correlated digital back propagation based on perturbation theory.

    PubMed

    Liang, Xiaojun; Kumar, Shiva

    2015-06-01

    We studied a simplified digital back propagation (DBP) scheme by including the correlation between neighboring signal samples. An analytical expression for calculating the correlation coefficients is derived based on a perturbation theory. In each propagation step, nonlinear distortion due to phase-dependent terms in the perturbative expansion is ignored, which enhances the computational efficiency. The performance of the correlated DBP is evaluated by simulating a single-channel single-polarization fiber-optic system operating at 28 Gbaud, 32-quadrature amplitude modulation (32-QAM), and 40 × 80 km transmission distance. As compared to standard DBP, correlated DBP reduces the total number of propagation steps by a factor of 10 without performance penalty. Correlated DBP with only 2 steps per link provides about one dB improvement in Q-factor over linear compensation. PMID:26072825

  17. A Lie based 4-dimensional higher Chern-Simons theory

    NASA Astrophysics Data System (ADS)

    Zucchini, Roberto

    2016-05-01

    We present and study a model of 4-dimensional higher Chern-Simons theory, special Chern-Simons (SCS) theory, instances of which have appeared in the string literature, whose symmetry is encoded in a skeletal semistrict Lie 2-algebra constructed from a compact Lie group with a non-discrete center. The field content of SCS theory consists of a Lie valued 2-connection coupled to a background closed 3-form. SCS theory enjoys a large gauge and gauge for gauge symmetry organized in an infinite-dimensional strict Lie 2-group. The partition function of SCS theory is simply related to that of a topological gauge theory localizing on flat connections with degree 3 second characteristic class determined by the background 3-form. Finally, SCS theory is related to a 3-dimensional special gauge theory whose 2-connection space has a natural symplectic structure with respect to which the 1-gauge transformation action is Hamiltonian, the 2-curvature map acting as moment map.

  18. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

    ERIC Educational Resources Information Center

    Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

    2012-01-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

  20. Stress-based elastodynamic discrete laminated plate theory

    NASA Astrophysics Data System (ADS)

    Schoeppner, G. A.; Wolfe, W. E.; Sandhu, R. S.

    1994-03-01

    A static laminated plate theory based on an assumed piecewise linear through-the-thickness in-plane stress distribution has been extended to include inertia effects. Based on this in-plane stress distribution assumption, out-of-plane shear and normal stress component distributions were derived from the three-dimensional equations of motion, resulting in six nonzero stress components. Hamilton's variational principle was used to derive the plate equations of motion, the plate constitutive relationships, and the interface continuity equations. The governing equations were written in a form that is self-adjoint with respect to the convolution bilinear mapping. The resulting system of equations for a single lamina consists of 25 field equations in terms of 9 weighted displacement field variables, 10 stress and moment resultant field variables, and 6 out-of-plane shear and normal stress boundary field variables. For the laminated system, the mixed formulation enforces both traction and displacement continuity at lamina interfaces as it satisfies layer equilibrium. A finite element formulation based on a specialized form of the governing functional was developed. The method is illustrated with results of a free vibration analysis of sandwich and homogeneous plates for which exact solutions are available.

  1. Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education

    ERIC Educational Resources Information Center

    Amrein-Beardsley, A.; Haladyna, T.

    2012-01-01

    Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…

  2. Image integrity authentication scheme based on fixed point theory.

    PubMed

    Li, Xu; Sun, Xingming; Liu, Quansheng

    2015-02-01

    Based on the fixed point theory, this paper proposes a new scheme for image integrity authentication, which is very different from digital signature and fragile watermarking. By the new scheme, the sender transforms an original image into a fixed point image (very close to the original one) of a well-chosen transform and sends the fixed point image (instead of the original one) to the receiver; using the same transform, the receiver checks the integrity of the received image by testing whether it is a fixed point image and locates the tampered areas if the image has been modified during the transmission. A realization of the new scheme is based on the Gaussian convolution and deconvolution (GCD) transform, for which an existence theorem of fixed points is proved. The semifragility is analyzed via commutativity of transforms, and three commutativity theorems are found for the GCD transform. Three iterative algorithms are presented for finding a fixed point image in a small number of iterations; for the whole procedure of image integrity authentication, a fragile authentication system and a semifragile one are separately built. Experiments show that both systems have good performance in transparence, fragility, security, and tampering localization. In particular, the semifragile system can perfectly resist rotation by a multiple of 90°, flipping, and brightness attacks. PMID:25420259
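The fixed-point mechanics the scheme relies on can be sketched in a few lines. This is an illustrative toy, not the paper's GCD transform: the contraction `T` below is built from the original signal purely for demonstration (the paper's keyed transform does not require the original at the receiver), and a 1-D signal stands in for an image.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
original = rng.random(n)

def blur(x):
    """Periodic 3-tap moving average (a stand-in for Gaussian convolution;
    its spectral norm is at most 1)."""
    return (np.roll(x, -1) + x + np.roll(x, 1)) / 3.0

def T(x, w=0.5):
    """Toy contraction with Lipschitz constant w < 1; by the Banach
    fixed-point theorem it has a unique fixed point reachable by
    iteration.  Hypothetical illustration only, not the paper's GCD
    transform."""
    return (1.0 - w) * original + w * blur(x)

# Sender: iterate to the fixed point and transmit it instead of the original.
fixed = original.copy()
for _ in range(60):
    fixed = T(fixed)

# Receiver: authenticity test = "is the received signal a fixed point of T?"
def is_authentic(y, tol=1e-8):
    return bool(np.max(np.abs(T(y) - y)) < tol)

tampered = fixed.copy()
tampered[10] += 0.05     # small local modification

print(is_authentic(fixed))     # True
print(is_authentic(tampered))  # False
```

Note that the residual T(y) - y is nonzero only near the modified sample, which is the kind of localization property the abstract describes for tamper detection.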

  3. Charge carrier hopping transport based on Marcus theory and variable-range hopping theory in organic semiconductors

    NASA Astrophysics Data System (ADS)

    Lu, Nianduan; Li, Ling; Banerjee, Writam; Sun, Pengxiao; Gao, Nan; Liu, Ming

    2015-07-01

    Charge carrier hopping transport is generally taken from Miller-Abrahams and Marcus transition rates. Based on the Miller-Abrahams theory and nearest-neighbour range hopping theory, Apsley and Hughes developed a concise calculation method (A-H method) to study the hopping conduction in disordered systems. Here, we improve the A-H method to investigate charge carrier hopping transport by introducing the polaron effect and electric field based on Marcus theory and variable-range hopping theory. This improved method can well describe the contributions of the polaron effect, energetic disorder, carrier density, and electric field to charge carrier transport in disordered organic semiconductors. In addition, the calculated results clearly show that the charge carrier mobility exhibits a distinct dependence on the polaron activation energy and decreases with increasing electric field strength at large fields.
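The Marcus transition rate underlying this kind of hopping calculation has a standard closed form, which can be evaluated directly. The sketch below uses the textbook Marcus expression with illustrative parameter values (`J0`, `lam`, and the dG values are hypothetical, not from the paper):

```python
import numpy as np

# Physical constants (SI)
hbar = 1.054_571_8e-34     # J*s
kB   = 1.380_649e-23       # J/K
eV   = 1.602_176_634e-19   # J

def marcus_rate(J, dG, lam, T=300.0):
    """Standard Marcus hopping rate between two localized sites.
    J: transfer integral, dG: site energy difference, lam: polaron
    reorganization energy (all in joules)."""
    pref = (2.0 * np.pi / hbar) * J**2 / np.sqrt(4.0 * np.pi * lam * kB * T)
    return pref * np.exp(-(dG + lam)**2 / (4.0 * lam * kB * T))

# Illustrative values (hypothetical): J = 3 meV, lambda = 0.2 eV
J0, lam = 3e-3 * eV, 0.2 * eV

# Resonant hop, downhill hop at dG = -lambda, and the inverted region:
for dG_eV in (0.0, -0.2, -0.6):
    print(dG_eV, marcus_rate(J0, dG_eV * eV, lam))
# The rate peaks at dG = -lambda and *decreases* for more negative dG
# (Marcus inverted region) -- a polaron effect that Miller-Abrahams
# rates do not capture.
```

An applied electric field enters through dG (roughly dG -> dG - q*E*r for a hop of distance r along the field), which is how the field dependence discussed in the abstract arises in such calculations.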

  4. Density functional theory based generalized effective fragment potential method.

    PubMed

    Nguyen, Kiet A; Pachter, Ruth; Day, Paul N

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes. PMID:24985612

  5. Physical bases for a triad of biological similarity theories.

    PubMed

    Gunther, B; Morgado, E

    1986-01-01

    The dimensional analysis of physics, based on the MLT-system (M = mass, L = length, T = time), can be applied to the living world, from mycoplasmas (10(-13) g) to the blue whales (10(8) g). Body mass (M), or body weight (W), are utilized as convenient reference systems, since they represent the integrated masses of all elementary particles--at the atomic level--which constitute an organism. A triad of biological similarities (mechanical, biological, transport) has been previously described. Each similarity was based on two postulates, of which the first was common to all three, i.e., the constancy of body density; whereas the second postulates were specific for each of the three theories. In this study a physical foundation for these second postulates, based on three universal constants of nature, is presented; these are: 1) the acceleration of gravity (g = LT-2); 2) the velocity of light (c = LT-1); and 3) the mass-specific quantum (h/m = L2T-1). The realm of each of these biological similarities is the following: 1) the gravitational or mechanical similarity (where g = constant) deals mainly with the relationship between a whole organism and its environment, particularly with locomotion. The acceleration of gravity (g) is also one of the determining factors of the "potential" energy (E = m.g.H), where m is the mass, and H is the height above the reference level; 2) the electrodynamic similarity (formerly biological similarity), (c = constant), is able to quantitatively define the internal organization of an organism from both a morphological and a physiological point of view.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:3448994
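The MLT bookkeeping used above can be made concrete by representing each quantity as a vector of (mass, length, time) exponents, so that multiplying quantities adds vectors and dividing subtracts them. A minimal sketch verifying the three constants quoted in the abstract:

```python
import numpy as np

# Dimensional analysis in the MLT system: each quantity is a vector of
# (mass, length, time) exponents; products add vectors, quotients subtract.
M, L, T = np.eye(3)

g        = L - 2 * T      # acceleration of gravity: L T^-2
c        = L - T          # velocity of light:       L T^-1
h        = M + 2 * L - T  # Planck's constant:       M L^2 T^-1
h_over_m = h - M          # mass-specific quantum

print(h_over_m)  # [ 0.  2. -1.]  ->  L^2 T^-1, matching the abstract
```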

  6. Density functional theory based generalized effective fragment potential method

    SciTech Connect

    Nguyen, Kiet A. E-mail: ruth.pachter@wpafb.af.mil; Pachter, Ruth E-mail: ruth.pachter@wpafb.af.mil; Day, Paul N.

    2014-06-28

    We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.

  7. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. 
In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA supervised algorithms. Graphic representation for Shannon's distribution of MD calculating software. PMID:25620721
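The information-gain-with-equal-interval-discretization approach mentioned above is easy to sketch. The following illustrative Python (not IMMAN's Java implementation; the synthetic features and bin count are hypothetical) ranks a class-informative feature above a noise feature:

```python
import numpy as np

def shannon_entropy(labels):
    """Empirical Shannon entropy H(Y) in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels, n_bins=4):
    """IG(Y; X) = H(Y) - H(Y|X) after equal-interval discretization of X,
    mirroring the discretization approach described in the abstract."""
    edges = np.linspace(feature.min(), feature.max(), n_bins + 1)
    bins = np.clip(np.digitize(feature, edges[1:-1]), 0, n_bins - 1)
    h_cond = 0.0
    for b in np.unique(bins):
        mask = bins == b
        h_cond += mask.mean() * shannon_entropy(labels[mask])
    return shannon_entropy(labels) - h_cond

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 400)                        # binary class labels
informative = y + 0.3 * rng.standard_normal(400)   # tracks the class
noise = rng.standard_normal(400)                   # unrelated to the class

print(information_gain(informative, y), information_gain(noise, y))
# the informative feature scores far higher than the noise feature
```

Ranking features by this score, then keeping the top-k, is the rank-based feature selection pattern the abstract describes; gain ratio and symmetrical uncertainty are normalized variants of the same quantity.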

  8. Prior individual training and self-organized queuing during group emergency escape of mice from water pool.

    PubMed

    Saloma, Caesar; Perez, Gay Jane; Gavile, Catherine Ann; Ick-Joson, Jacqueline Judith; Palmes-Saloma, Cynthia

    2015-01-01

    We study the impact of prior individual training during group emergency evacuation using mice that escape from an enclosed water pool to a dry platform via any of two possible exits. Experimenting with mice avoids serious ethical and legal issues that arise when dealing with unwitting human participants while minimizing concerns regarding the reliability of results obtained from simulated experiments using 'actors'. First, mice were trained separately and their individual escape times measured over several trials. Mice learned quickly to swim towards an exit-they achieved their fastest escape times within the first four trials. The trained mice were then placed together in the pool and allowed to escape. No two mice were permitted in the pool beforehand and only one could pass through an exit opening at any given time. At first trial, groups of trained mice escaped seven and five times faster than their corresponding control groups of untrained mice at pool occupancy rate ρ of 11.9% and 4%, respectively. Faster evacuation happened because trained mice: (a) had better recognition of the available pool space and took shorter escape routes to an exit, (b) were less likely to form arches that blocked an exit opening, and (c) utilized the two exits efficiently without preference. Trained groups achieved continuous egress without an apparent leader-coordinator (self-organized queuing)-a collective behavior not experienced during individual training. Queuing was unobserved in untrained groups where mice were prone to wall seeking, aimless swimming and/or blind copying that produced circuitous escape routes, biased exit use and clogging. The experiments also reveal that faster and less costly group training at ρ = 4%, yielded an average individual escape time that is comparable with individualized training. However, group training in a more crowded pool (ρ = 11.9%) produced a longer average individual escape time. PMID:25693170

  9. Cell division theory and individual-based modeling of microbial lag: part I. The theory of cell division.

    PubMed

    Dens, E J; Bernaerts, K; Standaert, A R; Van Impe, J F

    2005-06-15

    This series of two papers deals with the theory of cell division and its implementation in an individual-based modeling framework. In this first part, the theory of cell division is studied on an individual-based level in order to learn more about the mechanistic principles behind microbial lag phenomena. While some important literature on cell division theory dates from 30 to 40 years ago, until now it has hardly been introduced in the field of predictive microbiology. Yet, it provides a large amount of information on how cells likely respond to changing environmental conditions. On the basis of this theory, a general theory on microbial lag behavior caused by a combination of medium and/or temperature changes has been developed in this paper. The proposed theory then forms the basis for a critical evaluation of existing modeling concepts for microbial lag in predictive microbiology. First of all, a more thorough definition can be formulated for the lag time lambda and the previously only vaguely defined physiological state of the cells, in terms of mechanistically defined parameters like cell mass, RNA or protein content, specific growth rate, and time to perform DNA replication and cell division. On the other hand, existing predictive models are evaluated with respect to the newly developed theory. For one existing model, a certain fitting parameter can also be related to physically meaningful parameters, while for the model of [Augustin, J.-C., Rosso, L., Carlier, V.A. 2000. A model describing the effect of temperature history on lag time for Listeria monocytogenes. Int. J. Food Microbiol. 57, 169-181] a new, mechanistically based model structure is proposed. A restriction of the proposed theory is that it is only valid for situations where biomass growth responds instantly to an environment change. The authors are aware of the fact that this assumption is not generally acceptable. 
Lag in biomass can be caused, for example, by a delayed synthesis of some essential growth factor (e.g., enzymes). In the second part of this series of papers [Dens, E.J., Bernaerts, K., Standaert, A.R., Kreft, J.-U., Van Impe, J.F., this issue. Cell division theory and individual-based modeling of microbial lag: part II. Modeling lag phenomena induced by temperature shifts. Int. J. Food Microbiol], the theory of cell division is implemented in an individual-based simulation program and extended to account for lags in biomass growth. In conclusion, the cell division theory applied to microbial populations in dynamic medium and/or temperature conditions provides a useful framework to analyze microbial lag behavior. PMID:15925713

  10. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    PubMed Central

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is effective at combining evidence from different sensors. However, when the evidence is highly conflicting, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611
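
    The fusion scheme this record describes can be sketched in a few lines. The following is a minimal illustration of Dempster's rule plus Murphy-style weighted averaging of mass functions, not the authors' exact algorithm: in the paper the weights come from the evidence distance and belief entropy, whereas here they are simply supplied as inputs.

```python
def dempster(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) by Dempster's rule."""
    combined, conflict = {}, 0.0
    for a, x in m1.items():
        for b, y in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y  # mass that falls on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    # Normalize away the conflicting mass
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def fuse_reports(masses, weights):
    """Weighted-average the sensor mass functions, then combine the average
    with itself n-1 times by Dempster's rule (Murphy-style fusion)."""
    total = sum(weights)
    focal = set().union(*masses)
    avg = {s: sum(w * m.get(s, 0.0) for m, w in zip(masses, weights)) / total
           for s in focal}
    result = avg
    for _ in range(len(masses) - 1):
        result = dempster(result, avg)
    return result
```

    Averaging before combining is what tempers a single unreliable, highly conflicting sensor report: a low-weight report barely shifts the averaged mass, so the repeated Dempster combination converges toward the consensus hypothesis.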

  11. LSST Telescope Alignment Plan Based on Nodal Aberration Theory

    NASA Astrophysics Data System (ADS)

    Sebag, J.; Gressler, W.; Schmid, T.; Rolland, J. P.; Thompson, K. P.

    2012-04-01

    The optical alignment of the Large Synoptic Survey Telescope (LSST) is potentially challenging, due to its fast three-mirror optical design and its large 3.5-degree field of view (FOV). It is highly advantageous to align the three-mirror optical system prior to the integration of the complex science camera, which corrects the FOV via three refractive elements and includes the operational wavefront sensors. A telescope alignment method based on nodal aberration theory (NAT) is presented here to address this challenge. Without the science camera installed on the telescope, the on-axis imaging performance of the telescope is diffraction-limited, but the field of view is not corrected. The nodal properties of the three-mirror telescope design have been analyzed, and an alignment approach has been developed using the intrinsically linear nodal behavior, which is linked via sensitivities to the misalignment parameters. Since mirror figure errors will exist in any real application, a methodology for introducing primary-mirror figure errors into the analysis has been developed and is also presented.

  12. Modeling for Convective Heat Transport Based on Mixing Length Theory

    NASA Astrophysics Data System (ADS)

    Yamagishi, Y.; Yanagisawa, T.

    2002-12-01

    Convection is the most important mechanism for the Earth's internal dynamics and plays a substantial role in its evolution. In investigating the thermal history of the Earth, convective heat transport should be taken into account. However, it is difficult to treat full convective flow throughout the Earth's entire history, so parameterized convection models have been developed and are widely used. Convection occurring in the Earth's interior has some complicated aspects: large variations in viscosity, internal heating, phase boundaries, etc. In particular, the viscosity contrast has a significant effect on the efficiency of convective heat transport. Parameterized convection treats viscosity variation artificially, so it has many limitations. We developed an alternative method based on the concept of "mixing length theory", which relates the local thermal gradient to the local convective velocity of a fluid parcel. Convective heat transport is identified with an effective thermal diffusivity, and the horizontally averaged temperature profile and heat flux can be calculated by solving a thermal conduction problem. In estimating the parcel's velocity, effects such as variable viscosity can be included. In this study, we confirm that the temperature profile can be calculated correctly by this method through comparison with experimental and 2-D calculation results. We further show the effect of the viscosity contrast on the thermal structure of the convective fluid, and calculate the relationship between the Nusselt number and the modified Rayleigh number.

  13. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.

    PubMed

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is effective at combining evidence from different sensors. However, when the evidence is highly conflicting, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611

  14. A Theory-Based Approach to Restructuring Middle Level Schools.

    ERIC Educational Resources Information Center

    Midgley, Carol; Maehr, Martin L.

    This paper describes the implementation of a reform program in a middle school located in a relatively large school district in southeastern Michigan. First, an integrative theory is presented as a promising framework for reforming middle-grade schools. The theory was developed within a social-cognitive framework that emphasizes the importance of…

  15. An Approach to Theory-Based Youth Programming

    ERIC Educational Resources Information Center

    Duerden, Mat D.; Gillard, Ann

    2011-01-01

    A key but often overlooked aspect of intentional, out-of-school-time programming is the integration of a guiding theoretical framework. The incorporation of theory in programming can provide practitioners valuable insights into essential processes and principles of successful programs. While numerous theories exist that relate to youth development…

  16. Predicting the Number of Public Computer Terminals Needed for an On-Line Catalog: A Queuing Theory Approach.

    ERIC Educational Resources Information Center

    Knox, A. Whitney; Miller, Bruce A.

    1980-01-01

    Describes a method for estimating the number of cathode ray tube terminals needed for public use of an online library catalog. Authors claim method could also be used to estimate needed numbers of microform readers for a computer output microform (COM) catalog. Formulae are included. (Author/JD)
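
    The sizing problem this record addresses is a classic M/M/c (Erlang C) calculation: given a patron arrival rate and a mean transaction time, compute the probability of queuing and the mean wait for c terminals. The sketch below assumes Poisson arrivals and exponential session times; it is a standard textbook formulation, not a reproduction of the authors' formulae.

```python
from math import factorial

def erlang_c(c, lam, mu):
    """P(an arriving patron must wait) in an M/M/c queue: c terminals,
    arrival rate lam, service rate mu per terminal (same time unit)."""
    a = lam / mu          # offered load in Erlangs
    rho = a / c           # utilization per terminal
    if rho >= 1.0:
        raise ValueError("unstable queue: utilization must be < 1")
    tail = (a ** c / factorial(c)) / (1.0 - rho)
    return tail / (sum(a ** k / factorial(k) for k in range(c)) + tail)

def mean_wait(c, lam, mu):
    """Mean time a patron waits for a free terminal (excluding use time)."""
    return erlang_c(c, lam, mu) / (c * mu - lam)
```

    For example, with patrons arriving at 2 per minute, 1-minute average sessions, and 3 terminals, the probability of waiting is exactly 4/9 and the mean wait is 4/9 of a minute; adding a fourth terminal drops both sharply, which is the trade-off such a study quantifies.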

  17. The Development of an Attribution-Based Theory of Motivation: A History of Ideas

    ERIC Educational Resources Information Center

    Weiner, Bernard

    2010-01-01

    The history of ideas guiding the development of an attribution-based theory of motivation is presented. These influences include the search for a "grand" theory of motivation (from drive and expectancy/value theory), an attempt to represent how the past may influence the present and the future (as Thorndike accomplished), and the incorporation of…

  18. The Development of an Attribution-Based Theory of Motivation: A History of Ideas

    ERIC Educational Resources Information Center

    Weiner, Bernard

    2010-01-01

    The history of ideas guiding the development of an attribution-based theory of motivation is presented. These influences include the search for a "grand" theory of motivation (from drive and expectancy/value theory), an attempt to represent how the past may influence the present and the future (as Thorndike accomplished), and the incorporation of…

  19. Kinetic theory based modeling of Type II core collapse supernovae

    NASA Astrophysics Data System (ADS)

    Strother, Terrance; Bauer, Wolfgang

    2010-06-01

    Motivated by the success of kinetic theory in describing observables in intermediate- and high-energy heavy ion collisions, we use kinetic theory to model the dynamics of core collapse supernovae. The specific way that we employ kinetic theory to solve the relevant transport equations allows us to explicitly model the propagation of neutrinos and a full ensemble of nuclei and to treat neutrino-matter interactions in a very general way. With these abilities, our preliminary calculations have revealed dynamics that may prove to be an entirely new neutrino-capture-induced supernova explosion mechanism.

  20. The Application of Carl Rogers' Person-Centered Learning Theory to Web-Based Instruction.

    ERIC Educational Resources Information Center

    Miller, Christopher T.

    This paper provides a review of literature that relates research on Carl Rogers' person-centered learning theory to Web-based learning. Based on the review of the literature, a set of criteria is described that can be used to determine how closely a Web-based course matches the different components of Rogers' person-centered learning theory. Using…

  1. Seismic site-response analysis based on random vibration theory

    NASA Astrophysics Data System (ADS)

    Kang, T.; Jang, H.

    2013-12-01

    Local geology influences earthquake ground motions, which is important when specifying ground motion levels for seismic design in practice. This effect is quantified through site response analysis, which involves the propagation of seismic waves from bedrock to the free surface through soft layers. Site response analysis provides one or several sets of scale factors, given as a function of frequency, at the surface. Empirical characterization of site response requires a large data set over a wide range of magnitudes and distances of events. In reality, especially in low- to moderate-seismicity regions such as the Korean Peninsula, empirical characterization of site response is not feasible; numerical modeling is the only viable tool for site response in those regions. On the other hand, most conventional modeling procedures include a step for developing appropriate synthetic waveforms as input motions to be used in site response analyses. The waveforms are typically synthesized by matching a spectrum, such as a uniform hazard response spectrum, on basement rock obtained from the seismic hazard analysis. However, these synthetics are fundamentally problematic in spite of spectral matching, because the matching is based on the amplitude spectrum only, without phase information. As an alternative, an approach based on random vibration theory (RVT) is introduced that removes the need for waveform generation. In RVT, a given response spectrum can be converted into a power spectral density function. The approach is performed in the frequency domain and deals with the statistical representation of responses. It requires the transfer function for the velocity profile of a site. The transfer function is initially developed by computing receiver functions with the reflectivity method, assuming no attenuation for the profile, under consideration of various incidence angles. Then the transfer function is iteratively updated with varying attenuation until the results are compatible with the observed modulus and damping, which can be obtained through in-situ or lab tests for the profile. After the final iteration on the transfer function, the maximum amplification responses can be obtained with the extreme values of shear stress and strain on the profile. Thus this approach combines the observational results for material properties with analytical results based on reflectivity calculations of a layered structure, which makes it possible to estimate site response while reducing unphysical manipulations.

  2. [Peplau's theory of interpersonal relations: an analysis based on Barnum's model].

    PubMed

    de Almeida, Vitória de Cássia Félix; Lopes, Marcos Venícios de Oliveira; Damasceno, Marta Maria Coelho

    2005-06-01

    The use of theories in Nursing reflects a movement in the profession towards the autonomy and delimitation of its actions. It is therefore highly relevant that theories be analyzed for their applicability to practice. The object of this study is an analytical-descriptive examination of Peplau's Theory of Interpersonal Relations in Nursing using the model of analysis proposed by Barbara Barnum. Among the structural components of a theory that may be analyzed, the element "process" was chosen, a method recommended for the development of nursing actions, and it was submitted to Barnum's criteria of usefulness. The assessment showed that Peplau's theoretical presuppositions are operational and may serve as a basis in any situation in which nurses communicate and interact with their patients. PMID:16060308

  3. Toward A Brain-Based Theory of Beauty

    PubMed Central

    Ishizu, Tomohiro; Zeki, Semir

    2011-01-01

    We wanted to learn whether activity in the same area(s) of the brain correlate with the experience of beauty derived from different sources. 21 subjects took part in a brain-scanning experiment using functional magnetic resonance imaging. Prior to the experiment, they viewed pictures of paintings and listened to musical excerpts, both of which they rated on a scale of 1–9, with 9 being the most beautiful. This allowed us to select three sets of stimuli–beautiful, indifferent and ugly–which subjects viewed and heard in the scanner, and rated at the end of each presentation. The results of a conjunction analysis of brain activity showed that, of the several areas that were active with each type of stimulus, only one cortical area, located in the medial orbito-frontal cortex (mOFC), was active during the experience of musical and visual beauty, with the activity produced by the experience of beauty derived from either source overlapping almost completely within it. The strength of activation in this part of the mOFC was proportional to the strength of the declared intensity of the experience of beauty. We conclude that, as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources–musical and visual–and probably by other sources as well. This has led us to formulate a brain-based theory of beauty. PMID:21755004

  4. Quantum mechanical embedding theory based on a unique embedding potential

    SciTech Connect

    Chen Huang; Pavone, Michele; Carter, Emily A.

    2011-04-21

    We remove the nonuniqueness of the embedding potential that exists in most previous quantum mechanical embedding schemes by letting the environment and embedded region share a common embedding (interaction) potential. To efficiently solve for the embedding potential, an optimized effective potential method is derived. This embedding potential, which eschews use of approximate kinetic energy density functionals, is then used to describe the environment while a correlated wavefunction (CW) treatment of the embedded region is employed. We first demonstrate the accuracy of this new embedded CW (ECW) method by calculating the van der Waals binding energy curve between a hydrogen molecule and a hydrogen chain. We then examine the prototypical adsorption of CO on a metal surface, here the Cu(111) surface. In addition to obtaining proper site ordering (top site most stable) and binding energies within this theory, the ECW exhibits dramatic changes in the p-character of the CO 4σ and 5σ orbitals upon adsorption that agree very well with x-ray emission spectra, providing further validation of the theory. Finally, we generalize our embedding theory to spin-polarized quantum systems and discuss the connection between our theory and partition density functional theory.

  5. Enhancing Student Learning in Knowledge-Based Courses: Integrating Team-Based Learning in Mass Communication Theory Classes

    ERIC Educational Resources Information Center

    Han, Gang; Newell, Jay

    2014-01-01

    This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an…

  6. Enhancing Student Learning in Knowledge-Based Courses: Integrating Team-Based Learning in Mass Communication Theory Classes

    ERIC Educational Resources Information Center

    Han, Gang; Newell, Jay

    2014-01-01

    This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an

  7. Capacity and delay estimation for roundabouts using conflict theory.

    PubMed

    Qu, Zhaowei; Duan, Yuzhou; Hu, Hongyu; Song, Xianmin

    2014-01-01

    To estimate the capacity of roundabouts more accurately, the priority rank of each stream is determined through the classification technique given in the Highway Capacity Manual 2010 (HCM2010), which is based on macroscopic analysis of the relationship between entry flow and circulating flow. Then a conflict matrix is established using the additive conflict flow method and by considering the impacts of traffic characteristics and limited priority with high volume. Correspondingly, the conflict relationships of streams are built using probability theory. Furthermore, the entry capacity model of roundabouts is built, and sensitivity analysis is conducted on the model parameters. Finally, the entrance delay model is derived using queuing theory, and the proposed capacity model is compared with the model proposed by Wu and that in the HCM2010. The results show that the capacity calculated by the proposed model is lower than the others for an A-type roundabout, while it is basically consistent with the estimated values from HCM2010 for a B-type roundabout. PMID:24982982
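
    As an illustration of the queuing-theory step, the simplest entrance-delay model treats each entry as an M/M/1 queue whose service rate is the entry capacity. The paper's actual delay model for roundabout entries is more elaborate, so the following is only a toy sketch of the underlying formulas:

```python
def mm1_delay(lam, mu):
    """Mean queue wait and total delay for an M/M/1 queue with arrival
    rate lam and service (capacity) rate mu, in the same time unit."""
    if lam >= mu:
        raise ValueError("unstable: arrival rate must be below capacity")
    wq = lam / (mu * (mu - lam))  # mean wait in queue
    w = 1.0 / (mu - lam)          # mean time in system (wait + service)
    return wq, w
```

    With 0.1 veh/s arriving at an entry whose capacity is 0.25 veh/s, the mean queue wait is 8/3 s and the total delay 20/3 s; as the arrival rate approaches capacity, both diverge, which is why capacity estimates feed directly into delay estimates.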

  8. Capacity and Delay Estimation for Roundabouts Using Conflict Theory

    PubMed Central

    Qu, Zhaowei; Duan, Yuzhou; Hu, Hongyu; Song, Xianmin

    2014-01-01

    To estimate the capacity of roundabouts more accurately, the priority rank of each stream is determined through the classification technique given in the Highway Capacity Manual 2010 (HCM2010), which is based on macroscopic analysis of the relationship between entry flow and circulating flow. Then a conflict matrix is established using the additive conflict flow method and by considering the impacts of traffic characteristics and limited priority with high volume. Correspondingly, the conflict relationships of streams are built using probability theory. Furthermore, the entry capacity model of roundabouts is built, and sensitivity analysis is conducted on the model parameters. Finally, the entrance delay model is derived using queuing theory, and the proposed capacity model is compared with the model proposed by Wu and that in the HCM2010. The results show that the capacity calculated by the proposed model is lower than the others for an A-type roundabout, while it is basically consistent with the estimated values from HCM2010 for a B-type roundabout. PMID:24982982

  9. How Is a Science Lesson Developed and Implemented Based on Multiple Intelligences Theory?

    ERIC Educational Resources Information Center

    Kaya, Osman Nafiz

    2008-01-01

    The purpose of this study is to present the whole process step-by-step of how a science lesson can be planned and implemented based on Multiple Intelligences (MI) theory. First, it provides the potential of the MI theory for science teaching and learning. Then an MI science lesson that was developed based on a modified model in the literature and…

  10. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert…

  11. Development and Evaluation of a Theory-Based Physical Activity Guidebook for Breast Cancer Survivors

    ERIC Educational Resources Information Center

    Vallance, Jeffrey K.; Courneya, Kerry S.; Taylor, Lorian M.; Plotnikoff, Ronald C.; Mackey, John R.

    2008-01-01

    This study's objective was to develop and evaluate the suitability and appropriateness of a theory-based physical activity (PA) guidebook for breast cancer survivors. Guidebook content was constructed based on the theory of planned behavior (TPB) using salient exercise beliefs identified by breast cancer survivors in previous research. Expert

  12. A Theory-Driven Integrative Process/Outcome Evaluation of a Concept-Based Nursing Curriculum

    ERIC Educational Resources Information Center

    Fromer, Rosemary F.

    2013-01-01

    The current trend in curriculum revision in nursing education is concept-based learning, but little research has been done on concept-based curricula in nursing education. The study used a theory-driven integrative process/outcome evaluation. Embedded in this theory-driven integrative process/outcome evaluation was a causal comparative…

  13. Logical Thinking in Children; Research Based on Piaget's Theory.

    ERIC Educational Resources Information Center

    Sigel, Irving E., Ed.; Hooper, Frank H., Ed.

    Theoretical and empirical research derived from Piagetian theory is collected on the intellectual development of the elementary school child and his acquisition and utilization of conservation concepts. The articles present diversity of method and motive in the results of replication (validation studies of the description of cognitive growth) and…

  14. Videogames, Tools for Change: A Study Based on Activity Theory

    ERIC Educational Resources Information Center

    Méndez, Laura; Lacasa, Pilar

    2015-01-01

    Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…

  15. Automatic Trading Agent. RMT Based Portfolio Theory and Portfolio Selection

    NASA Astrophysics Data System (ADS)

    Snarska, M.; Krzych, J.

    2006-11-01

    Portfolio theory is a very powerful tool in modern investment theory. It is helpful in estimating the risk of an investor's portfolio, which arises from lack of information, uncertainty and incomplete knowledge of reality, which forbid a perfect prediction of future price changes. Despite its many advantages, this tool is little known and not widely used among investors on the Warsaw Stock Exchange. The main reason for avoiding this method is its high level of complexity and immense calculations. The aim of this paper is to introduce an automatic decision-making system, which allows a single investor to use the complex methods of Modern Portfolio Theory (MPT). The key tool in MPT is the analysis of an empirical covariance matrix. This matrix, obtained from historical data, is biased by such a high amount of statistical uncertainty that it can be seen as random. By bringing into practice the ideas of Random Matrix Theory (RMT), the noise is removed or significantly reduced, so that future risk and return are better estimated and controlled. These concepts are applied to the Warsaw Stock Exchange Simulator {http://gra.onet.pl}. The result of the simulation is an 18% gain, in comparison with a 10% loss for the Warsaw Stock Exchange main index, WIG.
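
    The RMT cleaning step described above is commonly implemented by clipping correlation-matrix eigenvalues that fall inside the Marchenko-Pastur noise band, whose upper edge for unit-variance data is (1 + sqrt(N/T))^2. The sketch below is a generic version of that filter, not the simulator's exact procedure, which the abstract does not specify:

```python
import numpy as np

def mp_clean(returns):
    """Clean a correlation matrix of asset returns by replacing eigenvalues
    below the Marchenko-Pastur upper edge with their mean (noise clipping).
    returns: T x N array (T observations of N assets)."""
    t, n = returns.shape
    corr = np.corrcoef(returns, rowvar=False)       # N x N correlation matrix
    lam_plus = (1.0 + np.sqrt(n / t)) ** 2          # MP upper edge, unit variance
    w, v = np.linalg.eigh(corr)
    noise = w < lam_plus
    if noise.any():
        w[noise] = w[noise].mean()                  # preserve the noise-band trace
    cleaned = v @ np.diag(w) @ v.T
    np.fill_diagonal(cleaned, 1.0)                  # restore exact unit diagonal
    return cleaned
```

    Eigenvalues above the edge (market and sector modes) are kept as genuine signal; the cleaned matrix then feeds the usual Markowitz optimization in place of the raw, noise-dominated estimate.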

  16. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  17. Videogames, Tools for Change: A Study Based on Activity Theory

    ERIC Educational Resources Information Center

    Méndez, Laura; Lacasa, Pilar

    2015-01-01

    Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…

  18. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  19. Stability Analysis for Car Following Model Based on Control Theory

    NASA Astrophysics Data System (ADS)

    Meng, Xiang-Pei; Li, Zhi-Peng; Ge, Hong-Xia

    2014-05-01

    Stability analysis is one of the key issues in car-following theory. The stability analysis with Lyapunov function for the two velocity difference car-following model (for short, TVDM) is conducted and the control method to suppress traffic congestion is introduced. Numerical simulations are given and results are consistent with the theoretical analysis.

  20. Effective Contraceptive Use: An Exploration of Theory-Based Influences

    ERIC Educational Resources Information Center

    Peyman, N.; Oakley, D.

    2009-01-01

    The purpose of this study was to explore factors that influence oral contraceptive (OC) use among women in Iran using the Theory of Planned Behavior (TPB) and concept of self-efficacy (SE). The study sample consisted of 360 married OC users, aged 18-49 years recruited at public health centers of Mashhad, 900 km east of Tehran. SE had the strongest…

  1. Course Management and Students' Expectations: Theory-Based Considerations

    ERIC Educational Resources Information Center

    Buckley, M. Ronald; Novicevic, Milorad M.; Halbesleben, Jonathon R. B.; Harvey, Michael

    2004-01-01

    This paper proposes a theoretical, yet practical, framework for managing the formation process of students' unrealistic expectations in a college course. Using relational contracting theory, alternative teacher interventions, aimed at effective management of students' expectations about the course, are described. Also, the formation of the student…

  2. Fragment-Based Time-Dependent Density Functional Theory

    NASA Astrophysics Data System (ADS)

    Mosquera, Martín A.; Jensen, Daniel; Wasserman, Adam

    2013-07-01

    Using the Runge-Gross theorem that establishes the foundation of time-dependent density functional theory, we prove that for a given electronic Hamiltonian, choice of initial state, and choice of fragmentation, there is a unique single-particle potential (dubbed time-dependent partition potential) which, when added to each of the preselected fragment potentials, forces the fragment densities to evolve in such a way that their sum equals the exact molecular density at all times. This uniqueness theorem suggests new ways of computing the time-dependent properties of electronic systems via fragment-time-dependent density functional theory calculations. We derive a formally exact relationship between the partition potential and the total density, and illustrate our approach on a simple model system for binary fragmentation in a laser field.

  3. Local control theory in trajectory-based nonadiabatic dynamics

    SciTech Connect

    Curchod, Basile F. E.; Penfold, Thomas J.; Rothlisberger, Ursula; Tavernelli, Ivano

    2011-10-15

    In this paper, we extend the implementation of nonadiabatic molecular dynamics within the framework of time-dependent density-functional theory in an external field described in Tavernelli et al.[Phys. Rev. A 81, 052508 (2010)] by calculating on-the-fly pulses to control the population transfer between electronic states using local control theory. Using Tully's fewest switches trajectory surface hopping method, we perform MD to control the photoexcitation of LiF and compare the results to quantum dynamics (QD) calculations performed within the Heidelberg multiconfiguration time-dependent Hartree package. We show that this approach is able to calculate a field that controls the population transfer between electronic states. The calculated field is in good agreement with that obtained from QD, and the differences that arise are discussed in detail.

  4. Circuit theory and model-based inference for landscape connectivity

    USGS Publications Warehouse

    Hanks, Ephraim M.; Hooten, Mevin B.

    2013-01-01

    Circuit theory has seen extensive recent use in the field of ecology, where it is often applied to study functional connectivity. The landscape is typically represented by a network of nodes and resistors, with the resistance between nodes a function of landscape characteristics. The effective distance between two locations on a landscape is represented by the resistance distance between the nodes in the network. Circuit theory has been applied to many other scientific fields for exploratory analyses, but parametric models for circuits are not common in the scientific literature. To model circuits explicitly, we demonstrate a link between Gaussian Markov random fields and contemporary circuit theory using a covariance structure that induces the necessary resistance distance. This provides a parametric model for second-order observations from such a system. In the landscape ecology setting, the proposed model provides a simple framework where inference can be obtained for effects that landscape features have on functional connectivity. We illustrate the approach through a landscape genetics study linking gene flow in alpine chamois (Rupicapra rupicapra) to the underlying landscape.
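
    The resistance distance used here as an effective ecological distance can be computed directly from the Moore-Penrose pseudoinverse L+ of the graph Laplacian: R_ij = L+_ii + L+_jj - 2 L+_ij. A minimal sketch of that computation follows (an illustrative helper, not the authors' code):

```python
import numpy as np

def resistance_distance(conductance, i, j):
    """Effective resistance between nodes i and j of a resistor network.
    conductance: symmetric N x N matrix of edge conductances (0 = no edge)."""
    laplacian = np.diag(conductance.sum(axis=1)) - conductance
    lp = np.linalg.pinv(laplacian)  # Moore-Penrose pseudoinverse of L
    return lp[i, i] + lp[j, j] - 2.0 * lp[i, j]
```

    A quick sanity check: in a triangle of unit resistors, the direct 1-ohm edge sits in parallel with the 2-ohm two-edge path, so the resistance distance between any pair is 2/3, which the formula reproduces.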

  5. Gravitational Cherenkov losses in theories based on modified Newtonian dynamics.

    PubMed

    Milgrom, Mordehai

    2011-03-18

    Survival of high-energy cosmic rays (HECRs) against gravitational Cherenkov losses is shown not to cast strong constraints on modified Newtonian dynamics (MOND) theories that are compatible with general relativity (GR): theories that coincide with GR for accelerations ≫ a_0 (a_0 is the MOND constant). The energy-loss rate, Ė, is many orders of magnitude smaller than those derived in the literature for theories with no extra scale. The modification to GR that underlies Ė enters only beyond the MOND radius of the particle: r_M = (Gp/ca_0)^(1/2). The spectral cutoff, entering Ė quadratically, is thus r_M^(-1), not k_dB = p/ℏ. Thus, Ė is smaller than published rates, which use k_dB, by a factor ∼ (r_M k_dB)^2 ≈ 10^39 (cp/3×10^11 GeV)^3. Losses are important only beyond D_loss ≈ q ℓ_M, where q is a dimensionless factor, and ℓ_M = c^2/a_0 is the MOND length, which is ≈ 2π times the Hubble distance. PMID:21469855

  6. The viscoplasticity theory based on overstress applied to the modeling of a nickel base superalloy at 815 °C

    NASA Technical Reports Server (NTRS)

    Krempl, E.; Lu, H.; Yao, D.

    1988-01-01

    Short term strain rate change, creep and relaxation tests were performed in an MTS computer controlled servohydraulic testing machine. Aging and recovery were found to be insignificant for test times not exceeding 30 hrs. The material functions and constants of the theory were identified from results of strain rate change tests. Numerical integration of the theory for relaxation and creep tests showed good predictive capabilities of the viscoplasticity theory based on overstress.

  7. Cis control of gene expression in E.coli by ribosome queuing at an inefficient translational stop signal

    PubMed Central

    Jin, Haining; Björnsson, Asgeir; Isaksson, Leif A.

    2002-01-01

    An UGA stop codon context which is inefficient because of the 3′-flanking context and the last two amino acids in the gene protein product has a negative effect on gene expression, as shown using a model protein A′ gene. This is particularly true at low mRNA levels, corresponding to a high intracellular ribosome/mRNA ratio. The negative effect is smaller if this ratio is decreased, or if the distance between the initiation and termination signals is increased. The results suggest that an inefficient termination codon can cause ribosomal pausing and queuing along the upstream mRNA region, thus blocking translation initiation of short genes. This cis control effect is dependent on the stop codon context, including the C-terminal amino acids in the gene product, the translation initiation signal strength, the ribosome/mRNA ratio and the size of the mRNA coding region. A large proportion of poorly expressed natural Escherichia coli genes are small, and the weak termination codon UGA is under-represented in small, highly expressed E.coli genes as compared with the efficient stop codon UAA. PMID:12169638

  8. Collective learning modeling based on the kinetic theory of active particles

    NASA Astrophysics Data System (ADS)

    Burini, D.; De Lillo, S.; Gibelli, L.

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.

  9. Collective learning modeling based on the kinetic theory of active particles.

    PubMed

    Burini, D; De Lillo, S; Gibelli, L

    2016-03-01

    This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom. PMID:26542123

  10. Theory of Spike Timing-Based Neural Classifiers

    NASA Astrophysics Data System (ADS)

    Rubin, Ran; Monasson, Rémi; Sompolinsky, Haim

    2010-11-01

    We study the computational capacity of a model neuron, the tempotron, which classifies sequences of spikes by linear-threshold operations. We use statistical mechanics and extreme value theory to derive the capacity of the system in random classification tasks. In contrast with its static analog, the perceptron, the tempotron's solution space consists of a large number of small clusters of weight vectors. The capacity of the system per synapse is finite in the large size limit and weakly diverges with the stimulus duration relative to the membrane and synaptic time constants.
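
    The tempotron's decision rule itself is simple: input spikes are integrated through a causal postsynaptic kernel, and the neuron fires if the peak voltage crosses a threshold. A minimal sketch of that rule follows; the difference-of-exponentials kernel and the parameter values are conventional modeling choices, not taken from this abstract:

```python
import numpy as np

def tempotron_output(spike_times, weights, t_grid, tau_m=15.0, tau_s=3.75, theta=1.0):
    """Binary classification by a tempotron: fire iff the peak
    membrane potential crosses the threshold theta.

    spike_times: list (one entry per synapse) of input spike time arrays
    weights: synaptic efficacies, one per synapse
    t_grid: time points (ms) at which the voltage is evaluated
    """
    # Normalize the difference-of-exponentials kernel so its peak is 1
    t_peak = tau_m * tau_s / (tau_m - tau_s) * np.log(tau_m / tau_s)
    v0 = 1.0 / (np.exp(-t_peak / tau_m) - np.exp(-t_peak / tau_s))
    v = np.zeros_like(t_grid)
    for w, times in zip(weights, spike_times):
        for ti in times:
            s = t_grid - ti
            s[s < 0] = np.inf  # causal kernel: no effect before the spike
            v += w * v0 * (np.exp(-s / tau_m) - np.exp(-s / tau_s))
    return v.max() >= theta

t = np.linspace(0.0, 100.0, 2001)
# Coincident strong inputs drive the voltage past threshold;
# a single weak input does not.
fires = tempotron_output([[10.0], [12.0]], [0.9, 0.9], t)  # True
quiet = tempotron_output([[10.0]], [0.5], t)               # False
```

    The linear-threshold operation mentioned in the abstract is exactly this peak-voltage comparison; learning adjusts the weights so the peak lands on the correct side of the threshold for each spike pattern.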

  11. A precepted leadership course based on Bandura's social learning theory.

    PubMed

    Haddock, K S

    1994-01-01

    Transition from student to registered nurse (RN) has long been cited as a difficult time for new graduates entering health care. Bandura's (1977) theory of social learning guided a revision of a nursing leadership course required of baccalaureate student nurses (BSNs) in their final semester. The preceptorship allowed students to work closely with and to practice modeled behaviors of RNs and then receive feedback and reinforcement from both the preceptor and the supervising faculty member. Students were thus prepared to function better in the reality of the practice setting. Positive outcomes were experienced by students, BSN preceptors, faculty, and nurse administrators. PMID:7997295

  12. Patient and nurse experiences of theory-based care.

    PubMed

    Flanagan, Jane

    2009-04-01

    The pre-surgery nursing practice model derived from Newman's theory was developed to change the delivery of nursing care in a pre-surgical clinic. Guided by the theoretical knowledge of health as expanding consciousness, transpersonal caring, and reflective practice, key practice changes included a) incorporating Newman's praxis process, b) changing the physical space, and c) providing opportunities to reflect on practice. The purpose of this study was to utilize a phenomenological approach to evaluate a new model of care among 31 patients and 4 nurses. PMID:19342715

  13. Interactive Image Segmentation Framework Based On Control Theory

    PubMed Central

    Zhu, Liangjia; Kolesov, Ivan; Karasev, Peter; Tannenbaum, Allen

    2016-01-01

    Segmentation of anatomical structures in medical imagery is a key step in a variety of clinical applications. Designing a generic, automated method that works for various structures and imaging modalities is a daunting task. Instead of proposing a new specific segmentation algorithm, in this paper, we present a general design principle on how to integrate user interactions from the perspective of control theory. In this formulation, Lyapunov stability analysis is employed to design and analyze an interactive segmentation system. The effectiveness and robustness of the proposed method are demonstrated. PMID:26900204

  14. Interactive image segmentation framework based on control theory

    NASA Astrophysics Data System (ADS)

    Zhu, Liangjia; Kolesov, Ivan; Ratner, Vadim; Karasev, Peter; Tannenbaum, Allen

    2015-03-01

    Segmentation of anatomical structures in medical imagery is a key step in a variety of clinical applications. Designing a generic, automated method that works for various structures and imaging modalities is a daunting task. Instead of proposing a new specific segmentation algorithm, in this paper, we present a general design principle on how to integrate user interactions from the perspective of control theory. In this formulation, Lyapunov stability analysis is employed to design an interactive segmentation system. The effectiveness and robustness of the proposed method are demonstrated.

  15. Game theory based band selection for hyperspectral images

    NASA Astrophysics Data System (ADS)

    Shi, Aiye; He, Zhenyu; Huang, Fengchen

    2015-12-01

    This paper proposes a new evaluation criterion for band selection in hyperspectral imagery. A combination of information content and class separability serves as the evaluation criterion, while the correlation between bands is imposed as a constraint. In addition, game theory is introduced into the band selection process to reconcile the potential conflict between the two criteria, information and class separability, when searching for the optimal band combination. The experimental results show that the proposed method is effective on AVIRIS hyperspectral data.

  16. A theory-based approach to teaching young children about health: A recipe for understanding.

    PubMed

    Nguyen, Simone P; McCullough, Mary Beth; Noble, Ashley

    2011-08-01

    The theory-theory account of conceptual development posits that children's concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children's learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children's learning of complex, real-world concepts. PMID:21894237

  17. A theory-based approach to teaching young children about health: A recipe for understanding

    PubMed Central

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237

  18. A Model of Rater Behavior in Essay Grading Based on Signal Detection Theory

    ERIC Educational Resources Information Center

    DeCarlo, Lawrence T.

    2005-01-01

    An approach to essay grading based on signal detection theory (SDT) is presented. SDT offers a basis for understanding rater behavior with respect to the scoring of construct responses, in that it provides a theory of psychological processes underlying the raters' behavior. The approach also provides measures of the precision of the raters and the…

  19. The TEACH Method: An Interactive Approach for Teaching the Needs-Based Theories Of Motivation

    ERIC Educational Resources Information Center

    Moorer, Cleamon, Jr.

    2014-01-01

    This paper describes an interactive approach for explaining and teaching the Needs-Based Theories of Motivation. The acronym TEACH stands for Theory, Example, Application, Collaboration, and Having Discussion. This method can help business students to better understand and distinguish the implications of Maslow's Hierarchy of Needs,…

  20. Social Learning Theory Parenting Intervention Promotes Attachment-Based Caregiving in Young Children: Randomized Clinical Trial

    ERIC Educational Resources Information Center

    O'Connor, Thomas G.; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen

    2013-01-01

    Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear if the benefits of these interventions extend beyond aspects of the parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social…

  1. Cooperative Learning: Improving University Instruction by Basing Practice on Validated Theory

    ERIC Educational Resources Information Center

    Johnson, David W.; Johnson, Roger T.; Smith, Karl A.

    2014-01-01

    Cooperative learning is an example of how theory validated by research may be applied to instructional practice. The major theoretical base for cooperative learning is social interdependence theory. It provides clear definitions of cooperative, competitive, and individualistic learning. Hundreds of research studies have validated its basic…

  2. Social Learning Theory Parenting Intervention Promotes Attachment-Based Caregiving in Young Children: Randomized Clinical Trial

    ERIC Educational Resources Information Center

    O'Connor, Thomas G.; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen

    2013-01-01

    Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear if the benefits of these interventions extend beyond aspects of the parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social…

  3. Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact

    ERIC Educational Resources Information Center

    McQuillin, Samuel D.; Lyons, Michael D.

    2016-01-01

    This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…

  4. The TEACH Method: An Interactive Approach for Teaching the Needs-Based Theories Of Motivation

    ERIC Educational Resources Information Center

    Moorer, Cleamon, Jr.

    2014-01-01

    This paper describes an interactive approach for explaining and teaching the Needs-Based Theories of Motivation. The acronym TEACH stands for Theory, Example, Application, Collaboration, and Having Discussion. This method can help business students to better understand and distinguish the implications of Maslow's Hierarchy of Needs,…

  5. Curriculum Design for Junior Life Sciences Based Upon the Theories of Piaget and Skinner. Final Report.

    ERIC Educational Resources Information Center

    Pearce, Ella Elizabeth

    Four seventh grade life science classes, given curriculum materials based upon Piagetian theories of intellectual development and Skinner's theories of secondary reinforcement, were compared with four control classes from the same school districts. Nine students from each class, who (at the pretest) were at the concrete operations stage of…

  6. A reactive mobile robot based on a formal theory of action

    SciTech Connect

    Baral, C.; Floriano, L.; Gabaldon, A.

    1996-12-31

    One of the agendas behind research in reasoning about actions is to develop autonomous agents (robots) that can act in a dynamic world. Early attempts to use theories of reasoning about actions and planning to formulate a robot control architecture were not successful for several reasons: the early theories, based on STRIPS and its extensions, allowed only observations about the initial state. A robot control architecture using these theories was usually of the form: (i) make observations; (ii) use the action theory to construct a plan to achieve the goal; and (iii) execute the plan.

  7. Determination of the Sediment Carrying Capacity Based on Perturbed Theory

    PubMed Central

    Ni, Zhi-hui; Zeng, Qiang; Wu, Li-chun

    2014-01-01

    Building on previous studies of sediment carrying capacity, a new method based on perturbed theory is proposed. By taking into account the average water depth, the average flow velocity, the settling velocity, and other influencing factors, and by introducing the median grain size as a main influencing factor, we derived a new sediment carrying capacity formula. The coefficients were determined by dimensional analysis, multiple linear regression, and the least squares method. The new formula was then verified against measured data from natural rivers and from flume tests, and its results were compared with those calculated by the Cao, Zhang, Li, Engelund-Hansen, Ackers-White, and Yang formulas. The comparison shows that the new method is of high accuracy. It could be a useful reference for the determination of sediment carrying capacity. PMID:25136652
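
    The coefficient-fitting step described above (dimensional analysis fixes a functional form, least squares fixes the constants) can be sketched as follows. The power-law form and every number below are purely hypothetical placeholders, not the paper's formula or data:

```python
import numpy as np

# Hypothetical power-law form suggested by dimensional analysis:
#   S = K * (U**3 / (g * h * w))**m
# (S: concentration, U: velocity, h: depth, w: settling velocity).
# Taking logs makes it linear, so K and m follow from least squares.
U = np.array([1.2, 1.8, 2.5, 3.1])        # mean velocity (m/s), synthetic
h = np.array([2.0, 2.4, 3.0, 3.5])        # mean depth (m), synthetic
w = np.array([0.02, 0.025, 0.03, 0.035])  # settling velocity (m/s), synthetic
S = np.array([0.9, 2.6, 6.8, 11.5])       # measured concentration (kg/m^3), synthetic
g = 9.81

x = np.log(U**3 / (g * h * w))
A = np.column_stack([np.ones_like(x), x])         # design matrix [1, x]
(lnK, m), *_ = np.linalg.lstsq(A, np.log(S), rcond=None)
K = np.exp(lnK)
```

    Verification against independent river and flume measurements, as done in the paper, would then compare `K * (U**3 / (g * h * w))**m` with observed concentrations.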

  8. Lens-based theory of the Lau effect.

    PubMed

    Kolodziejczyk, A; Jaroszewicz, Z; Henao, R

    2000-04-01

    A new theoretical model of the Lau effect is presented. The transmittance of a diffraction grating can be expressed in an equivalent form as the sum of transmittances of thin cylindrical lenses. Therefore it is possible to explain the Lau effect on the basis of the well-known imaging properties of lenses. According to the given approach, the Lau fringes are created by overlapped images of the first grating that are formed by a set of lenses corresponding to the second grating in the setup. The theory leads to an exhaustive description of the Lau-effect parameters. In particular, one can indicate the shape of the Lau fringes and localize planes of the fringes dependent on the axial distance between gratings and their periods. PMID:10757179

  9. Effects of a social cognitive theory-based hip fracture prevention web site for older adults.

    PubMed

    Nahm, Eun-Shim; Barker, Bausell; Resnick, Barbara; Covington, Barbara; Magaziner, Jay; Brennan, Patricia Flatley

    2010-01-01

    The purposes of this study were to develop a Social Cognitive Theory-based, structured Hip Fracture Prevention Web site for older adults and conduct a preliminary evaluation of its effectiveness. The Theory-based, structured Hip Fracture Prevention Web site is composed of learning modules and a moderated discussion board. A total of 245 older adults recruited from two Web sites and a newspaper advertisement were randomized into the Theory-based, structured Hip Fracture Prevention Web site and the conventional Web sites groups. Outcomes included (1) knowledge (hip fractures and osteoporosis), (2) self-efficacy and outcome expectations, and (3) calcium intake and exercise and were assessed at baseline, end of treatment (2 weeks), and follow-up (3 months). Both groups showed significant improvement in most outcomes. For calcium intake, only the Theory-based, structured Hip Fracture Prevention Web site group showed improvement. None of the group and time interactions were significant. The Theory-based, structured Hip Fracture Prevention Web site group, however, was more satisfied with the intervention. The discussion board usage was significantly correlated with outcome gains. Despite several limitations, the findings showed some preliminary effectiveness of Web-based health interventions for older adults and the use of a Theory-based, structured Hip Fracture Prevention Web site as a sustainable Web structure for online health behavior change interventions. PMID:20978408

  10. Testing Expectancy Theory Predictions Using Behaviorally Based Measures of Motivational Effort for Engineers

    ERIC Educational Resources Information Center

    Arvey, Richard D.; Neel, C. Warren

    1974-01-01

    Expectancy theory predictions were tested using a sample of engineers who had been rated on dimensions of work motivation or effort (in contrast to performance) using the behaviorally based rating scales designed by Landy and Guion (1970). (Author)

  11. An anti-attack model based on complex network theory in P2P networks

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Lu, Songnian; Zhao, Dandan; Zhang, Aixin; Li, Jianhua

    2012-04-01

    Complex network theory is a useful way to study many real systems. In this paper, an anti-attack model based on complex network theory is introduced. The mechanism of this model rests on a dynamic compensation process and a reverse percolation process in P2P networks. The main results of the paper are: (i) a dynamic compensation process can turn an attacked P2P network into a power-law (PL) network with exponential cutoff; (ii) a local healing process can restore the maximum degree of peers in an attacked P2P network to a normal level; (iii) a restoring process based on reverse percolation theory reconnects the fragmentary peers of an attacked P2P network into a giant connected component. In this way, the model based on complex network theory can be effectively utilized for anti-attack and protection purposes in P2P networks.
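
    The success criterion in point (iii), reconnecting fragments into a giant connected component, can be checked with a plain breadth-first search. This is a generic sketch of that check, not the paper's restoring algorithm:

```python
from collections import deque

def giant_component_fraction(n, edges):
    """Fraction of the n peers that lie in the largest connected component."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen = [False] * n
    best = 0
    for s in range(n):
        if seen[s]:
            continue
        seen[s] = True
        q, size = deque([s]), 1
        while q:                      # BFS over one component
            u = q.popleft()
            for w in adj[u]:
                if not seen[w]:
                    seen[w] = True
                    size += 1
                    q.append(w)
        best = max(best, size)
    return best / n

# Attacked network: two fragments {0, 1, 2} and {3, 4}
frag = giant_component_fraction(5, [(0, 1), (1, 2), (3, 4)])            # 0.6
# One restored link merges them into a single giant component
whole = giant_component_fraction(5, [(0, 1), (1, 2), (3, 4), (2, 3)])   # 1.0
```

    A restoring process succeeds when this fraction climbs back toward 1 as edges are re-established.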

  12. Interlaminar Stresses by Refined Beam Theories and the Sinc Method Based on Interpolation of Highest Derivative

    NASA Technical Reports Server (NTRS)

    Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander

    2010-01-01

    Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analysis by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single layer theories often results in inaccuracies near the boundaries and when the lamina have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as the refined zigzag theory.

  13. Improved routing strategy based on gravitational field theory

    NASA Astrophysics Data System (ADS)

    Song, Hai-Quan; Guo, Jin

    2015-10-01

    Routing and path selection are crucial for many communication and logistic applications. We study the interaction between nodes and packets and establish a simple model, based on gravitational field theory, for describing the attraction of a node to a packet during transmission, considering both the real and the potential congestion of the nodes. On the basis of this model, we propose a gravitational field routing strategy that considers the attractions of all of the nodes on the travel path to the packet. To illustrate the efficiency of the proposed routing algorithm, we introduce the order parameter to measure the throughput of the network by the critical value of the phase transition from a free-flow phase to a congested phase, and study the distribution of betweenness centrality and traffic congestion. Simulations show that, compared with the shortest-path routing strategy, the gravitational field routing strategy considerably enhances the throughput of the network and balances the traffic load, and nearly all of the nodes are used efficiently. Project supported by the Technology and Development Research Project of China Railway Corporation (Grant No. 2012X007-D) and the Key Program of Technology and Development Research Foundation of China Railway Corporation (Grant No. 2012X003-A).
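
    How a congestion-aware cost redirects packets away from crowded nodes can be sketched as follows. The cost function here is an illustrative assumption, since the abstract does not give the paper's gravitational formula:

```python
def path_cost(path, queue_len, alpha=1.0):
    """Cost of delivering a packet along `path`, trading path length
    against congestion at the nodes it visits.

    Assumed functional form (not the paper's): each hop costs 1, and
    every node after the source repels the packet in proportion to its
    current queue length, weighted by alpha.
    """
    hops = len(path) - 1
    congestion = sum(queue_len.get(node, 0) for node in path[1:])
    return hops + alpha * congestion

queues = {"B": 8, "C": 0, "D": 0}
short_but_jammed = path_cost(["A", "B", "E"], queues)       # 2 hops + 8 queued
longer_but_free = path_cost(["A", "C", "D", "E"], queues)   # 3 hops + 0 queued
# A congestion-aware strategy prefers the longer, uncongested path,
# which is how load gets balanced across otherwise idle nodes.
```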

  14. Theory and practical based approach to chronic total occlusions.

    PubMed

    Sianos, Georgios; Konstantinidis, Nikolaos V; Di Mario, Carlo; Karvounis, Haralambos

    2016-01-01

    Coronary chronic total occlusions (CTOs) represent the most technically challenging lesion subset that interventional cardiologists face. CTOs are identified in up to one third of patients referred for coronary angiography and remain seriously undertreated with percutaneous techniques. The complexity of these procedures and the suboptimal success rates over a long period of time, along with the perception that CTOs are lesions with limited scope for recanalization, account for the underutilization of CTO percutaneous coronary intervention (PCI). In recent years, dedicated groups of experts in Japan, Europe, and the United States have fostered the development and standardization of modern CTO recanalization techniques, achieving success rates far beyond 90% while coping with lesions of increasing complexity. Numerous studies support the rationale of CTO revascularization following documentation of viability and ischemia in the territory distal to the CTO. Successful CTO PCI provides better tolerance of future acute coronary syndromes and can significantly improve angina and left ventricular function. Randomized trials are under way to further explore the prognostic benefit of CTO revascularization. The following review reports on the theory and the most recent advances in the field of CTO recanalization, in an attempt to promote a more balanced approach in patients with chronically occluded coronary arteries. PMID:26860695

  15. Treatment motivation in drug users: a theory-based analysis.

    PubMed

    Longshore, Douglas; Teruya, Cheryl

    2006-02-01

    Motivation for drug use treatment is widely regarded as crucial to a client's engagement in treatment and success in quitting drug use. Motivation is typically measured with items reflecting high treatment readiness (e.g., perceived need for treatment and commitment to participate) and low treatment resistance (e.g., skepticism regarding benefits of treatment). Building upon reactance theory and the psychotherapeutic construct of resistance, we conceptualized these two aspects of treatment motivation - readiness and resistance - as distinct constructs and examined their predictive power in a sample of 1295 drug-using offenders referred to treatment while on probation. The sample was 60.7% African Americans, 33.5% non-Hispanic Whites, and 21.2% women; their ages ranged from 16 to 63 years old. Interviews occurred at treatment entry and 6 months later. Readiness (but not resistance) predicted treatment retention during the 6-month period. Resistance (but not readiness) predicted drug use, especially among offenders for whom the treatment referral was coercive. These findings suggest that readiness and resistance should both be assessed among clients entering treatment, especially when the referral is coercive. Intake and counseling protocols should address readiness and resistance separately. PMID:16051447

  16. Energy-Efficiency Analysis of a Distributed Queuing Medium Access Control Protocol for Biomedical Wireless Sensor Networks in Saturation Conditions

    PubMed Central

    Otal, Begonya; Alonso, Luis; Verikoukis, Christos

    2011-01-01

    The aging population and the high quality-of-life expectations in our society lead to the need for more efficient and affordable healthcare solutions. For this reason, this paper aims at the optimization of Medium Access Control (MAC) protocols for biomedical wireless sensor networks, or wireless Body Sensor Networks (BSNs). The schemes presented here always keep in mind the efficient management of channel resources and the overall minimization of sensors' energy consumption in order to prolong sensors' battery life. The fact that the IEEE 802.15.4 MAC does not fully satisfy BSN requirements highlights the need for new scalable MAC solutions, which guarantee low-power consumption to the maximum number of body sensors in high-density areas (i.e., in saturation conditions). To emphasize the limitations of IEEE 802.15.4 MAC, this article presents a detailed overview of this de facto standard for Wireless Sensor Networks (WSNs), which serves as an introduction and initial description of the proposed Distributed Queuing (DQ) MAC protocol for BSN scenarios. Within this framework, an extensive DQ MAC energy-consumption analysis in saturation conditions is presented to evaluate its performance in relation to IEEE 802.15.4 MAC in highly dense BSNs. The obtained results show that the proposed scheme outperforms IEEE 802.15.4 MAC in average energy consumption per information bit, thus providing a better overall performance that scales appropriately to BSNs under high traffic conditions. These benefits are obtained by eliminating back-off periods and collisions in data packet transmissions, while minimizing the control overhead. PMID:22319351

  17. Designing Site-Based Systems, Deriving a Theory of Practice.

    ERIC Educational Resources Information Center

    Bauer, Scott C.

    1998-01-01

    Reviews five dimensions (focus, scope, structure, process, and capacity) of an organizational design used by 20 New York districts planning for site-based management (SBM) implementation. The confusion surrounding devolution of decision making hinders districts' efforts to effect changes in intermediate variables (job satisfaction and staff…

  18. Content Based Image Retrieval and Information Theory: A General Approach.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.; Barhen, Jacob

    2001-01-01

    Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…

  19. A Conceptual Framework Based on Activity Theory for Mobile CSCL

    ERIC Educational Resources Information Center

    Zurita, Gustavo; Nussbaum, Miguel

    2007-01-01

    There is a need for collaborative group activities that promote student social interaction in the classroom. Handheld computers interconnected by a wireless network allow people who work on a common task to interact face to face while maintaining the mediation afforded by a technology-based system. Wirelessly interconnected handhelds open up new…

  20. An Efficacious Theory-Based Intervention for Stepfamilies

    ERIC Educational Resources Information Center

    Forgatch, Marion S.; DeGarmo, David S.; Beldavs, Zintars G.

    2005-01-01

    This article evaluates the efficacy of the Oregon model of Parent Management Training (PMTO) in the stepfamily context. Sixty-seven of 110 participants in the Marriage and Parenting in Stepfamilies (MAPS) program received a PMTO-based intervention. Participants in the randomly assigned experimental group displayed a large effect in benefits to…

  1. Integrated Models of School-Based Prevention: Logic and Theory

    ERIC Educational Resources Information Center

    Domitrovich, Celene E.; Bradshaw, Catherine P.; Greenberg, Mark T.; Embry, Dennis; Poduska, Jeanne M.; Ialongo, Nicholas S.

    2010-01-01

    School-based prevention programs can positively impact a range of social, emotional, and behavioral outcomes. Yet the current climate of accountability pressures schools to restrict activities that are not perceived as part of the core curriculum. Building on models from public health and prevention science, we describe an integrated approach to…

  3. Cost performance satellite design using queueing theory. [via digital simulation

    NASA Technical Reports Server (NTRS)

    Hein, G. F.

    1975-01-01

    A modified Poisson arrival, infinite-server queuing model is used to determine the effects of limiting the number of broadcast channels (C) of a direct broadcast satellite used for public service purposes (remote health care, education, etc.). The model is based on the reproductive property of the Poisson distribution. A difference equation has been developed to describe the change in the Poisson parameter. When all initially delayed arrivals reenter the system, a polynomial of order (C + 1) must be solved to determine the effective value of the Poisson parameter. When less than 100% of the arrivals reenter the system, the effective value must be determined by solving a transcendental equation. The model was used to determine the minimum number of channels required for a disaster warning satellite without degradation in performance. Results predicted by the queuing model were compared with the results of digital simulation.
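    The reentry behavior described above can be approximated with a standard Erlang-loss fixed point (a simplification, not the paper's difference-equation model; the parameter values are illustrative): a fraction r of blocked arrivals reoffer their traffic, inflating the effective offered load until the equation a_eff = a / (1 - r·B(C, a_eff)) balances.

```python
from math import factorial

def erlang_b(channels, load):
    """Blocking probability B(C, a) for an Erlang loss system."""
    num = load ** channels / factorial(channels)
    den = sum(load ** k / factorial(k) for k in range(channels + 1))
    return num / den

def effective_load(offered, channels, retry_frac, tol=1e-10, max_iter=1000):
    """Fixed point for the effective offered load when a fraction
    retry_frac of blocked arrivals reenters the system."""
    a_eff = offered
    for _ in range(max_iter):
        a_next = offered / (1.0 - retry_frac * erlang_b(channels, a_eff))
        if abs(a_next - a_eff) < tol:
            return a_next
        a_eff = a_next
    return a_eff

# With no reentry the effective load equals the offered load; with full
# reentry (100% of delayed arrivals return) it is strictly larger.
print(effective_load(5.0, 8, 0.0), effective_load(5.0, 8, 1.0))
```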

  4. Scale-invariant entropy-based theory for dynamic ordering.

    PubMed

    Mahulikar, Shripad P; Kumari, Priti

    2014-09-01

    Dynamically Ordered self-organized dissipative structure exists in various forms and at different scales. This investigation first introduces the concept of an isolated embedding system, which embeds an open system, e.g., dissipative structure and its mass and/or energy exchange with its surroundings. Thereafter, scale-invariant theoretical analysis is presented using thermodynamic principles for Order creation, existence, and destruction. The sustainability criterion for Order existence based on its structured mass and/or energy interactions with the surroundings is mathematically defined. This criterion forms the basis for the interrelationship of physical parameters during sustained existence of dynamic Order. It is shown that the sufficient condition for dynamic Order existence is approached if its sustainability criterion is met, i.e., its destruction path is blocked. This scale-invariant approach has the potential to unify the physical understanding of universal dynamic ordering based on entropy considerations. PMID:25273200

  5. Scale-invariant entropy-based theory for dynamic ordering

    SciTech Connect

    Mahulikar, Shripad P. E-mail: spm@aero.iitb.ac.in; Kumari, Priti

    2014-09-01

    Dynamically Ordered self-organized dissipative structure exists in various forms and at different scales. This investigation first introduces the concept of an isolated embedding system, which embeds an open system, e.g., dissipative structure and its mass and/or energy exchange with its surroundings. Thereafter, scale-invariant theoretical analysis is presented using thermodynamic principles for Order creation, existence, and destruction. The sustainability criterion for Order existence based on its structured mass and/or energy interactions with the surroundings is mathematically defined. This criterion forms the basis for the interrelationship of physical parameters during sustained existence of dynamic Order. It is shown that the sufficient condition for dynamic Order existence is approached if its sustainability criterion is met, i.e., its destruction path is blocked. This scale-invariant approach has the potential to unify the physical understanding of universal dynamic ordering based on entropy considerations.

  6. INTEGRATED MODELS OF SCHOOL-BASED PREVENTION: LOGIC AND THEORY

    PubMed Central

    DOMITROVICH, CELENE E.; BRADSHAW, CATHERINE P.; GREENBERG, MARK T.; EMBRY, DENNIS; PODUSKA, JEANNE M.; IALONGO, NICHOLAS S.

    2011-01-01

    School-based prevention programs can positively impact a range of social, emotional, and behavioral outcomes. Yet the current climate of accountability pressures schools to restrict activities that are not perceived as part of the core curriculum. Building on models from public health and prevention science, we describe an integrated approach to school-based prevention. These models leverage the most effective structural and content components of social-emotional and behavioral health prevention interventions. Integrated interventions are expected to have additive and synergistic effects that result in greater impacts on multiple student outcomes. Integrated programs are also expected to be more efficient to deliver, easier to implement with high quality and integrity, and more sustainable. We provide a detailed example of the process through which the PAX-Good Behavior Game and the Promoting Alternative Thinking Strategies (PATHS) curriculum were integrated into the PATHS to PAX model. Implications for future research are proposed.

  7. Microscopic theory of anomalous diffusion based on particle interactions.

    PubMed

    Lutsko, James F; Boon, Jean Pierre

    2013-08-01

    We present a master equation formulation based on a Markovian random walk model that exhibits subdiffusion, classical diffusion, and superdiffusion as a function of a single parameter. The nonclassical diffusive behavior is generated by allowing for interactions between a population of walkers. At the macroscopic level, this gives rise to a nonlinear Fokker-Planck equation. The diffusive behavior is reflected not only in the mean squared displacement (∝ t^γ with 0 < γ ≤ 1.5) but also in the existence of self-similar scaling solutions of the Fokker-Planck equation. We give a physical interpretation of sub- and superdiffusion in terms of the attractive and repulsive interactions between the diffusing particles and we discuss analytically the limiting values of the exponent γ. Simulations based on the master equation are shown to be in agreement with the analytical solutions of the nonlinear Fokker-Planck equation in all three diffusion regimes. PMID:24032776
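    The baseline case of the exponent γ can be checked numerically. The sketch below is illustrative only; it omits the walker interactions that generate sub- and superdiffusion in the paper, so it estimates γ for free ±1 random walkers from a log-log fit of the mean squared displacement and recovers the classical value γ ≈ 1.

```python
import math
import random

def msd_exponent(n_walkers=2000, n_steps=150, seed=7):
    """Estimate gamma in MSD ~ t**gamma for independent +/-1 random walkers
    via an ordinary least-squares fit in log-log coordinates."""
    rng = random.Random(seed)
    pos = [0] * n_walkers
    log_t, log_msd = [], []
    for t in range(1, n_steps + 1):
        for i in range(n_walkers):
            pos[i] += rng.choice((-1, 1))
        msd = sum(x * x for x in pos) / n_walkers
        log_t.append(math.log(t))
        log_msd.append(math.log(msd))
    mean_t = sum(log_t) / len(log_t)
    mean_m = sum(log_msd) / len(log_msd)
    num = sum((a - mean_t) * (b - mean_m) for a, b in zip(log_t, log_msd))
    den = sum((a - mean_t) ** 2 for a in log_t)
    return num / den

# Free walkers diffuse classically, so the fitted exponent is close to 1.
print(round(msd_exponent(), 2))
```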

  8. A Proxy Signature Scheme Based on Coding Theory

    NASA Astrophysics Data System (ADS)

    Jannati, Hoda; Falahati, Abolfazl

    A proxy signature allows a proxy signer to sign messages on behalf of an original signer. Such a signature is used when the original signer is not available to sign a specific document. In this paper, we introduce a new proxy signature scheme based on Stern's identification scheme, whose security depends on the syndrome decoding problem. The proposed scheme is the first code-based proxy signature and remains secure in the presence of quantum computers. In this scheme, the operations to be performed are linear and very simple; thus the signature is generated quickly and can be implemented quite efficiently using a smart card. The proposed scheme also satisfies the unforgeability, undeniability, non-transferability, and distinguishability properties, which are the security requirements for a proxy signature.

  9. Research on e-learning services based on ontology theory

    NASA Astrophysics Data System (ADS)

    Liu, Rui

    2013-07-01

    E-learning services can realize network learning resource sharing and interoperability, but they cannot realize automatic discovery, implementation, and integration of services. This paper proposes a framework for e-learning services based on ontology; ontology technology is applied to the publication and discovery of e-learning services in order to realize accurate and efficient retrieval and utilization of e-learning services.

  10. An open-shell restricted Hartree-Fock perturbation theory based on symmetric spin orbitals

    NASA Technical Reports Server (NTRS)

    Lee, Timothy J.; Jayatilaka, Dylan

    1993-01-01

    A new open-shell perturbation theory is formulated in terms of symmetric spin orbitals. Only one set of spatial orbitals is required, thereby reducing the number of independent coefficients in the perturbed wavefunctions. For second order, the computational cost is shown to be similar to a closed-shell calculation. This formalism is therefore more efficient than the recently developed RMP, ROMP or RMP-MBPT theories. The perturbation theory described herein was designed to have a close correspondence with our recently proposed coupled-cluster theory based on symmetric spin orbitals. The first-order wavefunction contains contributions from only doubly excited determinants. Equilibrium structures and vibrational frequencies determined from second-order perturbation theory are presented for OH, NH, CH, O2, NH2 and CH2.

  11. Models, strategies, and tools. Theory in implementing evidence-based findings into health care practice.

    PubMed

    Sales, Anne; Smith, Jeffrey; Curran, Geoffrey; Kochevar, Laura

    2006-02-01

    This paper presents a case for careful consideration of theory in planning to implement evidence-based practices into clinical care. As described, theory should be tightly linked to strategic planning through careful choice or creation of an implementation framework. Strategies should be linked to specific interventions and/or intervention components to be implemented, and the choice of tools should match the interventions and overall strategy, linking back to the original theory and framework. The thesis advanced is that in most studies where there is an attempt to implement planned change in clinical processes, theory is used loosely. An example of linking theory to intervention design is presented from a Mental Health Quality Enhancement Research Initiative effort to increase appropriate use of antipsychotic medication among patients with schizophrenia in the Veterans Health Administration. PMID:16637960

  12. Venture Capital Investment Base on Grey Relational Theory

    NASA Astrophysics Data System (ADS)

    Zhang, Xubo

    This paper builds a venture capital investment project selection evaluation model based on risk-weighted investment return, using grey relational analysis. The risks and returns in the venture capital investment project selection process are analyzed; they mainly concern management ability, operation ability, market ability, exit options, and investment cost. Eighteen sub-indicators are the impact factors contributing to these five evaluation aspects. Grey relational analysis is used to evaluate venture capital investment selection, and the optimal solution of the risk-weighted, double-objective investment selection evaluation model is obtained. An example is used to demonstrate the model.
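    The grey relational analysis step can be sketched as follows (a generic textbook formulation with the usual distinguishing coefficient ρ = 0.5, not the paper's exact model; the indicator data are made up): each candidate project's indicator series is scored against an ideal reference series, and the mean relational coefficient is the project's grade.

```python
def grey_relational_grades(reference, candidates, rho=0.5):
    """Grey relational grade of each candidate series against a reference
    series (values assumed pre-normalized to comparable scales)."""
    deltas = [[abs(r - c) for r, c in zip(reference, cand)] for cand in candidates]
    flat = [d for row in deltas for d in row]
    d_min, d_max = min(flat), max(flat)
    grades = []
    for row in deltas:
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Candidate 0 matches the ideal reference exactly, so its grade is 1.0;
# candidate 1 deviates on every indicator and scores lower.
ref = [1.0, 0.8, 0.9, 0.7, 0.6]
cands = [[1.0, 0.8, 0.9, 0.7, 0.6], [0.5, 0.4, 0.7, 0.3, 0.2]]
print(grey_relational_grades(ref, cands))
```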

  13. Mindfulness-based cognitive therapy: theory and practice.

    PubMed

    Sipe, Walter E B; Eisendrath, Stuart J

    2012-02-01

    Mindfulness-based cognitive therapy (MBCT) incorporates elements of cognitive-behavioural therapy with mindfulness-based stress reduction into an 8-session group program. Initially conceived as an intervention for relapse prevention in people with recurrent depression, it has since been applied to various psychiatric conditions. Our paper aims to briefly describe MBCT and its putative mechanisms of action, and to review the current findings about the use of MBCT in people with mood and anxiety disorders. The therapeutic stance of MBCT focuses on encouraging patients to adopt a new way of being and relating to their thoughts and feelings, while placing little emphasis on altering or challenging specific cognitions. Preliminary functional neuroimaging studies are consistent with an account of mindfulness improving emotional regulation by enhancing cortical regulation of limbic circuits and attentional control. Research findings from several randomized controlled trials suggest that MBCT is a useful intervention for relapse prevention in patients with recurrent depression, with efficacy that may be similar to maintenance antidepressants. Preliminary studies indicate MBCT also shows promise in the treatment of active depression, including treatment-resistant depression. Pilot studies have also evaluated MBCT in bipolar disorder and anxiety disorders. Patient and clinician resources for further information on mindfulness and MBCT are provided. PMID:22340145

  14. Safety models incorporating graph theory based transit indicators.

    PubMed

    Quintero, Liliana; Sayed, Tarek; Wahba, Mohamed M

    2013-01-01

    There is a considerable need for tools to enable the evaluation of the safety of transit networks at the planning stage. One interesting approach for the planning of public transportation systems is the study of networks. Network techniques involve the analysis of systems by viewing them as a graph composed of a set of vertices (nodes) and edges (links). Once the transport system is visualized as a graph, various network properties can be evaluated based on the relationships between the network elements. Several indicators can be calculated including connectivity, coverage, directness and complexity, among others. The main objective of this study is to investigate the relationship between network-based transit indicators and safety. The study develops macro-level collision prediction models that explicitly incorporate transit physical and operational elements and transit network indicators as explanatory variables. Several macro-level (zonal) collision prediction models were developed using a generalized linear regression technique, assuming a negative binomial error structure. The models were grouped into four main themes: transit infrastructure, transit network topology, transit route design, and transit performance and operations. The safety models showed that collisions were significantly associated with transit network properties such as: connectivity, coverage, overlapping degree and the Local Index of Transit Availability. As well, the models showed a significant relationship between collisions and some transit physical and operational attributes such as the number of routes, frequency of routes, bus density, length of bus and 3+ priority lanes. PMID:22831497

  15. [The Chinese urban metabolisms based on the emergy theory].

    PubMed

    Song, Tao; Cai, Jian-Ming; Ni, Pan; Yang, Zhen-Shan

    2014-04-01

    By using emergy indices of urban metabolisms, this paper analyzed 31 Chinese urban metabolisms' systematic structures and characteristics in 2000 and 2010. The results showed that Chinese urban metabolisms were characterized as resource consumption and coastal external dependency. Non-renewable resource emergy accounted for a higher proportion of the total emergy in the inland cities' urban metabolisms. The emergy of imports and exports accounted for the vast majority of urban metabolic systems in metropolises and coastal cities such as Beijing and Shanghai, showing a significant externally-oriented metabolic characteristic. Based on that, the related policies were put forward: to develop the renewable resource and energy industry; to improve the non-renewable resource and energy utilization efficiencies; to optimize the import and export structure of services, cargo and fuel; and to establish the flexible management mechanism of urban metabolisms. PMID:25011303

  16. Game Theory Based Trust Model for Cloud Environment

    PubMed Central

    Gokulnath, K.; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theory-based approach for achieving trust at the bootload level from the perspective of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the problems of cold start and whitewashing are addressed by the proposed method. In addition, appropriate mapping of cloud users' applications to cloud service providers for segregating trust levels is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics like execution time, accuracy, error identification, and undecidability of the resources were considered. PMID:26380365
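    The Nash-equilibrium reasoning can be illustrated with a minimal pure-strategy equilibrium finder over a toy two-player SLA game (the payoffs and action names are invented for illustration and are not from the paper): each player chooses whether to honor the agreement, and an equilibrium is a pair of mutual best responses.

```python
from itertools import product

def pure_nash_equilibria(payoffs):
    """Pure-strategy Nash equilibria of a two-player game.
    payoffs[(i, j)] = (row payoff, column payoff) for actions i, j."""
    rows = {i for i, _ in payoffs}
    cols = {j for _, j in payoffs}
    eq = []
    for i, j in product(sorted(rows), sorted(cols)):
        row_best = all(payoffs[(i, j)][0] >= payoffs[(k, j)][0] for k in rows)
        col_best = all(payoffs[(i, j)][1] >= payoffs[(i, l)][1] for l in cols)
        if row_best and col_best:
            eq.append((i, j))
    return eq

# A toy SLA game: mutual honoring is best, but mutual violation is also
# self-enforcing -- a stag-hunt-like structure with two equilibria.
game = {
    ("honor", "honor"): (3, 3), ("honor", "violate"): (0, 2),
    ("violate", "honor"): (2, 0), ("violate", "violate"): (1, 1),
}
print(pure_nash_equilibria(game))
```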

  17. Tire grip identification based on strain information: Theory and simulations

    NASA Astrophysics Data System (ADS)

    Carcaterra, A.; Roveri, N.

    2013-12-01

    A novel technique for the identification of tire-road grip conditions is presented. It is based on the use of strain information from inside the tire, from which relevant characteristics of the tire-road contact can be extracted, in part through a factor named the area slip ratio. This process forms the basis of a technology for grip identification that requires a new model of the tire dynamics. The model permits the determination of closed-form analytical relationships between the measured strain and the area slip ratio. On this basis, a procedure is presented that can extract the contact kinematic parameters from the time history of the internal strain of the rolling tire. Numerical simulations offer the chance to validate the identification algorithm.

  18. The Effect Of The Materials Based On Multiple Intelligence Theory Upon The Intelligence Groups' Learning Process

    NASA Astrophysics Data System (ADS)

    Oral, I.; Dogan, O.

    2007-04-01

    The aim of this study is to find out the effect of course materials based on Multiple Intelligence Theory on the intelligence groups' learning process. In conclusion, the results proved that materials prepared according to Multiple Intelligence Theory have a considerable effect on the students' learning process. This effect was particularly evident in the student groups with musical-rhythmic, verbal-linguistic, interpersonal-social, and naturalist intelligences.

  19. Method for PE Pipes Fusion Jointing Based on TRIZ Contradictions Theory

    NASA Astrophysics Data System (ADS)

    Sun, Jianguang; Tan, Runhua; Gao, Jinyong; Wei, Zihui

    The core of the TRIZ theories is contradiction detection and solution. TRIZ provides various methods for contradiction solution, but they are not systematized. Combined with the technique-system conception, this paper summarizes an integrated solution method for contradictions based on the TRIZ contradiction theory. According to the method, a flowchart of the integrated solution method for contradictions is given. As a case study, a method for the fusion jointing of PE pipes is analyzed.

  20. Paying for express checkout: competition and price discrimination in multi-server queuing systems.

    PubMed

    Deck, Cary; Kimbrough, Erik O; Mongrain, Steeve

    2014-01-01

    We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting cost-based-price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single queue seller focuses on the patient shoppers thereby driving down prices and profits while increasing consumer surplus. PMID:24667809
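    The trade-off between the single-server seller and the two-server seller can be quantified with the standard M/M/c (Erlang C) mean-wait formula, a textbook queueing baseline rather than the authors' competition model (the arrival and service rates below are illustrative):

```python
from math import factorial

def mmc_mean_wait(arrival_rate, service_rate, servers):
    """Mean queueing delay Wq in an M/M/c queue via the Erlang C formula."""
    a = arrival_rate / service_rate            # offered load
    rho = a / servers                          # per-server utilization
    assert rho < 1, "queue is unstable"
    term = a ** servers / factorial(servers) / (1 - rho)
    p_wait = term / (sum(a ** k / factorial(k) for k in range(servers)) + term)
    return p_wait / (servers * service_rate - arrival_rate)

# Same per-server load: pooling two servers more than halves the mean wait,
# which is the efficiency the two-queue seller can price-discriminate over.
w1 = mmc_mean_wait(0.9, 1.0, 1)   # M/M/1 at rho = 0.9 -> Wq = 9.0
w2 = mmc_mean_wait(1.8, 1.0, 2)   # M/M/2 at rho = 0.9
print(w1, w2)
```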

  1. Paying for Express Checkout: Competition and Price Discrimination in Multi-Server Queuing Systems

    PubMed Central

    Deck, Cary; Kimbrough, Erik O.; Mongrain, Steeve

    2014-01-01

    We model competition between two firms selling identical goods to customers who arrive in the market stochastically. Shoppers choose where to purchase based upon both price and the time cost associated with waiting for service. One seller provides two separate queues, each with its own server, while the other seller has a single queue and server. We explore the market impact of the multi-server seller engaging in waiting cost-based-price discrimination by charging a premium for express checkout. Specifically, we analyze this situation computationally and through the use of controlled laboratory experiments. We find that this form of price discrimination is harmful to sellers and beneficial to consumers. When the two-queue seller offers express checkout for impatient customers, the single queue seller focuses on the patient shoppers thereby driving down prices and profits while increasing consumer surplus. PMID:24667809

  2. Microfluidic, Bead-Based Assay: Theory and Experiments

    PubMed Central

    Thompson, Jason A.; Bau, Haim H.

    2009-01-01

    Microbeads are frequently used as a solid support for biomolecules such as proteins and nucleic acids in heterogeneous microfluidic assays. However, relatively few studies investigate the binding kinetics on modified bead surfaces in a microfluidics context. In this study, a customized hot embossing technique is used to stamp microwells in a thin plastic substrate where streptavidin-coated agarose beads are selectively placed and subsequently immobilized within a conduit. Biotinylated quantum dots are used as a label to monitor target analyte binding to the bead's surface. Three-dimensional finite element simulations are carried out to model the binding kinetics on the bead's surface. The model accounts for surface exclusion effects resulting from a single quantum dot occluding multiple receptor sites. The theoretical predictions are compared and favorably agree with experimental observations. The theoretical simulations provide a useful tool to predict how varying parameters affect microbead reaction kinetics and sensor performance. This study enhances our understanding of bead-based microfluidic assays and provides a design tool for developers of point-of-care, lab-on-chip devices for medical diagnosis, food and water quality inspection, and environmental monitoring. PMID:19766545
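    The surface binding kinetics being modeled can be sketched under a well-mixed Langmuir approximation that ignores the transport and site-occlusion effects captured by the paper's 3-D finite element model (the rate constants and analyte concentration are illustrative):

```python
def langmuir_occupancy(c, k_on, k_off, dt, n_steps):
    """Forward-Euler integration of fractional receptor occupancy theta(t)
    under Langmuir kinetics: dtheta/dt = k_on*c*(1 - theta) - k_off*theta."""
    theta = 0.0
    for _ in range(n_steps):
        theta += dt * (k_on * c * (1.0 - theta) - k_off * theta)
    return theta

# The long-time occupancy approaches the equilibrium value
# k_on*c / (k_on*c + k_off), here about 0.91.
k_on, k_off, c = 1e5, 1e-3, 1e-7   # illustrative, not fitted to the assay
theta_end = langmuir_occupancy(c, k_on, k_off, dt=1.0, n_steps=2000)
print(theta_end)
```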

  3. Score Reliability of a Test Composed of Passage-Based Testlets: A Generalizability Theory Perspective.

    ERIC Educational Resources Information Center

    Lee, Yong-Won

    The purpose of this study was to investigate the impact of local item dependence (LID) in passage-based testlets on the test score reliability of an English as a Foreign Language (EFL) reading comprehension test from the perspective of generalizability (G) theory. Definitions and causes of LID in passage-based testlets are reviewed within the…

  4. Applying Item Response Theory Methods to Design a Learning Progression-Based Science Assessment

    ERIC Educational Resources Information Center

    Chen, Jing

    2012-01-01

    Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…
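    The IRT machinery referenced here rests on item characteristic curves; a minimal two-parameter logistic (2PL) sketch (generic IRT, not the study's calibrated model) shows how ability and item difficulty interact:

```python
from math import exp

def irt_2pl(theta, a, b):
    """Two-parameter logistic IRT model: probability that a student with
    ability theta answers an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + exp(-a * (theta - b)))

# At theta == b the probability is exactly 0.5; it rises with ability,
# which is what lets responses place students on progression levels.
print(irt_2pl(0.0, 1.0, 0.0))   # 0.5
print(irt_2pl(1.5, 1.2, 0.0))   # > 0.5
```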

  5. Constraint-Based Modeling: From Cognitive Theory to Computer Tutoring--and Back Again

    ERIC Educational Resources Information Center

    Ohlsson, Stellan

    2016-01-01

    The ideas behind the constraint-based modeling (CBM) approach to the design of intelligent tutoring systems (ITSs) grew out of attempts in the 1980's to clarify how declarative and procedural knowledge interact during skill acquisition. The learning theory that underpins CBM was based on two conceptual innovations. The first innovation was to…

  6. Using Game Theory and Competition-Based Learning to Stimulate Student Motivation and Performance

    ERIC Educational Resources Information Center

    Burguillo, Juan C.

    2010-01-01

    This paper introduces a framework for using Game Theory tournaments as a base to implement Competition-based Learning (CnBL), together with other classical learning techniques, to motivate the students and increase their learning performance. The paper also presents a description of the learning activities performed along the past ten years of a…

  8. Controlling Retrieval during Practice: Implications for Memory-Based Theories of Automaticity

    ERIC Educational Resources Information Center

    Wilkins, Nicolas J.; Rawson, Katherine A.

    2011-01-01

    Memory-based processing theories of automaticity assume that shifts from algorithmic to retrieval-based processing underlie practice effects on response times. The current work examined the extent to which individuals can exert control over the involvement of retrieval during skill acquisition and the factors that may influence control. In two…

  10. Kinematics and dynamics of deployable structures with scissor-like-elements based on screw theory

    NASA Astrophysics Data System (ADS)

    Sun, Yuantao; Wang, Sanmin; Mills, James K.; Zhi, Changjian

    2014-07-01

    Because deployable structures are complex multi-loop structures, and because derivation methods that lead to simpler kinematic and dynamic equations of motion are the subject of ongoing research, the kinematics and dynamics of deployable structures with scissor-like elements are presented here based on screw theory and the principle of virtual work, respectively. According to the geometric characteristics of the deployable structure examined, the basic structural unit is the common scissor-like element (SLE). First, a spatial deployable structure comprised of three SLEs is defined, and the constraint topology graph is obtained. The equations of motion are then derived based on screw theory and the geometric nature of scissor elements. Second, to develop the dynamics of the whole deployable structure, the local coordinates of the SLEs and the Jacobian matrices of the center of mass of the deployable structure are derived. Then, the equivalent forces are assembled and added to the equations of motion based on the principle of virtual work. Finally, the dynamic behavior and unfolding process of the deployable structure are simulated, and plots of velocity, acceleration, and input torque are obtained from the simulation results. Screw theory not only provides an efficient solution formulation and theoretical guidance for complex multi-closed-loop deployable structures, but also extends to the dynamics of deployable structures. As an efficient mathematical tool, it yields simpler equations of motion.
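    The screw-theoretic bookkeeping rests on twists. As a minimal sketch (generic rigid-body kinematics, not the paper's SLE formulation), the velocity of any point on a body follows from the body's twist (ω, v):

```python
def point_velocity(omega, v, p):
    """Velocity of a point p on a rigid body whose motion is given by the
    twist (omega, v) in space coordinates: p_dot = omega x p + v."""
    cross = (omega[1] * p[2] - omega[2] * p[1],
             omega[2] * p[0] - omega[0] * p[2],
             omega[0] * p[1] - omega[1] * p[0])
    return tuple(c + vi for c, vi in zip(cross, v))

# Pure unit rotation about the z-axis: a point on the x-axis moves in +y.
print(point_velocity((0.0, 0.0, 1.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
```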

  11. Quantifying Uncertainty in Physics-Based Models with Measure Theory

    NASA Astrophysics Data System (ADS)

    Butler, T.

    2014-12-01

    The ultimate goal in scientific inference is to predict some unobserved behavior of a system often described by a specific set of quantities of interest (QoI) computed from the solution to a mathematical model. For example, given a contaminant transport model in a regional aquifer with specified porosity, initial concentrations, etc., we may analyze remediation strategies in order to achieve threshold tolerances in certain wells. Solution to this prediction problem is complicated by several sources of uncertainty. A primary source of uncertainty is in the specification of the input data and parameters that characterize the physical properties of a given state of the system. It is often impossible to experimentally observe the input data and parameters that characterize the physical properties of the modeled system, and any observable QoI are generally of the state of the system itself and may differ substantially from the QoI to be predicted. Thus, we must first solve an inverse problem using observable QoI to quantify uncertainties in the inputs to the model. The inverse problem is complicated by several issues. The solution of the model induces a "QoI map" from the space of model inputs to the observable QoI computed from the solution of the model. This QoI map is generally a "many-to-few" map that reduces the dimension implying that the deterministic inverse problem has set-valued solutions. Additionally, available data on the QoI are subject to natural variation and experimental/observational error modeled as probability measures implying that solutions of the inverse problem and forward prediction problem are given in terms of probability measures. We describe a novel measure-theoretic framework for the formulation and solution of the stochastic inverse problem for a deterministic physics-based model requiring minimal assumptions. A computational algorithm and error analysis are described accounting for both deterministic and stochastic sources of error.
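    The set-valued inverse described above can be caricatured with a crude rejection sampler (an ABC-style simplification, not the authors' measure-theoretic algorithm; the quadratic QoI map, prior, and tolerance are invented): prior samples whose QoI lands inside the observational noise band approximate the inverse set, here recovering both branches of a many-to-one map.

```python
import random

def rejection_inverse(qoi_map, sample_prior, q_obs, noise, n=20000, seed=1):
    """Keep prior samples whose QoI falls within the observational noise
    band around q_obs, approximating the set-valued inverse of the map."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n):
        lam = sample_prior(rng)
        if abs(qoi_map(lam) - q_obs) < noise:
            kept.append(lam)
    return kept

# Many-to-one map q(lam) = lam**2: the inverse of q near 0.25 has two
# branches, around lam = -0.5 and lam = +0.5, and both are recovered.
post = rejection_inverse(lambda lam: lam * lam,
                         lambda rng: rng.uniform(-1.0, 1.0),
                         q_obs=0.25, noise=0.01)
print(min(post), max(post))
```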

  12. The Cultures of Contemporary Instructional Design Scholarship, Part Two: Developments Based on Constructivist and Critical Theory Foundations

    ERIC Educational Resources Information Center

    Willis, Jerry

    2011-01-01

    This article is the second in a series (see Willis, 2011) that looks at the current status of instructional design scholarship and theory. In this concluding article, the focus is on two cultures of ID work, one based on constructivist and interpretivist theory and the other based on critical theory and critical pedagogy. There are distinct…

  13. A Christian faith-based recovery theory: understanding God as sponsor.

    PubMed

    Timmons, Shirley M

    2012-12-01

    This article reports the development of a substantive theory to explain an evangelical Christian-based process of recovery from addiction. Faith-based, 12-step, mutual aid programs can improve drug abstinence by offering: (a) an intervention option alone and/or in conjunction with secular programs and (b) an opportunity for religious involvement. Although literature on religion, spirituality, and addiction is voluminous, traditional 12-step programs fail to explain the mechanism that underpins the process of Christian-based recovery (CR). This pilot study used grounded theory to explore and describe the essence of recovery of 10 former crack cocaine-addicted persons voluntarily enrolled in a CR program. Data were collected from in-depth interviews during 4 months of 2008. Audiotapes were transcribed verbatim, and the constant comparative method was used to analyze data, resulting in the basic social process theory, understanding God as sponsor. The theory was determined through writing theoretical memos that generated key elements that allow persons to recover: acknowledging God-centered crises, communicating with God, and planning for the future. Findings from this preliminary study identify important factors that can help persons in recovery sustain sobriety and help program administrators benefit from theory that guides the development of evidence-based addiction interventions. PMID:21046250

  14. Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory

    NASA Astrophysics Data System (ADS)

    Matsumura, Koki; Kawamoto, Masaru

    This paper proposed a new technique that constructs strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computations. Moreover, this paper proposed a method using prospect theory, from behavioral finance, to set a psychological bias for profit and deficit, and attempted to select the appropriate strike price of the option for higher investment efficiency. As a result, this technique produced good results and demonstrated the effectiveness of the trading model with the optimized dealing strategy.

  15. Social learning theory parenting intervention promotes attachment-based caregiving in young children: randomized clinical trial.

    PubMed

    O'Connor, Thomas G; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen

    2013-01-01

    Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear whether the benefits of these interventions extend beyond the aspects of parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social learning theory-based treatment promoted change in qualities of the parent-child relationship derived from attachment theory. In a randomized clinical trial, 174 four- to six-year-olds selected from a high-need urban area and stratified by conduct problems were assigned to a parenting program plus a reading intervention (n = 88) or a nonintervention condition (n = 86). In-home observations of parent-child interactions were assessed in three tasks: (a) free play, (b) challenge task, and (c) tidy up. Parenting behavior was coded according to behavioral theory using standard count measures of positive and negative parenting, and according to attachment theory using measures of sensitive responding and mutuality; children's attachment narratives were also assessed. Compared to the parents in the nonintervention group, parents allocated to the intervention showed increases in positive behavioral counts and sensitive responding; change in the behavioral count measures overlapped only modestly with attachment-based changes. There was no reliable change in children's attachment narratives associated with the intervention. The findings demonstrate that standard social learning theory-based parenting interventions can change broader aspects of parent-child relationship quality and raise clinical and conceptual questions about the distinctiveness of existing treatment models in parenting research. PMID:23020146

  16. Critical temperature of trapped interacting bosons from large-N -based theories

    NASA Astrophysics Data System (ADS)

    Kim, Tom; Chien, Chih-Chun

    2016-03-01

    Ultracold atoms provide clues to an important many-body problem regarding the dependence of the Bose-Einstein condensation transition temperature Tc on interactions. However, cold atoms are trapped in harmonic potentials and theoretical evaluations of the Tc shift of trapped interacting Bose gases are challenging. While previous predictions of the leading-order shift have been confirmed, more recent experiments exhibit higher-order corrections beyond available mean-field theories. By implementing two large-N -based theories with the local density approximation (LDA), we extract next-order corrections of the Tc shift. The leading-order large-N theory produces results quantitatively different from the latest experimental data. The leading-order auxiliary-field (LOAF) theory, containing both normal and anomalous density fields, captures the Tc shift accurately in the weak-interaction regime. However, the LOAF theory shows incompatible behavior with the LDA, and forcing the LDA leads to density discontinuities in the trap profiles. We present a phenomenological model based on the LOAF theory, which repairs the incompatibility and provides a prediction of the Tc shift in the stronger-interaction regime.

  17. Vervet monkeys solve a multiplayer "forbidden circle game" by queuing to learn restraint.

    PubMed

    Fruteau, Cécile; van Damme, Eric; Noë, Ronald

    2013-04-22

    In social dilemmas, the ability of individuals to coordinate their actions is crucial to reach group optima. Unless exacted by power or force, coordination in humans relies on a common understanding of the problem, which is greatly facilitated by communication. The lack of means of consultation about the nature of the problem and how to solve it may explain why multiagent coordination in nonhuman vertebrates has commonly been observed only when multiple individuals react instantaneously to a single stimulus, either natural or experimentally simulated, for example a predator, a prey, or a neighboring group. Here we report how vervet monkeys solved an experimentally induced coordination problem. In each of three groups, we trained a low-ranking female, the "provider," to open a container holding a large amount of food, which the providers only opened when all individuals dominant to them ("dominants") stayed outside an imaginary "forbidden circle" around it. Without any human guidance, the dominants learned restraint one by one, in hierarchical order from high to low. Once all dominants showed restraint immediately at the start of the trial, the providers opened the container almost instantly, saving all individuals opportunity costs due to lost foraging time. Solving this game required trial-and-error learning based on individual feedback from the provider to each dominant, and all dominants being patient enough to wait outside the circle while others learned restraint. Communication, social learning, and policing by high-ranking animals played no perceptible role. PMID:23541727

  18. A queuing model for designing multi-modality buried target detection systems: preliminary results

    NASA Astrophysics Data System (ADS)

    Malof, Jordan M.; Morton, Kenneth D.; Collins, Leslie M.; Torrione, Peter A.

    2015-05-01

    Many remote sensing modalities have been developed for buried target detection, each one offering its own relative advantages over the others. As a result there has been interest in combining several modalities into a single detection platform that benefits from the advantages of each constituent sensor, without suffering from their weaknesses. Traditionally this involves collecting data continuously on all sensors and then performing data, feature, or decision level fusion. While this is effective for lowering false alarm rates, this strategy neglects the potential benefits of a more general system-level fusion architecture. Such an architecture can involve dynamically changing which modalities are in operation. For example, a large standoff modality such as a forward-looking infrared (FLIR) camera can be employed until an alarm is encountered, at which point a high performance (but short standoff) sensor, such as ground penetrating radar (GPR), is employed. Because the system is dynamically changing its rate of advance and sensors, it becomes difficult to evaluate the expected false alarm rate and advance rate. In this work, a probabilistic model is proposed that can be used to estimate these quantities based on a provided operating policy. In this model the system consists of a set of states (e.g., sensors employed) and conditions encountered (e.g., alarm locations). The predictive accuracy of the model is evaluated using a collection of collocated FLIR and GPR data and the results indicate that the model is effective at predicting the desired system metrics.
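The kind of prediction such a model makes can be sketched with a toy two-state simulation. The sensor roles follow the abstract (wide-area FLIR scan, GPR interrogation on alarm), but every rate and probability below is an assumed value, not from the paper:

```python
import random

random.seed(1)

# Illustrative two-state model (all rates are assumptions):
# "scan": FLIR advances at v_scan m/s and raises alarms at r_alarm per meter.
# "interrogate": each alarm costs t_gpr seconds of GPR time; the GPR dismisses
# a fraction p_dismiss of FLIR false alarms.
v_scan = 1.0        # m/s while scanning
r_alarm = 0.05      # FLIR alarms per meter
t_gpr = 30.0        # seconds per GPR interrogation
p_dismiss = 0.9     # fraction of FLIR false alarms the GPR dismisses

def simulate(meters=10_000):
    t, false_alarms, x = 0.0, 0, 0.0
    while x < meters:
        # distance to the next FLIR alarm is exponential with mean 1 / r_alarm
        d = min(random.expovariate(r_alarm), meters - x)
        x += d
        t += d / v_scan
        if x < meters:                  # alarm encountered: switch to GPR
            t += t_gpr
            if random.random() >= p_dismiss:
                false_alarms += 1       # GPR also declares: a system false alarm
    return x / t, false_alarms / meters  # advance rate (m/s), FAR (per meter)

rate, far = simulate()
print(f"advance rate ≈ {rate:.3f} m/s, false-alarm rate ≈ {far:.4f}/m")
```

The model makes the trade-off explicit: raising `p_dismiss` lowers the system false-alarm rate at no cost in speed, while a higher FLIR alarm rate slows the platform because every alarm forces a GPR interrogation.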

  19. Research on Prediction Model of Time Series Based on Fuzzy Theory and Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Xiao-qin, Wu

    Fuzzy theory is one of the newly adduced self-adaptive strategies, applied to dynamically adjust the parameters of genetic algorithms in order to enhance performance. In this paper, financial time series analysis and forecasting serve as the main case study for a soft computing framework that integrates fuzzy theory and genetic algorithms (FGA). A financial time series forecasting model based on fuzzy theory and genetic algorithms was built, with the ShangZheng index as an example. The experimental results show that FGA performs much better than a BP neural network, both in precision and in searching speed. The hybrid algorithm has strong feasibility and superiority.
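As a loose illustration of the FGA coupling (the toy series, AR-style predictor, and single triangular membership below are invented for the sketch, not the paper's model), a fuzzy rule on population diversity can adapt the GA's mutation rate:

```python
import random

random.seed(0)

# Toy "index" series: linear trend plus a period-7 sawtooth.
series = [0.1 * t + 0.5 * (t % 7) for t in range(100)]

def predict(w, hist):          # AR(3)-style predictor
    return sum(wi * h for wi, h in zip(w, hist))

def fitness(w):                # negative squared prediction error over the series
    return -sum((predict(w, series[t-3:t]) - series[t]) ** 2
                for t in range(3, len(series)))

def fuzzy_mutation_rate(pop):
    # crude fuzzy rule: membership of "diversity is LOW" rises as the
    # per-gene spread shrinks, and low diversity calls for more mutation
    spread = max(max(g) - min(g) for g in zip(*pop))
    low = max(0.0, 1.0 - spread / 0.5)
    return 0.05 + 0.25 * low           # mutation rate in [0.05, 0.30]

pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(30)]
for gen in range(60):
    rate = fuzzy_mutation_rate(pop)
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                   # elitist selection: best 10 survive
    pop = elite + [
        [g + (random.gauss(0, 0.1) if random.random() < rate else 0.0)
         for g in random.choice(elite)]
        for _ in range(20)
    ]

best = max(pop, key=fitness)
print("best weights:", [round(g, 2) for g in best])
```

The fuzzy rule here is deliberately crude; the point of the hybrid is the coupling itself, fuzzy output steering GA parameters, rather than any particular rule base.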

  20. The use of density functional theory-based reactivity descriptors in molecular similarity calculations

    NASA Astrophysics Data System (ADS)

    Boon, Greet; De Proft, Frank; Langenaeker, Wilfried; Geerlings, Paul

    1998-10-01

    Molecular similarity is studied via density functional theory-based similarity indices using a numerical integration method. Complementary to the existing similarity indices, we introduce a reactivity-related similarity index based on the local softness. After a study of some test systems, a series of peptide isosteres is studied in view of their importance in pharmacology. The whole of the present work illustrates the importance of the study of molecular similarity based on both shape and reactivity.

  1. Theory of normal and superconducting properties of fullerene-based solids

    SciTech Connect

    Cohen, M.L.

    1992-10-01

    Recent experiments on the normal-state and superconducting properties of fullerene-based solids are used to constrain the proposed theories of the electronic nature of these materials. In general, models of superconductivity based on electron pairing induced by phonons are consistent with electronic band theory. The latter experiments also yield estimates of the parameters characterizing these type II superconductors. It is argued that, at this point, a "standard model" of phonons interacting with itinerant electrons may be a good first approximation for explaining the properties of the metallic fullerenes.

  3. The Scientific Value of Cognitive Load Theory: A Research Agenda Based on the Structuralist View of Theories

    ERIC Educational Resources Information Center

    Gerjets, Peter; Scheiter, Katharina; Cierniak, Gabriele

    2009-01-01

    In this paper, two methodological perspectives are used to elaborate on the value of cognitive load theory (CLT) as a scientific theory. According to the more traditional critical rationalism of Karl Popper, CLT cannot be considered a scientific theory because some of its fundamental assumptions cannot be tested empirically and are thus not…

  4. Power optimization of chemically driven heat engine based on first and second order reaction kinetic theory and probability theory

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Chen, Lingen; Sun, Fengrui

    2016-03-01

    The finite-time thermodynamic method based on probability analysis can more accurately describe various performance parameters of thermodynamic systems. Starting from the relation between optimal efficiency and power output of a generalized Carnot heat engine with a finite high-temperature heat reservoir (heat source), an infinite low-temperature heat reservoir (heat sink), and heat transfer as the only irreversibility, this paper studies the power optimization of a chemically driven heat engine based on first- and second-order reaction kinetic theory. It puts forward a model of a coupled heat engine that can be run periodically and obtains the effects of the finite-time thermodynamic characteristics of the coupling between chemical reaction and heat engine on the power optimization. The results show that the first-order reaction kinetics model can use fuel more effectively and can provide the heat engine with a higher-temperature heat source, increasing its power output. Moreover, the power fluctuation bounds of the chemically driven heat engine are obtained by using the probability analysis method. The results may provide some guidelines for the character analysis and power optimization of chemically driven heat engines.
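The flavor of finite-time power optimization can be shown with the classical endoreversible (Novikov/Curzon-Ahlborn) case, a simpler stand-in for the paper's chemically driven model: with a finite hot-side heat conductance, maximum power occurs below the Carnot efficiency, at eta* = 1 - sqrt(Tc/Th). A numeric check of that textbook result:

```python
import math

# Endoreversible sketch (assumed units): heat flows into an internally
# reversible engine through a finite conductance K, so the working fluid's
# hot-side temperature Tc/(1 - eta) sits below the reservoir temperature Th.
Th, Tc, K = 600.0, 300.0, 1.0   # reservoir temperatures (K), conductance

def power(eta):
    q_in = K * (Th - Tc / (1.0 - eta))   # heat flux at efficiency eta
    return eta * q_in

# coarse numeric maximization over admissible efficiencies (0, 1 - Tc/Th)
etas = [i / 10_000 for i in range(1, int(10_000 * (1 - Tc / Th)))]
eta_star = max(etas, key=power)

print(round(eta_star, 4), round(1 - math.sqrt(Tc / Th), 4))   # → 0.2929 0.2929
```

The grid maximum matches the closed form 1 - sqrt(Tc/Th), well below the Carnot limit 1 - Tc/Th = 0.5: operating at maximum power, not maximum efficiency, is the hallmark of finite-time analyses like the one in this abstract.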

  5. Studying thin film damping in a micro-beam resonator based on non-classical theories

    NASA Astrophysics Data System (ADS)

    Ghanbari, Mina; Hossainpour, Siamak; Rezazadeh, Ghader

    2015-09-01

    In this paper, a mathematical model is presented for studying thin film damping of the surrounding fluid in an in-plane oscillating micro-beam resonator. The proposed model for this study is made up of a clamped-clamped micro-beam bound between two fixed layers. The micro-gap between the micro-beam and fixed layers is filled with air. As classical theories are not properly capable of predicting the size dependence behaviors of the micro-beam, and also behavior of micro-scale fluid media, hence in the presented model, equation of motion governing longitudinal displacement of the micro-beam has been extracted based on non-local elasticity theory. Furthermore, the fluid field has been modeled based on micro-polar theory. These coupled equations have been simplified using Newton-Laplace and continuity equations. After transforming to non-dimensional form and linearizing, the equations have been discretized and solved simultaneously using a Galerkin-based reduced order model. Considering slip boundary conditions and applying a complex frequency approach, the equivalent damping ratio and quality factor of the micro-beam resonator have been obtained. The obtained values for the quality factor have been compared to those based on classical theories. We have shown that applying non-classical theories underestimate the values of the quality factor obtained based on classical theories. The effects of geometrical parameters of the micro-beam and micro-scale fluid field on the quality factor of the resonator have also been investigated.

  6. An anisotropic constitutive equation for the stress tensor of blood based on mixture theory

    SciTech Connect

    Massoudi, M.; Antaki, J.

    2008-01-01

    Based on ideas proposed by Massoudi and Rajagopal (M-R), we develop a model for blood using the theory of interacting continua, that is, the mixture theory. We first provide a brief review of mixture theory, and then discuss certain issues in constitutive modeling of a two-component mixture. In the present formulation, we ignore the biochemistry of blood and assume that blood is composed of red blood cells (RBCs) suspended in plasma, where the plasma behaves as a linearly viscous fluid and the RBCs are modeled as an anisotropic nonlinear density-gradient-type fluid. We obtain a constitutive relation for blood, based on the simplified constitutive relations derived for plasma and RBCs. A simple shear flow is discussed, and an exact solution is obtained for a very special case; for more general cases, it is necessary to solve the nonlinear coupled equations numerically.

  8. Invited commentary: Agent-based models for causal inference—reweighting data and theory in epidemiology.

    PubMed

    Hernán, Miguel A

    2015-01-15

    The relative weights of empirical facts (data) and assumptions (theory) in causal inference vary across disciplines. Typically, disciplines that ask more complex questions tend to better tolerate a greater role of theory and modeling in causal inference. As epidemiologists move toward increasingly complex questions, Marshall and Galea (Am J Epidemiol. 2015;181(2):92-99) support a reweighting of data and theory in epidemiologic research via the use of agent-based modeling. The parametric g-formula can be viewed as an intermediate step between traditional epidemiologic methods and agent-based modeling and therefore is a method that can ease the transition toward epidemiologic methods that rely heavily on modeling. PMID:25480820

  9. Mixture theory-based poroelasticity as a model of interstitial tissue growth

    PubMed Central

    Cowin, Stephen C.; Cardoso, Luis

    2011-01-01

    This contribution presents an alternative approach to mixture theory-based poroelasticity by transferring some poroelastic concepts developed by Maurice Biot to mixture theory. These concepts are a larger RVE and the subRVE-RVE velocity average tensor, which Biot called the micro-macro velocity average tensor. This velocity average tensor is assumed here to depend upon the pore structure fabric. The formulation of mixture theory presented is directed toward the modeling of interstitial growth, that is to say changing mass and changing density of an organism. Traditional mixture theory considers constituents to be open systems, but the entire mixture is a closed system. In this development the mixture is also considered to be an open system as an alternative method of modeling growth. Growth is slow and accelerations are neglected in the applications. The velocity of a solid constituent is employed as the main reference velocity in preference to the mean velocity concept from the original formulation of mixture theory. The standard development of statements of the conservation principles and entropy inequality employed in mixture theory are modified to account for these kinematic changes and to allow for supplies of mass, momentum and energy to each constituent and to the mixture as a whole. The objective is to establish a basis for the development of constitutive equations for growth of tissues. PMID:22184481

  10. Transdiagnostic Theory and Application of Family-Based Treatment for Youth with Eating Disorders

    ERIC Educational Resources Information Center

    Loeb, Katharine L.; Lock, James; Greif, Rebecca; le Grange, Daniel

    2012-01-01

    This paper describes the transdiagnostic theory and application of family-based treatment (FBT) for children and adolescents with eating disorders. We review the fundamentals of FBT, a transdiagnostic theoretical model of FBT and the literature supporting its clinical application, adaptations across developmental stages and the diagnostic spectrum…

  11. Poverty Lines Based on Fuzzy Sets Theory and Its Application to Malaysian Data

    ERIC Educational Resources Information Center

    Abdullah, Lazim

    2011-01-01

    Defining the poverty line has been acknowledged as being highly variable by the majority of published literature. Despite long discussions and successes, poverty line has a number of problems due to its arbitrary nature. This paper proposes three measurements of poverty lines using membership functions based on fuzzy set theory. The three…

  12. Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Oshima, T.C.

    2005-01-01

    Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…
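The classical Spearman-Brown formula the abstract refers to is standard and easy to state (the IRT-based generalizations are in the cited paper itself): for a test whose length is multiplied by a factor k, the predicted reliability is k*rho / (1 + (k - 1)*rho).

```python
def spearman_brown(rho, k):
    """Predicted reliability when test length is multiplied by k,
    given current reliability rho (classical prophecy formula)."""
    return k * rho / (1.0 + (k - 1.0) * rho)

# Doubling a test with reliability .70:
print(round(spearman_brown(0.70, 2), 3))   # → 0.824
# Halving it instead:
print(round(spearman_brown(0.70, 0.5), 3))
```

The abstract's point is that one of the two new IRT-based formulas reduces to exactly this classical expression.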

  13. Examining Instruction in MIDI-Based Composition through a Critical Theory Lens

    ERIC Educational Resources Information Center

    Louth, Paul

    2013-01-01

    This paper considers the issue of computer-assisted composition in formal music education settings from the perspective of critical theory. The author examines the case of MIDI-based software applications and suggests that the greatest danger from the standpoint of ideology critique is not the potential for circumventing a traditional…

  17. Nuclear Forces and Few-Nucleon Studies Based on Chiral Perturbation Theory

    SciTech Connect

    W. Gloeckle; E. Epelbaum; U.G. Meissner; A. Nogga; H. Kamada; H. Witala

    2003-11-01

    After a brief review of the status of few-nucleon studies based on conventional nuclear forces, we sketch the concepts of the effective field theory approach constrained by chiral symmetry and its application to nuclear forces. Then first results for few-nucleon observables are discussed.

  18. Operationalizing Levels of Academic Mastery Based on Vygotsky's Theory: The Study of Mathematical Knowledge

    ERIC Educational Resources Information Center

    Nezhnov, Peter; Kardanova, Elena; Vasilyeva, Marina; Ludlow, Larry

    2015-01-01

    The present study tested the possibility of operationalizing levels of knowledge acquisition based on Vygotsky's theory of cognitive growth. An assessment tool (SAM-Math) was developed to capture a hypothesized hierarchical structure of mathematical knowledge consisting of procedural, conceptual, and functional levels. In Study 1, SAM-Math was…

  19. Theory and Utility-Key Themes in Evidence-Based Assessment: Comment on the Special Section

    ERIC Educational Resources Information Center

    McFall, Richard M.

    2005-01-01

    This article focuses on two key themes in the four featured reviews on evidence-based assessment. The first theme is the essential role of theory in psychological assessment. An overview of this complex, multilayered role is presented. The second theme is the need for a common metric with which to gauge the utility of specific psychological tests…

  20. Imitation dynamics of vaccine decision-making behaviours based on the game theory.

    PubMed

    Yang, Junyuan; Martcheva, Maia; Chen, Yuming

    2016-01-01

    Based on game theory, we propose an age-structured model to investigate the imitation dynamics of vaccine uptake. We first obtain the existence and local stability of equilibria. We show that Hopf bifurcation can occur. We also establish the global stability of the boundary equilibria and persistence of the disease. The theoretical results are supported by numerical simulations. PMID:26536171
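A minimal sketch of imitation dynamics for vaccine uptake (an unstructured, Bauch-style toy model, not the paper's age-structured system; all parameter values below are assumptions): uptake x grows when the perceived infection risk outweighs the perceived vaccine risk, and shrinks otherwise.

```python
# SIR with a vaccinated fraction x of births, coupled to imitation dynamics
# dx/dt = kappa * x * (1 - x) * (r_inf * I - r_vac): individuals copy whichever
# strategy (vaccinate or not) currently carries the smaller perceived risk.
beta, gamma, mu = 0.5, 0.1, 0.0005      # transmission, recovery, birth/death (per day)
r_vac, r_inf, kappa = 0.002, 1.0, 1.0   # perceived vaccine risk, infection risk, imitation rate

S, I, x = 0.2, 0.001, 0.5               # susceptibles, infectives, vaccine uptake
dt, steps = 0.1, 100_000                # forward-Euler integration over 10,000 days
for _ in range(steps):
    dS = mu * (1 - x) - beta * S * I - mu * S
    dI = beta * S * I - gamma * I - mu * I
    dx = kappa * x * (1 - x) * (r_inf * I - r_vac)
    S += dt * dS
    I += dt * dI
    x += dt * dx

print(f"uptake x ≈ {x:.3f}, prevalence I ≈ {I:.5f}")
```

At the interior equilibrium of such a model, dx/dt = 0 forces I* = r_vac / r_inf: uptake self-tunes until prevalence makes both strategies equally attractive, which is why purely voluntary vaccination tends not to eliminate the disease.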

  1. Preparing Students for Education, Work, and Community: Activity Theory in Task-Based Curriculum Design

    ERIC Educational Resources Information Center

    Campbell, Chris; MacPherson, Seonaigh; Sawkins, Tanis

    2014-01-01

    This case study describes how sociocultural and activity theory were applied in the design of a publicly funded, Canadian Language Benchmark (CLB)-based English as a Second Language (ESL) credential program and curriculum for immigrant and international students in postsecondary institutions in British Columbia, Canada. The ESL Pathways Project…

  2. Science Teaching Based on Cognitive Load Theory: Engaged Students, but Cognitive Deficiencies

    ERIC Educational Resources Information Center

    Meissner, Barbara; Bogner, Franz X.

    2012-01-01

    To improve science learning under demanding conditions, we designed an out-of-school lesson in compliance with cognitive load theory (CLT). We extracted student clusters based on individual effectiveness, and compared instructional efficiency, mental effort, and persistence of learning. The present study analyses students' engagement. 50.0% of our…

  3. Applications of Cognitive Load Theory to Multimedia-Based Foreign Language Learning: An Overview

    ERIC Educational Resources Information Center

    Chen, I-Jung; Chang, Chi-Cheng; Lee, Yen-Chang

    2009-01-01

    This article reviews the multimedia instructional design literature based on cognitive load theory (CLT) in the context of foreign language learning. Multimedia are of particular importance in language learning materials because they incorporate text, image, and sound, thus offering an integrated learning experience of the four language skills…

  4. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…

  5. A Theory-Based Approach to Reading Assessment in the Army. Technical Report 625.

    ERIC Educational Resources Information Center

    Oxford-Carpenter, Rebecca L.; Schultz-Shiner, Linda J.

    Noting that the United States Army Research Institute for the Behavioral and Social Sciences (ARI) has been involved in research on reading assessment in the Army from both practical and theoretical perspectives, this paper addresses practical Army problems in reading assessment from a theory base that reflects the most recent and most sound…

  6. From Theory to Practice: Concept-Based Inquiry in a High School Art Classroom

    ERIC Educational Resources Information Center

    Walker, Margaret A.

    2014-01-01

    This study examines what an emerging educational theory looks like when put into practice in an art classroom. It explores the teaching methodology of a high school art teacher who has utilized concept-based inquiry in the classroom to engage his students in artmaking and analyzes the influence this methodology has had on his adolescent students.…

  7. Effects of a Theory-Based, Peer-Focused Drug Education Course.

    ERIC Educational Resources Information Center

    Gonzalez, Gerardo M.

    1990-01-01

    Describes innovative, theory-based, peer-focused college drug education academic course and its effect on perceived levels of risk associated with the use of alcohol, marijuana, and cocaine. Evaluation of the effects of the course indicated the significant effect on perceived risk of cocaine, but not alcohol or marijuana. (Author/ABL)

  8. Application of Online Multimedia Courseware in College English Teaching Based on Constructivism Theory

    ERIC Educational Resources Information Center

    Li, Zhenying

    2012-01-01

    Based on Constructivism Theory, this paper investigates the application of online multimedia courseware to college English teaching. Through experiments and student feedback, experience has been accumulated, problems have been identified, and certain insights have been gained in English teaching practice, which pave the…

  9. Interpretation-Based Processing: A Unified Theory of Semantic Sentence Comprehension

    ERIC Educational Resources Information Center

    Budiu, Raluca; Anderson, John R.

    2004-01-01

    We present interpretation-based processing--a theory of sentence processing that builds a syntactic and a semantic representation for a sentence and assigns an interpretation to the sentence as soon as possible. That interpretation can further participate in comprehension and in lexical processing and is vital for relating the sentence to the…

  10. Revisiting Transactional Distance Theory in a Context of Web-Based High-School Distance Education

    ERIC Educational Resources Information Center

    Murphy, Elizabeth Anne; Rodriguez-Manzanares, Maria Angeles

    2008-01-01

    The purpose of this paper is to report on a study that provided an opportunity to consider Transactional Distance Theory (TDT) in a current technology context: web-based learning in distance education (DE) high-school classrooms. Data collection relied on semi-structured interviews conducted with 22 e-teachers and managers in Newfoundland and…

  11. Applying Unidimensional and Multidimensional Item Response Theory Models in Testlet-Based Reading Assessment

    ERIC Educational Resources Information Center

    Min, Shangchao; He, Lianzhen

    2014-01-01

    This study examined the relative effectiveness of the multidimensional bi-factor model and multidimensional testlet response theory (TRT) model in accommodating local dependence in testlet-based reading assessment with both dichotomously and polytomously scored items. The data used were 14,089 test-takers' item-level responses to the…

  12. Evidence-Based Practice in Kinesiology: The Theory to Practice Gap Revisited

    ERIC Educational Resources Information Center

    Knudson, Duane

    2005-01-01

    As evidence-based practice sweeps the applied health professions, it is a good time to evaluate the generation of knowledge in Kinesiology and its transmission to professionals and the public. Knowledge transmission has been debated in the past from the perspectives of the theory-to-practice gap and the discipline versus profession emphasis.…

  13. A Practice-Based Theory of Professional Education: Teach For America's Professional Development Model

    ERIC Educational Resources Information Center

    Gabriel, Rachael

    2011-01-01

    In 1999, Ball and Cohen proposed a practice-based theory of professional education, which would end inadequate professional development efforts with a more comprehensive approach. Their work has been referenced over the past decade, yet there have been limited attempts to actualize their ideals and research their implications. In this article, I…

  14. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    ERIC Educational Resources Information Center

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  15. Effects of Guided Writing Strategies on Students' Writing Attitudes Based on Media Richness Theory

    ERIC Educational Resources Information Center

    Lan, Yu-Feng; Hung, Chun-Ling; Hsu, Hung-Ju

    2011-01-01

    The purpose of this paper is to develop different guided writing strategies based on media richness theory and further evaluate the effects of these writing strategies on younger students' writing attitudes in terms of motivation, enjoyment and anxiety. A total of 66 sixth-grade elementary students with an average age of twelve were invited to…

  16. Predicting Study Abroad Intentions Based on the Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Schnusenberg, Oliver; de Jong, Pieter; Goel, Lakshmi

    2012-01-01

    The emphasis on study abroad programs is growing in the academic context as U.S.-based universities seek to incorporate a global perspective in education. Using a model that has underpinnings in the theory of planned behavior (TPB), we predict students' intention to participate in a short-term study abroad program. We use TPB to identify behavioral,…

  17. Theory Presentation and Assessment in a Problem-Based Learning Group.

    ERIC Educational Resources Information Center

    Glenn, Phillip J.; Koschmann, Timothy; Conlee, Melinda

    A study used conversational analysis to examine the reasoning students use in a Problem-Based Learning (PBL) environment as they formulate a theory (in medical contexts, a diagnosis) which accounts for evidence (medical history and symptoms). A videotaped group interaction was analyzed and transcribed. In the segment of interaction examined, the…

  18. Portuguese Public University Student Satisfaction: A Stakeholder Theory-Based Approach

    ERIC Educational Resources Information Center

    Mainardes, Emerson; Alves, Helena; Raposo, Mario

    2013-01-01

    In accordance with the importance of the student stakeholder to universities, the objective of this research project was to evaluate student satisfaction at Portuguese public universities as regards their self-expressed core expectations. The research was based both on stakeholder theory itself and on previous studies of university stakeholders.…

  19. Using Emergence Theory-Based Curriculum to Teach Compromise Skills to Students with Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Fein, Lance; Jones, Don

    2015-01-01

    This study addresses the compromise skills that are taught to students diagnosed with autistic spectrum disorders (ASD) and related social and communication deficits. A private school in the southeastern United States implemented an emergence theory-based curriculum to address these skills, yet no formal analysis was conducted to determine its…

  20. Theory-Based Development and Testing of an Adolescent Tobacco-Use Awareness Program.

    ERIC Educational Resources Information Center

    Smith, Dennis W.; Colwell, Brian; Zhang, James J.; Brimer, Jennifer; McMillan, Catherine; Stevens, Stacey

    2002-01-01

    The Adolescent Tobacco Use Awareness and Cessation Program trial, based on social cognitive theory and the transtheoretical model, was designed to develop, evaluate, and disseminate effective cessation programming related to Texas legislation. Data from participants and site facilitators indicated that significantly more participants were in the…

  1. Investigating Acceptance toward Mobile Learning to Assist Individual Knowledge Management: Based on Activity Theory Approach

    ERIC Educational Resources Information Center

    Liaw, Shu-Sheng; Hatala, Marek; Huang, Hsiu-Mei

    2010-01-01

    Mobile devices can facilitate human interaction and access to knowledge resources anytime and anywhere. Given the wide application possibilities of mobile learning, investigating learners' acceptance of it is an essential issue. Based on an activity theory approach, this research explores positive factors for the acceptance of m-learning…

  2. Transdiagnostic Theory and Application of Family-Based Treatment for Youth with Eating Disorders

    ERIC Educational Resources Information Center

    Loeb, Katharine L.; Lock, James; Greif, Rebecca; le Grange, Daniel

    2012-01-01

    This paper describes the transdiagnostic theory and application of family-based treatment (FBT) for children and adolescents with eating disorders. We review the fundamentals of FBT, a transdiagnostic theoretical model of FBT and the literature supporting its clinical application, adaptations across developmental stages and the diagnostic spectrum

  4. On Robustness of the Normal-Theory Based Asymptotic Distributions of Three Reliability Coefficient Estimates.

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Bentler, Peter M.

    2002-01-01

    Examined the asymptotic distributions of three reliability coefficient estimates: (1) sample coefficient alpha; (2) reliability estimate of a composite score following factor analysis; and (3) maximal reliability of a linear combination of item scores after factor analysis. Findings show that normal theory based asymptotic distributions for these…
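
    The first of the three estimates, sample coefficient alpha, is simple to compute directly. The sketch below is an illustrative calculation (not the authors' code); the toy score matrix is invented for the example:

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Sample coefficient alpha for an (n_persons, k_items) score matrix."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)      # per-item sample variances
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of the composite score
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Toy data: 5 respondents, 3 items
    X = [[2, 3, 3],
         [4, 4, 5],
         [1, 2, 2],
         [3, 3, 4],
         [5, 5, 5]]
    print(round(cronbach_alpha(X), 3))
    ```

    The asymptotic distribution results in the paper concern how such sample estimates vary around the population value under (non)normality.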

  6. A Comparison of Measurement Equivalence Methods Based on Confirmatory Factor Analysis and Item Response Theory.

    ERIC Educational Resources Information Center

    Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.

    Current interest in the assessment of measurement equivalence emphasizes two methods of analysis: linear and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis or CFA) and nonlinear (item-response-theory-based differential item function or IRT-Based…
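
    As a hedged illustration of the graded response model used to simulate the data (the item-scoring model only, not the authors' DIF procedure), category probabilities follow from differences of 2PL boundary curves:

    ```python
    import numpy as np

    def grm_category_probs(theta, a, b):
        """Category probabilities under Samejima's graded response model.
        a: discrimination; b: increasing thresholds for K+1 ordered categories."""
        b = np.asarray(b, dtype=float)
        # Boundary curves P(X >= k | theta) for k = 1..K
        p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        upper = np.concatenate(([1.0], p_star))   # P(X >= 0) = 1
        lower = np.concatenate((p_star, [0.0]))   # P(X >= K+1) = 0
        return upper - lower                      # P(X = k), k = 0..K

    # Illustrative item: discrimination 1.2, thresholds -1, 0, 1
    probs = grm_category_probs(theta=0.0, a=1.2, b=[-1.0, 0.0, 1.0])
    print(probs, probs.sum())   # four category probabilities summing to 1
    ```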

  7. Critically Evaluating Competing Theories: An Exercise Based on the Kitty Genovese Murder

    ERIC Educational Resources Information Center

    Sagarin, Brad J.; Lawler-Sagarin, Kimberly A.

    2005-01-01

    We describe an exercise based on the 1964 murder of Catherine Genovese--a murder observed by 38 witnesses, none of whom called the police. Students read a summary of the murder and worked in small groups to design an experiment to test the competing theories for the inaction of the witnesses (Americans' selfishness and insensitivity vs. diffusion…

  8. Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model

    ERIC Educational Resources Information Center

    de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.

    2011-01-01

    Background: The Netherlands are lacking reliable empirical data in relation to the development of birth and population prevalence of Down syndrome. For the UK and Ireland there are more historical empirical data available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is

  9. Theory-Based Interactive Mathematics Instruction: Development and Validation of Computer-Video Modules.

    ERIC Educational Resources Information Center

    Henderson, Ronald W.; And Others

    Theory-based prototype computer-video instructional modules were developed to serve as an instructional supplement for students experiencing difficulty in learning mathematics, with special consideration given to students underrepresented in mathematics (particularly women and minorities). Modules focused on concepts and operations for factors,…

  10. Poverty Lines Based on Fuzzy Sets Theory and Its Application to Malaysian Data

    ERIC Educational Resources Information Center

    Abdullah, Lazim

    2011-01-01

    It is widely acknowledged in the published literature that definitions of the poverty line are highly variable. Despite long discussions and successes, the poverty line has a number of problems due to its arbitrary nature. This paper proposes three measurements of poverty lines using membership functions based on fuzzy set theory. The three
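
    A generic trapezoidal membership function (illustrative only; the paper's three specific membership functions are not reproduced here) shows how a fuzzy poverty line grades income rather than imposing a crisp cut-off:

    ```python
    # Degree of poverty in [0, 1]: 1 below z_lower, 0 above z_upper,
    # and a linear transition in between (thresholds are hypothetical).
    def poverty_membership(income, z_lower, z_upper):
        if income <= z_lower:
            return 1.0
        if income >= z_upper:
            return 0.0
        return (z_upper - income) / (z_upper - z_lower)

    # Households between the two thresholds are 'partially poor'
    for inc in [300, 600, 900]:
        print(inc, poverty_membership(inc, z_lower=400, z_upper=800))
    ```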

  11. English Textbooks Based on Research and Theory--A Possible Dream.

    ERIC Educational Resources Information Center

    Suhor, Charles

    1984-01-01

    Research based text materials will probably never dominate the textbook market. To begin with, translating theory and research into practice is a chancy business. There are also creative problems such as the inherent oversimplification involved in textbook writing. Every textbook writer who has been a classroom teacher will acknowledge that such…

  13. A method for calculating strain energy release rate based on beam theory

    NASA Technical Reports Server (NTRS)

    Sun, C. T.; Pandey, R. K.

    1993-01-01

    The Timoshenko beam theory was used to model cracked beams and to calculate the total strain energy release rate. The root rotations of the beam segments at the crack tip were estimated based on an approximate 2D elasticity solution. By including the strain energy released due to the root rotations of the beams during crack extension, the strain energy release rate obtained using beam theory agrees very well with the 2D finite element solution. Numerical examples were given for various beam geometries and loading conditions. Comparisons with existing beam models were also given.
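
    For context, the classical Euler-Bernoulli compliance calculation for a symmetric double cantilever beam (a simpler baseline than the paper's Timoshenko model with root-rotation corrections) can be sketched as follows; the geometry and load values are illustrative only:

    ```python
    # Irwin-Kies relation G = (P^2 / 2b) * dC/da, with DCB compliance
    # C = 2 a^3 / (3 E I) and arm second moment I = b h^3 / 12.
    def dcb_serr(P, a, E, b, h):
        I = b * h**3 / 12.0             # second moment of area of one arm
        dCda = 2.0 * a**2 / (E * I)     # derivative of compliance wrt crack length
        return P**2 / (2.0 * b) * dCda  # strain energy release rate G

    # Cross-check against the closed form G = 12 P^2 a^2 / (E b^2 h^3)
    P, a, E, b, h = 100.0, 0.05, 70e9, 0.025, 0.003
    G = dcb_serr(P, a, E, b, h)
    G_closed = 12 * P**2 * a**2 / (E * b**2 * h**3)
    print(G, G_closed)
    ```

    The paper's contribution is precisely the correction this baseline omits: the crack-tip root rotations add a term that classical beam theory misses.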

  14. Learning control system design based on 2-D theory - An application to parallel link manipulator

    NASA Technical Reports Server (NTRS)

    Geng, Z.; Carroll, R. L.; Lee, J. D.; Haynes, L. H.

    1990-01-01

    An approach to iterative learning control system design based on two-dimensional system theory is presented. A two-dimensional model for the iterative learning control system which reveals the connections between learning control systems and two-dimensional system theory is established. A learning control algorithm is proposed, and the convergence of learning using this algorithm is guaranteed by two-dimensional stability. The learning algorithm is applied successfully to the trajectory tracking control problem for a parallel link robot manipulator. The excellent performance of this learning algorithm is demonstrated by the computer simulation results.
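
    The flavor of iterative learning control can be sketched with a minimal P-type update on a scalar plant (illustrative only; the paper's 2-D-theory-based algorithm and manipulator model are not reproduced here):

    ```python
    import numpy as np

    # Plant: x[t+1] = 0.8 x[t] + u[t], output y = x.
    # Learning law across trials k: u_{k+1}(t) = u_k(t) + gamma * e_k(t+1).
    a_p, b_p, gamma, T = 0.8, 1.0, 1.0, 20
    y_ref = np.sin(np.linspace(0.0, np.pi, T + 1))  # desired output trajectory

    def run_trial(u):
        x = np.zeros(T + 1)
        for t in range(T):
            x[t + 1] = a_p * x[t] + b_p * u[t]
        return x

    u = np.zeros(T)
    errors = []
    for k in range(25):                    # learning iterations (trials)
        e = y_ref - run_trial(u)
        errors.append(np.abs(e[1:]).max())
        u = u + gamma * e[1:]              # shifted error corrects u(t)
    print(errors[0], errors[-1])           # tracking error contracts across trials
    ```

    Convergence of this update requires |1 - gamma*b| < 1; with gamma*b = 1, as here, the finite-horizon error is driven to (numerical) zero in at most T trials.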

  15. Improved method for calculating strain energy release rate based on beam theory

    NASA Technical Reports Server (NTRS)

    Sun, C. T.; Pandey, R. K.

    1994-01-01

    The Timoshenko beam theory was used to model cracked beams and to calculate the total strain-energy release rate. The root rotations of the beam segments at the crack tip were estimated based on an approximate two-dimensional elasticity solution. By including the strain energy released due to the root rotations of the beams during crack extension, the strain-energy release rate obtained using beam theory agrees very well with the two-dimensional finite element solution. Numerical examples were given for various beam geometries and loading conditions. Comparisons with existing beam models were also given.

  16. Hearing the voice of nurses in caring theory-based practice.

    PubMed

    Dyess, Susan MacLeod; Boykin, Anne; Bulfin, Mary Jo

    2013-04-01

    The authors in this paper describe the process and findings of a participatory action research project between a college of nursing and a for-profit acute healthcare organization as practice environment transformation occurred, grounded in caring theory. The participatory action research process and findings emphasize the importance of the intention to know what matters and the required time, courage, and commitment necessary to actualize practice environments that support nursing. Implications show efforts to develop and sustain theory-based practice environments that enable the full expression of nursing and a way of being that honors and celebrates the uniqueness of nurses. PMID:23575494

  17. Intervention mapping protocol for developing a theory-based diabetes self-management education program.

    PubMed

    Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin

    2015-01-01

    Development of behavior theory-based health promotion programs is encouraged by the paradigm shift from content to behavior outcomes. This article describes the development process of a diabetes self-management program for older Koreans (DSME-OK) using the intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied the information-motivation-behavioral skills (IMB) model, and interventions targeted three of its determinants to change health behaviors. Specific methods were selected to achieve each objective, guided by the IM protocol. As the final step, program evaluation, including a pilot test, was planned. The DSME-OK was structured so that the three determinants of the IMB model were addressed to achieve the behavior objectives in each session. The program comprises 12 weekly 90-min sessions tailored for older adults. Using the IM protocol to develop a theory-based self-management program was beneficial in providing a systematic guide to developing theory-based, behavior-outcome-focused health education programs. PMID:26062288

  18. What is an adequate sample size? Operationalising data saturation for theory-based interview studies.

    PubMed

    Francis, Jill J; Johnston, Marie; Robertson, Clare; Glidewell, Liz; Entwistle, Vikki; Eccles, Martin P; Grimshaw, Jeremy M

    2010-12-01

    In interview studies, sample size is often justified by interviewing participants until reaching 'data saturation'. However, there is no agreed method of establishing this. We propose principles for deciding saturation in theory-based interview studies (where conceptual categories are pre-established by existing theory). First, specify a minimum sample size for initial analysis (initial analysis sample). Second, specify how many more interviews will be conducted without new ideas emerging (stopping criterion). We demonstrate these principles in two studies, based on the theory of planned behaviour, designed to identify three belief categories (Behavioural, Normative and Control), using an initial analysis sample of 10 and stopping criterion of 3. Study 1 (retrospective analysis of existing data) identified 84 shared beliefs of 14 general medical practitioners about managing patients with sore throat without prescribing antibiotics. The criterion for saturation was achieved for Normative beliefs but not for other beliefs or studywise saturation. In Study 2 (prospective analysis), 17 relatives of people with Paget's disease of the bone reported 44 shared beliefs about taking genetic testing. Studywise data saturation was achieved at interview 17. We propose specification of these principles for reporting data saturation in theory-based interview studies. The principles may be adaptable for other types of studies. PMID:20204937
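
    The two principles above can be sketched as a small stopping-rule helper (hypothetical code, not the authors'): analyse an initial sample, then stop once a fixed number of consecutive interviews yield no new ideas.

    ```python
    # new_ideas_per_interview: count of newly emerging ideas in each interview.
    # Saturation is declared after `stopping_criterion` consecutive interviews
    # beyond the initial analysis sample contribute nothing new.
    def saturation_point(new_ideas_per_interview, initial_sample=10, stopping_criterion=3):
        """Return the 1-based interview index at which saturation is declared,
        or None if the criterion is never met."""
        run = 0                                  # consecutive interviews with no new ideas
        for i, n_new in enumerate(new_ideas_per_interview, start=1):
            if i <= initial_sample:
                continue                         # the initial sample is always analysed
            run = run + 1 if n_new == 0 else 0
            if run >= stopping_criterion:
                return i
        return None

    # Interviews 11-13 yield no new ideas -> saturation declared at interview 13
    print(saturation_point([4, 3, 2, 2, 1, 1, 1, 0, 1, 0, 0, 0, 0, 1]))
    ```

    The defaults mirror the paper's demonstration values (initial analysis sample of 10, stopping criterion of 3); how ideas are coded as "new" is, of course, the substantive analytic work.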

  19. An automated integration-free path-integral method based on Kleinert's variational perturbation theory

    NASA Astrophysics Data System (ADS)

    Wong, Kin-Yiu; Gao, Jiali

    2007-12-01

    Based on Kleinert's variational perturbation (KP) theory [Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets, 3rd ed. (World Scientific, Singapore, 2004)], we present an analytic path-integral approach for computing the effective centroid potential. The approach enables the KP theory to be applied to any realistic systems beyond the first-order perturbation (i.e., the original Feynman-Kleinert [Phys. Rev. A 34, 5080 (1986)] variational method). Accurate values are obtained for several systems in which exact quantum results are known. Furthermore, the computed kinetic isotope effects for a series of proton transfer reactions, in which the potential energy surfaces are evaluated by density-functional theory, are in good accordance with experiments. We hope that our method could be used by non-path-integral experts or experimentalists as a "black box" for any given system.

  20. Experimental investigation and kinetic-theory-based model of a rapid granular shear flow

    NASA Astrophysics Data System (ADS)

    Wildman, R. D.; Martin, T. W.; Huntley, J. M.; Jenkins, J. T.; Viswanathan, H.; Fen, X.; Parker, D. J.

    An experimental investigation of an idealized rapidly sheared granular flow was performed to test the predictions of a model based on the kinetic theory of dry granular media. Glass ballotini beads were placed in an annular shear cell and the lower boundary rotated to induce a shearing motion in the bed. A single particle was tracked using the positron emission particle tracking (PEPT) technique, a method that determines the location of a particle through the triangulation of gamma photons emitted by a radioactive tracer particle. The packing fraction and velocity fields within the three-dimensional flow were measured and compared to the predictions of a model developed using the conservation and balance equations applicable to dissipative systems, and solved incorporating constitutive relations derived from kinetic theory. The comparison showed that kinetic theory is able to capture the general features of a rapid shear flow reasonably well over a wide range of shear rates and confining pressures.

  1. A comparison of design variables for control theory based airfoil optimization

    NASA Technical Reports Server (NTRS)

    Reuther, James; Jameson, Antony

    1995-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work in the area it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using either the potential flow or the Euler equations with either a conformal mapping or a general coordinate system. We have also explored three-dimensional extensions of these formulations recently. The goal of our present work is to demonstrate the versatility of the control theory approach by designing airfoils using both Hicks-Henne functions and B-spline control points as design variables. The research also demonstrates that the parameterization of the design space is an open question in aerodynamic design.
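
    The Hicks-Henne "sine bump" functions used as design variables have a standard closed form: b(x) = A sin^w(pi x^m) with m = ln(0.5)/ln(x_peak), so each bump peaks at x_peak. A sketch, with illustrative peak locations and amplitudes:

    ```python
    import math

    def hicks_henne_bump(x, amplitude, x_peak, width=3.0):
        """Hicks-Henne sine bump on x in [0, 1], peaking at x_peak."""
        if x <= 0.0 or x >= 1.0:
            return 0.0                      # bumps vanish at leading/trailing edge
        m = math.log(0.5) / math.log(x_peak)  # maps x_peak -> 0.5 inside the sine
        return amplitude * math.sin(math.pi * x ** m) ** width

    # Perturb a baseline surface with a sum of bumps; the amplitudes are the
    # design variables in this parameterization (values here are made up).
    peaks = [0.2, 0.5, 0.8]
    amps = [0.01, -0.005, 0.008]
    def dz(x):
        return sum(hicks_henne_bump(x, A, p) for A, p in zip(amps, peaks))
    print(dz(0.5))
    ```

    B-spline control points, the paper's alternative parameterization, trade these fixed analytic shapes for locally supported basis functions.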

  2. Robust Stabilization Control Based on Guardian Maps Theory for a Longitudinal Model of Hypersonic Vehicle

    PubMed Central

    Liu, Yanbin; Liu, Mengying; Sun, Peihua

    2014-01-01

    A typical model of a hypersonic vehicle has complicated dynamics such as unstable states, nonminimum phases, and strong input-output coupling. As a result, designing a robust stabilization controller is essential to implementing the anticipated tasks. This paper presents a robust stabilization controller based on guardian maps theory for a hypersonic vehicle. First, guardian maps theory is introduced to explain the constraint relations between open subsets of the complex plane and the eigenvalues of the state matrix of the closed-loop control system. Then, a general control structure based on guardian maps theory is proposed to achieve the expected design demands. Furthermore, a robust stabilization control law based on this general control structure is designed for the longitudinal model of the hypersonic vehicle. Finally, a simulation example is provided to verify the effectiveness of the proposed methods. PMID:24795535
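
    A textbook guardian map for Hurwitz stability (an illustrative example, not the paper's specific construction) is nu(A) = det(A (+) A), where (+) is the Kronecker sum: nu is nonzero on the set of Hurwitz-stable matrices and vanishes exactly on its boundary, so nu -> 0 flags an eigenvalue reaching the imaginary axis:

    ```python
    import numpy as np

    def guardian_map(A):
        """nu(A) = det(A (+) A); zero iff some pair of eigenvalues sums to 0."""
        n = A.shape[0]
        kron_sum = np.kron(A, np.eye(n)) + np.kron(np.eye(n), A)
        return np.linalg.det(kron_sum)

    def is_hurwitz(A):
        return bool(np.all(np.linalg.eigvals(A).real < 0))

    # Sweep a gain k; the eigenvalues of A(k) cross the imaginary axis at k = 1,
    # where the guardian map vanishes.
    for k in [0.5, 1.0, 1.5]:
        A = np.array([[0.0, 1.0], [-2.0, k - 1.0]])   # trace = k - 1
        print(k, round(guardian_map(A), 3), is_hurwitz(A))
    ```

    Designs of the kind described above search for controller parameters that keep such a map bounded away from zero, guaranteeing closed-loop eigenvalues stay in the desired region.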

  3. Classification of PolSAR image based on quotient space theory

    NASA Astrophysics Data System (ADS)

    An, Zhihui; Yu, Jie; Liu, Xiaomeng; Liu, Limin; Jiao, Shuai; Zhu, Teng; Wang, Shaohua

    2015-12-01

    In order to improve classification accuracy, quotient space theory was applied to the classification of polarimetric SAR (PolSAR) images. First, the Yamaguchi decomposition method is adopted to obtain the polarimetric characteristics of the image; at the same time, the Gray-Level Co-occurrence Matrix (GLCM) and Gabor wavelets are used to extract texture features. Second, combining the texture features and polarimetric characteristics, a Support Vector Machine (SVM) classifier is used for initial classification to establish different granularity spaces. Finally, according to quotient space granularity synthesis theory, the different quotient spaces are merged and reasoned over to obtain the comprehensive classification result. The method proposed in this paper is tested with L-band AIRSAR data of San Francisco Bay. The results show that the comprehensive classification result based on quotient space theory is superior to the classification result of a single granularity space.

  5. Recognizing through feeling. A physical and computer simulation based on educational theory.

    PubMed

    Lyons, J; Milton, J

    1999-01-01

    This article focuses on the educational theory underpinning computer-based simulation in professional education. An innovative computer-based physical simulation to facilitate student learning of assessment and palpation skills in midwifery has been developed to prototype stage and preliminary evaluations conducted. The learning experience explicitly builds on the learning and teaching theory--a "conversational framework"--developed by Laurillard. A template incorporating all the dimensions of the Laurillard framework in the learning experience is presented and discussed. It is argued that this template could have wider application especially in clinically based health science courses. The case-based learning environment allows the students to solve problems and make valid clinical judgments. Throughout the learning experiences, students effectively examine a pregnant woman while interrelating their experiences with the academic knowledge of the teacher (a world of "descriptions"). The structure for learning relies on a mechanism for identifying and addressing the misunderstandings students initially hold. The creation of situations of cognitive conflict in the student's world of action (Laurillard's concept of intrinsic feedback) is seen as central to the learning experience. Finally, the article will canvass the issues faced by a project team designing and developing a technology-based educational package around an educational theory. PMID:10341476

  6. Personality and Psychopathology: a Theory-Based Revision of Eysenck’s PEN Model

    PubMed Central

    van Kampen, Dirk

    2009-01-01

    The principal aim of this paper is to investigate whether it is possible to create a personality taxonomy of clinical relevance out of Eysenck’s original PEN model by repairing the various shortcomings that can be noted in Eysenck’s personality theory, particularly in relation to P or Psychoticism. Addressing three approaches that have been followed to answer the question ‘which personality factors are basic?’, arguments are listed to show that particularly the theory-informed approach, originally defended by Eysenck, may lead to scientific progress. However, also noting the many deficiencies in the nomological network surrounding P, the peculiar situation arises that we adhere to Eysenck’s theory-informed methodology, but criticize his theory. These arguments and criticisms led to the replacement of P by three orthogonal and theory-based factors, Insensitivity (S), Orderliness (G), and Absorption (A), that together with the dimensions E or Extraversion and N or Neuroticism, that were retained from Eysenck’s PEN model, appear to give a comprehensive account of the main vulnerability factors in schizophrenia and affective disorders, as well as in other psychopathological conditions. PMID:20498694

  7. Buckling analysis of functionally graded nanobeams based on a nonlocal third-order shear deformation theory

    NASA Astrophysics Data System (ADS)

    Rahmani, O.; Jandaghian, A. A.

    2015-06-01

    In this paper, a general third-order beam theory that accounts for nanostructure-dependent size effects and two-constituent material variation through the nanobeam thickness, i.e., a functionally graded material (FGM) beam, is presented. The material properties of FG nanobeams are assumed to vary through the thickness according to the power law. A detailed derivation of the equations of motion based on Eringen nonlocal theory using Hamilton's principle is presented, and a closed-form solution is derived for the buckling behavior of the new model with various boundary conditions. The nonlocal elasticity theory includes a material length scale parameter that can capture the size effect in a functionally graded material. The proposed model is efficient in predicting the shear effect in FG nanobeams by applying third-order shear deformation theory. The proposed approach is validated by comparing the obtained results with benchmark results available in the literature. In the following, a parametric study is conducted to investigate the influences of the length scale parameter, gradient index, and length-to-thickness ratio on the buckling of FG nanobeams, and the improvement of the nonlocal third-order shear deformation theory over the classical (local) beam model is shown. It is found that the length scale parameter is crucial in studying the stability behavior of nanobeams.
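
    A hedged illustration of the size effect: for a simply supported *Euler* nanobeam with Eringen nonlocality (a much simpler kinematic model than the paper's third-order shear deformation theory), the classical critical load is divided by 1 + mu*(n*pi/L)^2, where mu = (e0*a)^2 is the nonlocal parameter. The numerical values below are illustrative only:

    ```python
    import math

    def nonlocal_buckling_load(E, I, L, mu=0.0, n=1):
        """Critical buckling load of a simply supported nonlocal Euler beam.
        mu = (e0*a)^2 is the Eringen nonlocal parameter; mu = 0 recovers
        the classical Euler load E*I*(n*pi/L)^2."""
        lam = (n * math.pi / L) ** 2
        return E * I * lam / (1.0 + mu * lam)

    E, I, L = 1.0e9, 2.0e-26, 10e-9                      # illustrative values
    local_P = nonlocal_buckling_load(E, I, L)            # mu = 0: classical load
    nonlocal_P = nonlocal_buckling_load(E, I, L, mu=(2e-9) ** 2)
    print(nonlocal_P / local_P)   # < 1: nonlocality lowers the buckling load
    ```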

  8. The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission

    PubMed Central

    Kiani, Mehdi; Ghovanloo, Maysam

    2014-01-01

    Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory. PMID:24683368
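The two-coil reflected-load-theory (RLT) result has a widely used closed form, which can be sketched as follows. This is an illustrative sketch of the standard textbook expression, not a reproduction of the paper's full multi-coil derivation; the parameter values are assumptions.

```python
# Hedged sketch: the common two-coil PTE closed form obtained from
# reflected load theory (RLT). Symbols: k (coupling coefficient),
# q1, q2 (unloaded coil quality factors), ql (load quality factor).
# Example values below are illustrative assumptions.

def pte_two_coil(k, q1, q2, ql):
    """Power transfer efficiency of a 2-coil inductive link via RLT."""
    q2l = q2 * ql / (q2 + ql)      # loaded secondary quality factor
    chi = k * k * q1 * q2l         # figure of merit k^2 * Q1 * Q2L
    return (chi / (1.0 + chi)) * (q2l / ql)

if __name__ == "__main__":
    for k in (0.01, 0.05, 0.1):
        print(f"k={k:.2f}  PTE={pte_two_coil(k, 100.0, 100.0, 50.0):.3f}")
```

The first factor captures the link's figure of merit k²Q1Q2L, and the second the division of received power between the coil loss and the load, consistent with the abstract's claim that efficiency analysis reduces to circuit quantities in steady state.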

  9. Asymmetric Invisibility Cloaking Theory Based on the Concept of Effective Electromagnetic Fields for Photons

    NASA Astrophysics Data System (ADS)

    Amemiya, Tomo; Taki, Masato; Kanazawa, Toru; Arai, Shigehisa

    2014-03-01

    The asymmetric invisibility cloak is a special cloak with unidirectional transparency; that is, a person in the cloak should not be seen from the outside but should be able to see the outside. Existing theories of designing invisibility cloaks cannot be used for asymmetric cloaking because they are based on the transformation optics that uses a Riemannian metric tensor independent of direction. To overcome this problem, we propose introducing directionality into invisibility cloaking. Our theory is based on ``the theory of effective magnetic field for photons'' proposed by Stanford University.[2] To realize asymmetric cloaking, we have extended the Stanford theory to add the concept of ``effective electric field for photons.'' The effective electric and magnetic fields can be generated using a photonic resonator lattice, which is a kind of metamaterial. The Hamiltonian for photons in these fields has a similar form to that of the Hamiltonian for a charged particle in an electromagnetic field. An incident photon therefore experiences a ``Lorentz-like'' and a ``Coulomb-like'' force and shows asymmetric movement depending on its travelling direction. We show the procedure of designing actual invisibility cloaks using the photonic resonator lattice and confirm their operation with the aid of computer simulation. This work was supported in part by the MEXT; JSPS KAKENHI Grant Numbers #24246061, #24656046, #25420321, #25420322.

  10. The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission.

    PubMed

    Kiani, Mehdi; Ghovanloo, Maysam

    2012-09-01

    Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory. PMID:24683368

  11. Simple Models for Airport Delays During Transition to a Trajectory-Based Air Traffic System

    NASA Astrophysics Data System (ADS)

    Brooker, Peter

    It is now widely recognised that a paradigm shift in air traffic control concepts is needed. This requires state-of-the-art innovative technologies, making much better use of the information in the air traffic management (ATM) system. These paradigm shifts go under the names of NextGen in the USA and SESAR in Europe, which inter alia will make dramatic changes to the nature of airport operations. A vital part of moving from an existing system to a new paradigm is the operational implications of the transition process. There would be business incentives for early aircraft fitment; it is generally safer to introduce new technologies gradually; and researchers are already proposing potential transition steps to the new system. Simple queuing theory models are used to establish rough quantitative estimates of the impact of the transition to a more efficient time-based navigational and ATM system. Such models are approximate, but they do offer insight into the broad implications of system change and its significant features. 4D-equipped aircraft in essence have a contract with the airport runway and, in return, they would get priority over any other aircraft waiting for use of the runway. The main operational feature examined here is the queuing delays affecting non-4D-equipped arrivals. These get a reasonable service if the proportion of 4D-equipped aircraft is low, but service can deteriorate markedly for high proportions and become economically unviable. Preventative measures would be to limit the additional growth of 4D-equipped flights and/or to modify their contracts to provide sufficient space for the non-4D-equipped flights to operate without excessive delays. There is a potential for non-Poisson models, for which there is little in the literature, and for more complex models, e.g. grouping a succession of 4D-equipped aircraft as a batch.
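The qualitative effect described here can be illustrated with a standard two-class nonpreemptive priority M/M/1 queue (Cobham's formulas). This is a hedged stand-in for the paper's own models, with made-up runway parameters; it simply shows how the low-priority (non-4D-equipped) delay grows as the share of priority traffic rises.

```python
# Hedged sketch: mean queuing delay (in hours) for the low-priority
# class in a nonpreemptive two-class M/M/1 queue. Assumptions:
# mu = runway service rate, lam = total arrival rate, p4d = fraction
# of 4D-equipped (high-priority) arrivals; exponential service so
# E[S^2] = 2/mu^2 and mean residual work W0 = lam/mu^2.

def low_priority_wait(lam, mu, p4d):
    lam1 = lam * p4d                    # high-priority arrival rate
    lam2 = lam * (1.0 - p4d)            # low-priority arrival rate
    rho1, rho2 = lam1 / mu, lam2 / mu
    w0 = lam / (mu * mu)
    return w0 / ((1.0 - rho1) * (1.0 - rho1 - rho2))

if __name__ == "__main__":
    mu, lam = 40.0, 32.0                # movements per hour, utilisation 0.8
    for p in (0.2, 0.5, 0.8):
        print(f"4D share {p:.0%}: low-priority wait "
              f"{60 * low_priority_wait(lam, mu, p):.1f} min")
```

With these assumed rates the low-priority wait roughly doubles as the 4D-equipped share moves from 20% to 80%, matching the abstract's warning that service for non-equipped arrivals deteriorates markedly at high proportions.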

  12. Membrane-Based Characterization of a Gas Component — A Transient Sensor Theory

    PubMed Central

    Lazik, Detlef

    2014-01-01

    Based on a multi-gas solution-diffusion problem for a dense symmetrical membrane this paper presents a transient theory of a planar, membrane-based sensor cell for measuring gas from both initial conditions: dynamic and thermodynamic equilibrium. Using this theory, the ranges for which previously developed, simpler approaches are valid will be discussed; these approaches are of vital interest for membrane-based gas sensor applications. Finally, a new theoretical approach is introduced to identify varying gas components by arranging sensor cell pairs resulting in a concentration independent gas-specific critical time. Literature data for the N2, O2, Ar, CH4, CO2, H2 and C4H10 diffusion coefficients and solubilities for a polydimethylsiloxane membrane were used to simulate gas specific sensor responses. The results demonstrate the influence of (i) the operational mode; (ii) sensor geometry and (iii) gas matrices (air, Ar) on that critical time. Based on the developed theory the case-specific suitable membrane materials can be determined and both operation and design options for these sensors can be optimized for individual applications. The results of mixing experiments for different gases (O2, CO2) in a gas matrix of air confirmed the theoretical predictions. PMID:24608004

  13. Are node-based and stem-based clades equivalent? Insights from graph theory.

    PubMed

    Martin, Jeremy; Blackburn, David; Wiley, E O

    2010-01-01

    Despite the prominence of "tree-thinking" among contemporary systematists and evolutionary biologists, the biological meaning of different mathematical representations of phylogenies may still be muddled. We compare two basic kinds of discrete mathematical models used to portray phylogenetic relationships among species and higher taxa: stem-based trees and node-based trees. Each model is a tree in the sense that is commonly used in mathematics; the difference between them lies in the biological interpretation of their vertices and edges. Stem-based and node-based trees carry exactly the same information and the biological interpretation of each is similar. Translation between these two kinds of trees can be accomplished by a simple algorithm, which we provide. With the mathematical representation of stem-based and node-based trees clarified, we argue for a distinction between types of trees and types of names. Node-based and stem-based trees contain exactly the same information for naming clades. However, evolutionary concepts, such as monophyly, are represented as different mathematical substructures in the two models. For a given stem-based tree, one should employ stem-based names, whereas for a given node-based tree, one should use node-based names, but applying a node-based name to a stem-based tree is not logical because node-based names cannot exist on a stem-based tree and vice versa. Authors might use node-based and stem-based concepts of monophyly for the same representation of a phylogeny, yet, if so, they must recognize that such a representation differs from the graphical models used for computing in phylogenetic systematics. PMID:21113336

  14. Comparison of inlet suppressor data with approximate theory based on cutoff ratio

    NASA Technical Reports Server (NTRS)

    Rice, E. J.; Heidelberg, L. J.

    1979-01-01

    Inlet suppressor far-field directivity suppression was quantitatively compared with that predicted using an approximate linear design and evaluation method based upon mode cutoff ratio. The experimental data was obtained using a series of cylindrical point-reacting inlet liners on a YF102 engine. The theoretical prediction program is based upon simplified sound propagation concepts derived from exact calculations. These indicate that all of the controlling phenomena can be approximately correlated with mode cutoff ratio, which itself is intimately related to the angles of propagation within the duct. The theory-data comparisons are intended to point out possible deficiencies in the approximate theory which may be corrected. After all theoretical refinements are made, then empirical corrections can be applied.

  15. Comparison of inlet suppressor data with approximate theory based on cutoff ratio

    NASA Technical Reports Server (NTRS)

    Rice, E. J.; Heidelberg, L. J.

    1980-01-01

    This paper represents the initial quantitative comparison of inlet suppressor far-field directivity suppression with that predicted using an approximate liner design and evaluation method based upon mode cutoff ratio. The experimental data was obtained using a series of cylindrical point-reacting inlet liners on an Avco-Lycoming YF102 engine. The theoretical prediction program is based upon simplified sound propagation concepts derived from exact calculations. These indicate that all of the controlling phenomena can be approximately correlated with mode cutoff ratio, which itself is intimately related to the angles of propagation within the duct. The objective of the theory-data comparisons is to point out possible deficiencies in the approximate theory which may be corrected. After all theoretical refinements have been made, then empirical corrections can be applied.

  16. A theory-based logic model for innovation policy and evaluation.

    SciTech Connect

    Jordan, Gretchen B.

    2010-04-01

    Current policy and program rationale, objectives, and evaluation use a fragmented picture of the innovation process. This presents a challenge since in the United States officials in both the executive and legislative branches of government see innovation, whether that be new products or processes or business models, as the solution to many of the problems the country faces. The logic model is a popular tool for developing and describing the rationale for a policy or program and its context. This article sets out to describe generic logic models of both the R&D process and the diffusion process, building on existing theory-based frameworks. Then a combined, theory-based logic model for the innovation process is presented. Examples of the elements of the logic, each a possible leverage point or intervention, are provided, along with a discussion of how this comprehensive but simple model might be useful for both evaluation and policy development.

  17. Design of Flexure-based Precision Transmission Mechanisms using Screw Theory

    SciTech Connect

    Hopkins, J B; Panas, R M

    2011-02-07

    This paper enables the synthesis of flexure-based transmission mechanisms that possess multiple decoupled inputs and outputs of any type (e.g. rotations, translations, and/or screw motions), which are linked by designer-specified transmission ratios. A comprehensive library of geometric shapes is utilized from which every feasible concept that possesses the desired transmission characteristics may be rapidly conceptualized and compared before an optimal concept is selected. These geometric shapes represent the rigorous mathematics of screw theory and uniquely link a body's desired motions to the flexible constraints that enable those motions. This paper's impact is most significant to the design of nano-positioners, microscopy stages, optical mounts, and sensors. A flexure-based microscopy stage was designed, fabricated, and tested to demonstrate the utility of the theory.

  18. Finding theory- and evidence-based alternatives to fear appeals: Intervention Mapping

    PubMed Central

    Kok, Gerjo; Bartholomew, L Kay; Parcel, Guy S; Gottlieb, Nell H; Fernández, María E

    2014-01-01

    Fear arousal—vividly showing people the negative health consequences of life-endangering behaviors—is popular as a method to raise awareness of risk behaviors and to change them into health-promoting behaviors. However, most data suggest that, under conditions of low efficacy, the resulting reaction will be defensive. Instead of applying fear appeals, health promoters should identify effective alternatives to fear arousal by carefully developing theory- and evidence-based programs. The Intervention Mapping (IM) protocol helps program planners to optimize chances for effectiveness. IM describes the intervention development process in six steps: (1) assessing the problem and community capacities, (2) specifying program objectives, (3) selecting theory-based intervention methods and practical applications, (4) designing and organizing the program, (5) planning, adoption, and implementation, and (6) developing an evaluation plan. Authors who used IM indicated that it helped in bringing the development of interventions to a higher level. PMID:24811880

  19. Eight myths on motivating social services workers: theory-based perspectives.

    PubMed

    Latting, J K

    1991-01-01

    A combination of factors has made formal motivational and reward systems rare in human service organizations generally and virtually non-existent in social service agencies. The author reviews eight of these myths by reference to eight motivational theories which refute them: need theory, expectancy theory, feedback theory, equity theory, reinforcement theory, cognitive evaluation theory, goal setting theory, and social influence theory. Although most of these theories have been developed and applied in the private sector, relevant research has also been conducted in social service agencies. The author concludes with a summary of guidelines suggested by the eight theories for motivating human service workers. PMID:10114292

  20. Design of “Magnetic Resonance Type” WPT Systems Based on Filter Theory

    NASA Astrophysics Data System (ADS)

    Awai, Ikuo; Ishizaki, Toshio

    A wireless power transfer system that is made of magnetically coupled resonators is designed for a 0-Ω power source. The design is based on band-pass filter (BPF) theory, which imposes a more restrictive condition than the conventional power-factor condition but gives wider operating bandwidth and higher power transfer efficiency. The present paper derives the condition analytically and shows some useful design examples, including 2-stage as well as 3-stage systems, and a system with a loop coil to transform the load impedance.
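Filter-theory designs of this kind draw on the standard coupled-resonator band-pass synthesis formulas, which can be sketched as follows. This is an illustrative sketch of the textbook relations, not the paper's exact procedure; the Butterworth prototype values and fractional bandwidth are assumptions.

```python
# Hedged sketch: standard coupled-resonator BPF synthesis relations.
# g: lowpass prototype element values g0..g(n+1) (a 2nd-order
# Butterworth set is assumed below); fbw: fractional bandwidth.
# k_{i,i+1} = FBW / sqrt(g_i * g_{i+1}); Qe = g0*g1/FBW (source side).

import math

def coupling_coefficients(g, fbw):
    """Inter-resonator coupling coefficients k_{i,i+1}."""
    return [fbw / math.sqrt(g[i] * g[i + 1]) for i in range(1, len(g) - 2)]

def external_q(g, fbw):
    """External quality factors at the source and load ports."""
    return g[0] * g[1] / fbw, g[-2] * g[-1] / fbw

if __name__ == "__main__":
    g = [1.0, 1.4142, 1.4142, 1.0]      # assumed 2nd-order Butterworth prototype
    fbw = 0.1                           # assumed 10% fractional bandwidth
    print("k :", coupling_coefficients(g, fbw))
    print("Qe:", external_q(g, fbw))
```

In a WPT link the inter-coil magnetic coupling plays the role of k and the source/load matching plays the role of Qe, which is how the bandwidth-oriented condition in the abstract differs from a simple power-factor match.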

  1. Ultrasonic Field Computation into Multilayered Composite Materials Using a Homogenization Method Based on Ray Theory

    NASA Astrophysics Data System (ADS)

    Deydier, S.; Gengembre, N.; Calmon, P.; Mengeling, V.; Pétillon, O.

    2005-04-01

    The simulation of ultrasonic NDT of carbon-fiber-reinforced epoxy composites (CFRP) is an important challenge for the aircraft industry. In a previous article, we proposed to evaluate the field radiated into such components by means of a homogenization method coupled to the pencil model implemented in CIVA software. Following the same goals, an improvement is proposed here through the development of an original homogenization procedure based on ray theory.

  2. A fast algorithm for attribute reduction based on Trie tree and rough set theory

    NASA Astrophysics Data System (ADS)

    Hu, Feng; Wang, Xiao-yan; Luo, Chuan-jiang

    2013-03-01

    Attribute reduction is an important issue in rough set theory. Many efficient algorithms have been proposed; however, few of them can process huge data sets quickly. In this paper, combining the Trie tree, algorithms for computing the positive region of a decision table are proposed. After that, a new algorithm for attribute reduction based on the Trie tree is developed, which can be used to process the attribute reduction of large data sets quickly. Experimental results show its high efficiency.
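The positive-region computation at the heart of such algorithms can be sketched briefly. The paper uses a Trie to group rows by their condition-attribute values; the sketch below uses a plain dict of value tuples as an equivalent, simpler stand-in, and the toy decision table is an assumption for illustration.

```python
# Hedged sketch: positive region POS_C(D) of a decision table.
# Rows are grouped into equivalence classes on the condition
# attributes; a class belongs to the positive region iff all its
# rows share one decision value. (Dict grouping stands in for the
# paper's Trie-based grouping.)

def positive_region(rows, cond_idx, dec_idx):
    """Return sorted indices of rows in the positive region."""
    classes = {}
    for i, row in enumerate(rows):
        key = tuple(row[j] for j in cond_idx)
        classes.setdefault(key, []).append(i)
    pos = []
    for members in classes.values():
        decisions = {rows[i][dec_idx] for i in members}
        if len(decisions) == 1:         # consistent class
            pos.extend(members)
    return sorted(pos)

if __name__ == "__main__":
    # toy table, columns: a, b | d (decision)
    table = [(0, 0, "yes"), (0, 0, "yes"), (0, 1, "no"),
             (1, 1, "yes"), (1, 1, "no")]
    print(positive_region(table, cond_idx=(0, 1), dec_idx=2))
```

An attribute subset is a reduct candidate when dropping it leaves this positive region unchanged, which is why a fast grouping structure (the paper's Trie) dominates the overall running time on large tables.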

  3. Theory of plasticity based on a new invariant of stress tensor. Two-dimensional stress

    NASA Astrophysics Data System (ADS)

    Revuzhenko, A. F.; Mikenina, O. A.

    2015-10-01

    The authors introduce a new stress tensor invariant proportional to the squared intensity of shear stresses divided by the maximum shear stress. The invariant represents a shear stress averaged over three fan areas aligned with the three principal stresses of the stress tensor. The theory is based on this invariant and the associated flow rule. The article gives the equations of the generalized two-dimensional stress state and an analysis of their types. The authors solve an axisymmetric problem on the limit state around a hole.

  4. Cross-media Translation Based on the Mental Image Directed Semantic Theory

    NASA Astrophysics Data System (ADS)

    Hironaka, Daisuke; Yokota, Masao

    In general, it is not always easy for people to communicate with each other comprehensively through limited information media. In such cases, employment of another information medium is very helpful, and therefore cross-media translation is very important during such communication. This paper presents a method and experiment of cross-media translation based on MIDST (Mental Image Directed Semantic Theory), where natural language texts about static positional relations of physical objects are systematically interpreted into 2-D pictures.

  5. Predicting magnetorheological fluid flow behavior using a multiscale kinetic theory-based model

    NASA Astrophysics Data System (ADS)

    Mahboob, Monon; Ahmadkhanlou, Farzad; Kagarise, Christopher; Washington, Gregory; Bechtel, Stephen; Koelling, Kurt

    2009-03-01

    Magnetorheological (MR) fluids have rheological properties, such as viscosity and yield stress, that can be altered by an external magnetic field. The design of novel devices utilizing MR fluid behavior in multi-degree-of-freedom applications requires three-dimensional models characterizing the coupling of magnetic behavior to mechanical behavior in MR fluids. A 3-D MR fluid model based on multiscale kinetic theory is presented. The kinetic theory-based model relates macroscale MR fluid behavior to a first-principles description of magnetomechanical coupling at the microscale. A constitutive relation is also proposed that accounts for the various forces transmitted through the fluid. This model accounts for the viscous drag on the spherical particles as well as Brownian forces. Interparticle forces due to magnetization and external magnetic forces applied to ferrous particles are considered. The tunable rheological properties of the MR fluids are studied using an MR rheological instrument. High- and low-viscosity carrier fluids along with small and large carbonyl iron particles are used to make and study the behavior of four different MR fluids. Experiments measuring steady and dynamic oscillatory shear response under a range of magnetic field strengths are performed. The rheological properties of the MR fluid samples are investigated and compared to the proposed kinetic theory-based model. The storage (G') and loss (G") moduli of the MR fluids are studied as well.

  6. Theory of plasma contactors in ground-based experiments and low Earth orbit

    NASA Technical Reports Server (NTRS)

    Gerver, M. J.; Hastings, Daniel E.; Oberhardt, M. R.

    1990-01-01

    Previous theoretical work on plasma contactors as current collectors has fallen into two categories: collisionless double layer theory (describing space charge limited contactor clouds) and collisional quasineutral theory. Ground-based experiments at low current are well explained by double layer theory, but this theory does not scale well to power generation by electrodynamic tethers in space, since very high anode potentials are needed to draw a substantial ambient electron current across the magnetic field in the absence of collisions (or effective collisions due to turbulence). Isotropic quasineutral models of contactor clouds, extending over a region where the effective collision frequency ν_e exceeds the electron cyclotron frequency ω_ce, have low anode potentials, but would collect very little ambient electron current, much less than the emitted ion current. A new model is presented, for an anisotropic contactor cloud oriented along the magnetic field, with ν_e less than ω_ce. The electron motion along the magnetic field is nearly collisionless, forming double layers in that direction, while across the magnetic field the electrons diffuse collisionally and the potential profile is determined by quasineutrality. Using a simplified expression for ν_e due to ion acoustic turbulence, an analytic solution has been found for this model, which should be applicable to current collection in space. The anode potential is low and the collected ambient electron current can be several times the emitted ion current.

  7. Adapting SAFT-γ perturbation theory to site-based molecular dynamics simulation. I. Homogeneous fluids.

    PubMed

    Ghobadi, Ahmadreza F; Elliott, J Richard

    2013-12-21

    In this work, we aim to develop a version of the Statistical Associating Fluid Theory (SAFT)-γ equation of state (EOS) that is compatible with united-atom force fields, rather than experimental data. We rely on the accuracy of the force fields to provide the relation to experimental data. Although our objective is a transferable theory of interfacial properties for soft and fused heteronuclear chains, we first clarify the details of the SAFT-γ approach in terms of site-based simulations for homogeneous fluids. We show that a direct comparison of Helmholtz free energy to molecular simulation, in the framework of a third order Weeks-Chandler-Andersen perturbation theory, leads to an EOS that takes force field parameters as input and reproduces simulation results for Vapor-Liquid Equilibria (VLE) calculations. For example, saturated liquid density and vapor pressure of n-alkanes ranging from methane to dodecane deviate from those of the Transferable Potential for Phase Equilibria (TraPPE) force field by about 0.8% and 4%, respectively. Similar agreement between simulation and theory is obtained for critical properties and second virial coefficient. The EOS also reproduces simulation data of mixtures with about 5% deviation in bubble point pressure. Extension to inhomogeneous systems and united-atom site types beyond those used in description of n-alkanes will be addressed in succeeding papers. PMID:24359349

  8. Adapting SAFT-γ perturbation theory to site-based molecular dynamics simulation. I. Homogeneous fluids

    SciTech Connect

    Ghobadi, Ahmadreza F.; Elliott, J. Richard

    2013-12-21

    In this work, we aim to develop a version of the Statistical Associating Fluid Theory (SAFT)-γ equation of state (EOS) that is compatible with united-atom force fields, rather than experimental data. We rely on the accuracy of the force fields to provide the relation to experimental data. Although our objective is a transferable theory of interfacial properties for soft and fused heteronuclear chains, we first clarify the details of the SAFT-γ approach in terms of site-based simulations for homogeneous fluids. We show that a direct comparison of Helmholtz free energy to molecular simulation, in the framework of a third order Weeks-Chandler-Andersen perturbation theory, leads to an EOS that takes force field parameters as input and reproduces simulation results for Vapor-Liquid Equilibria (VLE) calculations. For example, saturated liquid density and vapor pressure of n-alkanes ranging from methane to dodecane deviate from those of the Transferable Potential for Phase Equilibria (TraPPE) force field by about 0.8% and 4%, respectively. Similar agreement between simulation and theory is obtained for critical properties and second virial coefficient. The EOS also reproduces simulation data of mixtures with about 5% deviation in bubble point pressure. Extension to inhomogeneous systems and united-atom site types beyond those used in description of n-alkanes will be addressed in succeeding papers.

  9. Coherent reverberation model based on adiabatic normal mode theory in a range dependent shallow water environment

    NASA Astrophysics Data System (ADS)

    Li, Zhenglin; Zhang, Renhe; Li, Fenghua

    2010-09-01

    Ocean reverberation in shallow water is often the predominant background interference in active sonar applications. It is still an open problem in underwater acoustics. In recent years, an oscillation phenomenon of the reverberation intensity, due to the interference of the normal modes, has been observed in many experiments. A coherent reverberation theory has been developed and used to explain this oscillation phenomenon [F. Li et al., Journal of Sound and Vibration, 252(3), 457-468, 2002]. However, the published coherent reverberation theory is for the range independent environment. Following the derivations by F. Li and Ellis [D. D. Ellis, J. Acoust. Soc. Am., 97(5), 2804-2814, 1995], a general reverberation model based on the adiabatic normal mode theory in a range dependent shallow water environment is presented. From this theory the coherent or incoherent reverberation field caused by sediment inhomogeneity and surface roughness can be predicted. Observations of reverberation from the 2001 Asian Sea International Acoustic Experiment (ASIAEX) in the East China Sea are used to test the model. Model/data comparison shows that the coherent reverberation model can predict the experimental oscillation phenomenon of reverberation intensity and the vertical correlation of reverberation very well.
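The adiabatic normal-mode field underlying such reverberation models has a standard textbook form, sketched below in generic notation (this is the usual adiabatic-mode pressure sum, not necessarily the paper's exact expression):

```latex
\[
p(r,z) \approx \frac{i\,e^{-i\pi/4}}{\rho(z_s)\sqrt{8\pi r}}
\sum_m \psi_m(z_s;0)\,\psi_m(z;r)\,
\frac{\exp\!\left(i \int_0^r k_m(r')\,dr'\right)}{\sqrt{k_m(r)}}
\]
```

Each mode keeps its identity as the environment changes slowly with range (the adiabatic approximation), with local mode functions $\psi_m(z;r)$ and eigenvalues $k_m(r)$; interference between the phase terms of different modes is what produces the oscillation of reverberation intensity described in the abstract.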

  10. A Research of Weapon System Storage Reliability Simulation Method Based on Fuzzy Theory

    NASA Astrophysics Data System (ADS)

    Shi, Yonggang; Wu, Xuguang; Chen, Haijian; Xu, Tingxue

    To address the storage reliability analysis of new, complex weapon equipment systems, this paper investigates the methods of fuzzy fault tree analysis and fuzzy system storage reliability simulation, discusses the approach of treating a weapon system as a fuzzy system, and studies weapon system storage reliability on the basis of fuzzy theory, providing a storage reliability research method for new, complex weapon equipment systems. As an example, the fuzzy fault tree of one type of missile control instrument is built up based on function analysis, and the fuzzy system storage reliability simulation method is used to analyze the storage reliability index of the control instrument.
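A common building block of fuzzy fault tree analysis is propagating triangular fuzzy failure probabilities through AND/OR gates. The sketch below shows the usual component-wise approximation; it is illustrative only, with assumed basic-event numbers, and is not the paper's specific missile-control-instrument model.

```python
# Hedged sketch: triangular fuzzy numbers (low, mode, high) pushed
# through fault tree gates, component-wise.
#   AND gate: product of the event probabilities.
#   OR gate:  1 - (1 - p)(1 - q).
# Basic-event values below are assumptions for illustration.

def fuzzy_and(p, q):
    """AND gate for two triangular fuzzy probabilities."""
    return tuple(a * b for a, b in zip(p, q))

def fuzzy_or(p, q):
    """OR gate for two triangular fuzzy probabilities."""
    return tuple(1.0 - (1.0 - a) * (1.0 - b) for a, b in zip(p, q))

if __name__ == "__main__":
    p1 = (0.01, 0.02, 0.04)    # basic event 1 (assumed)
    p2 = (0.005, 0.01, 0.02)   # basic event 2 (assumed)
    print("AND:", fuzzy_and(p1, p2))
    print("OR: ", fuzzy_or(p1, p2))
```

Evaluating the tree bottom-up with these gate rules yields a fuzzy top-event probability, from which a storage reliability index can be read off, for example by defuzzifying the resulting triangle.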

  11. Designing a Project-based Learning in a University with New Theory of Learning

    NASA Astrophysics Data System (ADS)

    Mima, Noyuri

    New learning theory indicates that “learning” is a process of interaction occurring in social relationships within a community, involving a multitude of things beyond any single individual. From this point of view, “project-based learning” is one of the new methods of teaching and learning at university. The method of project-based learning includes team learning, team teaching, portfolio assessment, open space, and faculty development. This paper discusses the potential of a university to become a learning community through this method, along with results of the educational practice at Future University-Hakodate.

  12. An elastodynamic discrete laminated plate theory based on an assumed through-the-thickness stress distribution

    NASA Astrophysics Data System (ADS)

    Schoeppner, G. A.; Wolfe, W. E.; Sandhu, R. S.

    1993-04-01

    A static laminated plate theory based on an assumed piecewise linear through-the-thickness in-plane stress distribution has been extended to include inertia effects. Based on this in-plane stress distribution assumption, out-of-plane shear and normal stress component distributions were derived from the three-dimensional equations of motion, resulting in six non-zero stress components. Hamilton's variational principle was used to derive the plate equations of motion, the plate constitutive relationships and the interface continuity equations.

  13. Web-Based Learning Environment: A Theory-Based Design Process for Development and Evaluation

    ERIC Educational Resources Information Center

    Nam, Chang S.; Smith-Jackson, Tonya L.

    2007-01-01

    Web-based courses and programs have increasingly been developed by many academic institutions, organizations, and companies worldwide due to their benefits for both learners and educators. However, many of the developmental approaches lack two important considerations needed for implementing Web-based learning applications: (1) integration of the…

  14. Testing a Theory-Based Mobility Monitoring Protocol Using In-Home Sensors: A Feasibility Study

    PubMed Central

    Reeder, Blaine; Chung, Jane; Lazar, Amanda; Joe, Jonathan; Demiris, George; Thompson, Hilaire J.

    2014-01-01

    Mobility is a key factor in the performance of many everyday tasks required for independent living as a person grows older. The purpose of this mixed methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assessing the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3 month and 6 month visits (examples: FES, GDS-SF, Mini-cog). Semi-structured interviews to characterize acceptability of the technology were conducted at 3 month and 6 month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation. PMID:23938159

  15. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    NASA Astrophysics Data System (ADS)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number-of-violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
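
    The tail estimator underlying such EVT models can be sketched in a few lines. The following is a minimal peaks-over-threshold value-at-risk calculation, not the paper's hybrid model: it uses a fixed empirical-quantile threshold where the paper derives the threshold from wavelets, and the function name `pot_var` and its parameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import genpareto

def pot_var(losses, q=0.99, u_quantile=0.90):
    """Peaks-over-threshold VaR: fit a generalized Pareto distribution (GPD)
    to excesses over a threshold u, then invert the standard tail estimator
    for the q-quantile of the loss distribution."""
    losses = np.asarray(losses)
    u = np.quantile(losses, u_quantile)            # fixed threshold (wavelet-based in the paper)
    excesses = losses[losses > u] - u
    xi, _, beta = genpareto.fit(excesses, floc=0)  # shape xi, scale beta
    n, n_u = len(losses), len(excesses)
    # Tail estimator: VaR_q = u + (beta/xi) * (((n/n_u) * (1 - q))**(-xi) - 1)
    return u + beta / xi * (((n / n_u) * (1 - q)) ** (-xi) - 1)

rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=5000)           # heavy-tailed synthetic "losses"
print(pot_var(losses, q=0.99))
```

    The estimator is monotone in q for either sign of the fitted shape parameter, so higher confidence levels always yield larger VaR.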

  16. Optimal control theory for quantum-classical systems: Ehrenfest molecular dynamics based on time-dependent density-functional theory

    NASA Astrophysics Data System (ADS)

    Castro, A.; Gross, E. K. U.

    2014-01-01

    We derive the fundamental equations of an optimal control theory for systems containing both quantum electrons and classical ions. The system is modeled with Ehrenfest dynamics, a non-adiabatic variant of molecular dynamics. The general formulation, which requires the fully correlated many-electron wavefunction, can be simplified by making use of time-dependent density-functional theory. In that case, the optimal control equations require some modifications, which we provide. The abstract general formulation is complemented with the simple example of the H_2^+ molecule in the presence of a laser field.

  17. Promoting fruit and vegetable consumption. Testing an intervention based on the theory of planned behaviour.

    PubMed

    Kothe, E J; Mullan, B A; Butow, P

    2012-06-01

    This study evaluated the efficacy of a theory of planned behaviour (TPB) based intervention to increase fruit and vegetable consumption. The extent to which fruit and vegetable consumption and change in intake could be explained by the TPB was also examined. Participants were randomly assigned to two levels of intervention frequency matched for intervention content (low frequency n=92, high frequency n=102). Participants received TPB-based email messages designed to increase fruit and vegetable consumption; the messages targeted attitude, subjective norm and perceived behavioural control (PBC). Baseline and post-intervention measures of TPB variables and behaviour were collected. Across the entire study cohort, fruit and vegetable consumption increased by 0.83 servings/day between baseline and follow-up. Intention, attitude, subjective norm and PBC also increased (p<.05). The TPB successfully modelled fruit and vegetable consumption at both time points but not behaviour change. The increase in fruit and vegetable consumption is a promising preliminary finding for those primarily interested in increasing fruit and vegetable consumption. However, those interested in theory development may have concerns about the use of this model to explain behaviour change in this context. More high quality experimental tests of the theory are needed to confirm this result. PMID:22349778

  18. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed focusing on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time as compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.

  19. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.

    1992-01-01

    The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce it.

  20. Slender-Body Theory Based On Approximate Solution of the Transonic Flow Equation

    NASA Technical Reports Server (NTRS)

    Spreiter, John R.; Alksne, Alberta Y.

    1959-01-01

    Approximate solutions of the nonlinear equations of the small disturbance theory of transonic flow are found for the pressure distribution on pointed slender bodies of revolution for flows with free-stream Mach number 1, and for flows that are either purely subsonic or purely supersonic. These results are obtained by applying a method based on local linearization that was introduced recently in the analysis of similar problems in two-dimensional flows. The theory is developed for bodies of arbitrary shape, and specific results are given for cone-cylinders and for parabolic-arc bodies at zero angle of attack. All results are compared either with existing theoretical results or with experimental data.

  1. Formula for the rms blur circle radius of Wolter telescope based on aberration theory

    NASA Technical Reports Server (NTRS)

    Shealy, David L.; Saha, Timo T.

    1990-01-01

    A formula for the rms blur circle for Wolter telescopes has been derived using the transverse ray aberration expressions of Saha (1985), Saha (1984), and Saha (1986). The resulting formula for the rms blur circle radius over an image plane and a formula for the surface of best focus based on third-, fifth-, and seventh-order aberration theory predict results in good agreement with exact ray tracing. It has also been shown that one of the two terms in the empirical formula of VanSpeybroeck and Chase (1972) for the rms blur circle radius of a Wolter I telescope can be justified by the aberration theory results. Numerical results are given comparing the rms blur radius and the surface of best focus vs the half-field angle computed by skew ray tracing and from analytical formulas for grazing incidence Wolter I-II telescopes and a normal incidence Cassegrain telescope.

  2. Dynamically Incremental K-means++ Clustering Algorithm Based on Fuzzy Rough Set Theory

    NASA Astrophysics Data System (ADS)

    Li, Wei; Wang, Rujing; Jia, Xiufang; Jiang, Qing

    The classic K-means++ clustering algorithm applies only to static data, so a dynamically incremental K-means++ clustering algorithm (DK-Means++) based on fuzzy rough set theory is presented in this paper. First, in the DK-Means++ clustering algorithm, the similarity measure is improved by weighting attributes according to their importance, with the attribute set reduced on the basis of rough fuzzy set theory. Second, new data are matched against granules already clustered by the K-means++ algorithm, and only occasionally are new data re-clustered globally by the classic K-means++ algorithm. In this way, re-clustering the entire dynamic data set each time new data arrive is avoided, which improves the efficiency of clustering. Our experiments show that the DK-Means++ algorithm deals with the clustering of dynamically incremental data objectively and efficiently.
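
    The seeding and incremental-matching ideas can be sketched as follows. This is a minimal illustration, not the authors' DK-Means++: the fuzzy rough set attribute weighting is omitted, and `kmeanspp_seed`, `assign_incremental`, and the tolerance `tol` are assumptions of this sketch.

```python
import numpy as np

def kmeanspp_seed(X, k, rng):
    """Classic k-means++ seeding: each new centre is drawn with probability
    proportional to the squared distance to the nearest centre chosen so far."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min(((X[:, None, :] - np.array(centers)[None, :, :]) ** 2).sum(-1), axis=1)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.array(centers)

def assign_incremental(x_new, centers, tol):
    """Incremental step: match a new point to an existing granule (cluster)
    if it is close enough; otherwise flag it for a global re-clustering."""
    d = np.linalg.norm(centers - x_new, axis=1)
    j = int(np.argmin(d))
    return j, bool(d[j] <= tol)

rng = np.random.default_rng(1)
# Two well-separated synthetic blobs
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
centers = kmeanspp_seed(X, 2, rng)
print(assign_incremental(np.array([5.1, 4.9]), centers, tol=2.0))
```

    Only points that fail the tolerance test would trigger a global re-run of K-means++, which is the source of the claimed efficiency gain.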

  3. Social judgment theory based model on opinion formation, polarization and evolution

    NASA Astrophysics Data System (ADS)

    Chau, H. F.; Wong, C. Y.; Chow, F. K.; Fung, Chi-Hang Fred

    2014-12-01

    The dynamical origin of opinion polarization in the real world is an interesting topic that physical scientists may help to understand. To properly model the dynamics, the theory must be fully compatible with findings by social psychologists on microscopic opinion change. Here we introduce a generic model of opinion formation with homogeneous agents based on the well-known social judgment theory in social psychology by extending a similar model proposed by Jager and Amblard. The agents’ opinions will eventually cluster around extreme and/or moderate opinions forming three phases in a two-dimensional parameter space that describes the microscopic opinion response of the agents. The dynamics of this model can be qualitatively understood by mean-field analysis. More importantly, first-order phase transition in opinion distribution is observed by evolving the system under a slow change in the system parameters, showing that punctuated equilibria in public opinion can occur even in a fully connected social network.
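
    A minimal homogeneous-agent update of this kind (attraction within a latitude of acceptance, repulsion beyond a latitude of rejection) might look like the sketch below. The parameter names `u`, `t`, `mu` and the pairwise rule are a simplified Jager-Amblard-style assumption, not the authors' exact model.

```python
import numpy as np

def step(x, u, t, mu, rng):
    """One random pairwise update in a social-judgment-theory-style model:
    opinions closer than the latitude of acceptance u attract (assimilation),
    opinions farther than the latitude of rejection t repel (contrast)."""
    i, j = rng.choice(len(x), size=2, replace=False)
    d = x[j] - x[i]
    if abs(d) < u:         # assimilation: move toward the other opinion
        x[i] += mu * d
    elif abs(d) > t:       # contrast: move away from the other opinion
        x[i] -= mu * d
    return np.clip(x, -1.0, 1.0, out=x)   # opinions confined to [-1, 1]

rng = np.random.default_rng(42)
x = rng.uniform(-1, 1, 200)               # initial opinions
for _ in range(20000):
    step(x, u=0.4, t=1.2, mu=0.1, rng=rng)
print(x.min(), x.max())
```

    Sweeping `u` and `t` slowly, as the paper does, is how one would probe the phase structure and the first-order transition in the opinion distribution.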

  4. Free vibration analysis of size-dependent cracked microbeam based on the modified couple stress theory

    NASA Astrophysics Data System (ADS)

    Sourki, R.; Hoseini, S. A. H.

    2016-04-01

    This paper investigates the free transverse vibration of a cracked microbeam based on the modified couple stress theory within the framework of Euler-Bernoulli beam theory. The governing equation and the related boundary conditions are derived by using Hamilton's principle. The cracked beam is modeled by dividing the beam into two segments connected by a rotational spring located at the cracked section. This model accounts for the additional strain energy caused by the crack and introduces a discontinuity in the bending slope. In this investigation, the influence of crack position, crack severity, the material length scale parameter, and Poisson's ratio on the natural frequencies is studied. A comparison with previously published studies shows good agreement. The results illustrate that these parameters play a significant role in the dynamic behavior of the microbeam.

  5. An English Vocabulary Learning System Based on Fuzzy Theory and Memory Cycle

    NASA Astrophysics Data System (ADS)

    Wang, Tzone I.; Chiu, Ti Kai; Huang, Liang Jun; Fu, Ru Xuan; Hsieh, Tung-Cheng

    This paper proposes an English Vocabulary Learning System based on fuzzy theory and the memory cycle theory to help learners memorize vocabulary easily. Using fuzzy inferences and personal memory cycles, the system finds the article that best suits a learner. After reading an article, the learner takes a quiz that reinforces memory of the vocabulary in the article. Earlier research updates the memory cycles of newly learned vocabulary only from explicit responses (e.g., quiz results); in addition to that approach, this paper proposes a methodology that also implicitly modifies the memory cycles of learned words. By intensively reading articles recommended by this approach, a learner learns new words quickly and reviews learned words implicitly as well, so the learner's vocabulary improves efficiently.

  6. Characterization of degeneration process in combustion instability based on dynamical systems theory

    NASA Astrophysics Data System (ADS)

    Gotoda, Hiroshi; Okuno, Yuta; Hayashi, Kenta; Tachibana, Shigeru

    2015-11-01

    We present a detailed study on the characterization of the degeneration process in combustion instability based on dynamical systems theory. We deal with combustion instability in a lean premixed-type gas-turbine model combustor, one of the fundamentally and practically important combustion systems. The dynamic behavior of combustion instability in close proximity to lean blowout is dominated by a stochastic process and transits to periodic oscillations created by thermoacoustic combustion oscillations via chaos with increasing equivalence ratio [Chaos 21, 013124 (2011), 10.1063/1.3563577; Chaos 22, 043128 (2012), 10.1063/1.4766589]. Thermoacoustic combustion oscillations degenerate with a further increase in the equivalence ratio, and the dynamic behavior leads to chaotic fluctuations via quasiperiodic oscillations. The concept of dynamical systems theory presented here allows us to clarify the nonlinear characteristics hidden in complex combustion dynamics.

  7. A 3-D elasticity theory based model for acoustic radiation from multilayered anisotropic plates.

    PubMed

    Shen, C; Xin, F X; Lu, T J

    2014-05-01

    A theoretical model built upon three-dimensional elasticity theory is developed to investigate the acoustic radiation from multilayered anisotropic plates subjected to a harmonic point force excitation. Fourier transform technique and stationary phase method are combined to predict the far-field radiated sound pressure of one-side water immersed plate. Compared to equivalent single-layer plate models, the present model based on elasticity theory can differentiate radiated sound pressure between dry-side and wet-side excited cases, as well as discrepancies induced by different layer sequences for multilayered anisotropic plates. These results highlight the superiority of the present theoretical model especially for handling multilayered anisotropic structures. PMID:24815294

  8. Characterization of degeneration process in combustion instability based on dynamical systems theory.

    PubMed

    Gotoda, Hiroshi; Okuno, Yuta; Hayashi, Kenta; Tachibana, Shigeru

    2015-11-01

    We present a detailed study on the characterization of the degeneration process in combustion instability based on dynamical systems theory. We deal with combustion instability in a lean premixed-type gas-turbine model combustor, one of the fundamentally and practically important combustion systems. The dynamic behavior of combustion instability in close proximity to lean blowout is dominated by a stochastic process and transits to periodic oscillations created by thermoacoustic combustion oscillations via chaos with increasing equivalence ratio [Chaos 21, 013124 (2011); Chaos 22, 043128 (2012)]. Thermoacoustic combustion oscillations degenerate with a further increase in the equivalence ratio, and the dynamic behavior leads to chaotic fluctuations via quasiperiodic oscillations. The concept of dynamical systems theory presented here allows us to clarify the nonlinear characteristics hidden in complex combustion dynamics. PMID:26651761

  9. Research on three-dimensional computer-generated holographic algorithm based on conformal geometry theory

    NASA Astrophysics Data System (ADS)

    Zhang, Yaping; Zhang, Jianqiang; Chen, Wei; Zhang, Jialing; Wang, Peng; Xu, Wei

    2013-11-01

    A novel three-dimensional computer-generated holographic algorithm based on conformal geometry theory is introduced. First, the three-dimensional object is transformed into a corresponding two-dimensional image via a conformal transformation, drawing on conformal geometry theory from mathematics, and the Fresnel computer-generated hologram of the two-dimensional image is then produced. During the reconstruction process, the three-dimensional reconstruction image of the original object is obtained by means of conformal coordinate transformation. The results show that the algorithm runs quickly and reconstructs well, providing a useful reference for research on three-dimensional computer-generated holographic algorithms.

  10. Theory-based approaches to understanding public emergency preparedness: implications for effective health and risk communication.

    PubMed

    Paek, Hye-Jin; Hilyard, Karen; Freimuth, Vicki; Barge, J Kevin; Mindlin, Michele

    2010-06-01

    Recent natural and human-caused disasters have awakened public health officials to the importance of emergency preparedness. Guided by health behavior and media effects theories, the analysis of a statewide survey in Georgia reveals that self-efficacy, subjective norm, and emergency news exposure are positively associated with the respondents' possession of emergency items and their stages of emergency preparedness. Practical implications suggest less focus on demographics as the sole predictor of emergency preparedness and more comprehensive measures of preparedness, including both a person's cognitive stage of preparedness and checklists of emergency items on hand. We highlight the utility of theory-based approaches for understanding and predicting public emergency preparedness as a way to enable more effective health and risk communication. PMID:20574880

  11. Control theory based airfoil design for potential flow and a finite volume discretization

    NASA Technical Reports Server (NTRS)

    Reuther, J.; Jameson, A.

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.

  12. [Mathematical exploration of essence of herbal properties based on "Three-Elements" theory].

    PubMed

    Jin, Rui; Zhao, Qian; Zhang, Bing

    2014-10-01

    Herbal property theory of traditional Chinese medicines provides the theoretical guidance for authentication of medicinal plants, herborization, preparation of herbal medicines for decoction, and clinical application, with important theoretical value and practical significance. Our research team proposed the "three-element" theory of herbal properties for the first time, conducted a study using combined methods of philology, chemistry, pharmacology and mathematics, and drew the conclusion that herbal properties are the chemical-composition-based comprehensive expression of complex, multi-level (positive/negative) biological effects in a specific organism state. In this paper, the researchers present a systematic mathematical analysis in four aspects: the correlation between herbal properties and chemical composition factors, the correlation between herbal properties and the organism state factor, the correlation between herbal properties and the biological effect factor, and the integrated study of the three elements. A future outlook is proposed as a reference for mathematical studies and mathematical analysis of herbal properties. PMID:25751963

  13. A continuum thermal stress theory for crystals based on interatomic potentials

    NASA Astrophysics Data System (ADS)

    Liu, XiaoLei; Tang, QiHeng; Wang, TzuChiang

    2014-01-01

    This paper presents a new continuum thermal stress theory for crystals based on interatomic potentials. The effect of finite temperature is taken into account via a harmonic model. An EAM potential for copper is adopted in this paper and verified by computing the effect of the temperature on the specific heat, coefficient of thermal expansion and lattice constant. Then we calculate the elastic constants of copper at finite temperature. The calculation results are in good agreement with experimental data. The thermal stress theory is applied to an anisotropic crystal graphite, in which the Brenner potential is employed. Temperature dependence of the thermodynamic properties, lattice constants and thermal strains for graphite is calculated. The calculation results are also in good agreement with experimental data.

  14. A second order analytical atmospheric drag theory based on the TD88 thermospheric density model

    NASA Astrophysics Data System (ADS)

    El-Salam, F. A. Abd.; Sehnal, L.

    2004-11-01

    A second order atmospheric drag theory based on the TD88 thermospheric density model is constructed. It is developed to second order in terms of the TD88 small parameters K_{n,j}. The short periodic perturbations of all orbital elements are evaluated. The secular perturbations of the semi-major axis and of the eccentricity are obtained. The theory is applied to determine the lifetime of the satellite ROHINI (1980 62A) and to predict the lifetime of the microsatellite MIMOSA. The secular perturbations of the nodal longitude and of the argument of perigee due to the Earth’s gravity are taken into account up to second order in the Earth’s oblateness.

  15. Scale effects on information theory-based measures applied to streamflow patterns in two rural watersheds

    NASA Astrophysics Data System (ADS)

    Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.

    2012-01-01

    Understanding streamflow patterns in space and time is important for improving flood and drought forecasting, water resources management, and predictions of ecological changes. The objectives of this work are (a) to characterize the spatial and temporal patterns of streamflow using information theory-based measures at two thoroughly monitored agricultural watersheds located in different hydroclimatic zones with similar land use, and (b) to elucidate and quantify temporal and spatial scale effects on those measures. We selected two USDA experimental watersheds as case studies: the Little River experimental watershed (LREW) in Tifton, Georgia and the Sleepers River experimental watershed (SREW) in North Danville, Vermont. Both watersheds possess several nested sub-watersheds and more than 30 years of continuous records of precipitation and streamflow. Information content measures (metric entropy and mean information gain) and complexity measures (effective measure complexity and fluctuation complexity) were computed from binary encodings of 5-year streamflow and precipitation time series. We quantified streamflow patterns using probabilities of joint or sequential appearances of the binary symbol sequences. The results illustrate that information content measures of the streamflow time series are much smaller than those of the precipitation data, and the streamflow data also exhibit higher complexity, suggesting that the watersheds effectively act as filters of the precipitation information, which leads to the observed additional complexity in streamflow measures. Correlation coefficients between the information theory-based measures and time intervals are close to 0.9, demonstrating the significance of temporal scale effects on streamflow patterns. Moderate spatial scale effects on streamflow patterns are observed, with absolute values of correlation coefficients between the measures and sub-watershed area varying from 0.2 to 0.6 in the two watersheds. We conclude that temporal effects must be evaluated and accounted for when information theory-based methods are used for performance evaluation and comparison of hydrological models.
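
    The metric entropy of a binary-encoded series, one of the information content measures named above, can be sketched as follows. This is an illustrative implementation: the median-based encoding threshold, the word length `L`, and the function name are assumptions of this sketch, and the paper's mean information gain and complexity measures are not reproduced.

```python
import numpy as np
from collections import Counter

def metric_entropy(series, L=4):
    """Metric entropy: Shannon entropy of overlapping binary words of
    length L, normalised by L (0 for a constant series, ~1 for pure noise)."""
    s = (np.asarray(series) > np.median(series)).astype(int)  # binary encoding
    words = [tuple(s[i:i + L]) for i in range(len(s) - L + 1)]
    p = np.array(list(Counter(words).values()), dtype=float)
    p /= p.sum()                                              # word probabilities
    return float(-(p * np.log2(p)).sum() / L)

rng = np.random.default_rng(7)
print(metric_entropy(rng.uniform(size=5000)))                 # noise: close to 1
print(metric_entropy(np.sin(np.linspace(0, 20 * np.pi, 5000))))  # periodic: low
```

    A smooth, periodic signal concentrates probability on a few words and scores low, while noise spreads probability over all words and scores near 1, which matches the streamflow-versus-precipitation contrast reported above.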

  16. An Implicational View of Self-Healing and Personality Change Based on Gendlin's Theory of Experiencing.

    ERIC Educational Resources Information Center

    Bohart, Arthur C.

    There is relatively little theory on how psychotherapy clients self-heal since most theories of therapy stress the magic of the therapist's interventions. Of the theories that exist, this paper briefly discusses Carl Rogers' theory of self-actualization; and the dialectical theories of Greenberg and his colleagues, Jenkins, and Rychlak. Gendlin's…

  17. A Bifactor Multidimensional Item Response Theory Model for Differential Item Functioning Analysis on Testlet-Based Items

    ERIC Educational Resources Information Center

    Fukuhara, Hirotaka; Kamata, Akihito

    2011-01-01

    A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…

  18. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    SciTech Connect

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
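
    The sampling-based propagation strategy can be sketched for a one-input model: each focal element (an input interval carrying a basic probability assignment) is pushed through the model by sampling, and the belief and plausibility of a target output interval are accumulated from the BPAs. This is a sketch under strong simplifications (a single uncertain input, interval focal elements, sampled min/max as the image bounds); all names are hypothetical.

```python
import numpy as np

def belief_plausibility(focal, model, target, n=200, rng=None):
    """Sampling-based propagation of an evidence-theory input specification.
    focal: list of ((a, b), m) pairs, an input interval with BPA weight m.
    Belief counts focal elements whose image lies inside the target interval;
    plausibility counts those whose image merely intersects it."""
    rng = rng or np.random.default_rng(0)
    lo_t, hi_t = target
    bel = pl = 0.0
    for (a, b), m in focal:
        y = model(rng.uniform(a, b, n))      # sample the input interval
        y_lo, y_hi = y.min(), y.max()        # approximate image interval
        if lo_t <= y_lo and y_hi <= hi_t:    # image inside target: supports belief
            bel += m
        if y_hi >= lo_t and y_lo <= hi_t:    # image overlaps target: plausibility
            pl += m
    return bel, pl

# Three focal elements with BPA weights summing to 1 (illustrative numbers)
focal = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]
bel, pl = belief_plausibility(focal, model=lambda x: x**2, target=(0.0, 4.0))
print(bel, pl)
```

    Belief is always bounded above by plausibility, and the gap between them is the epistemic uncertainty that a single probability distribution would hide.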

  19. An Alienation-Based Framework for Student Experience in Higher Education: New Interpretations of Past Observations in Student Learning Theory

    ERIC Educational Resources Information Center

    Barnhardt, Bradford; Ginns, Paul

    2014-01-01

    This article orients a recently proposed alienation-based framework for student learning theory (SLT) to the empirical basis of the approaches to learning perspective. The proposed framework makes new macro-level interpretations of an established micro-level theory, across three levels of interpretation: (1) a context-free psychological state…

  20. Simulating Single Word Processing in the Classic Aphasia Syndromes Based on the Wernicke-Lichtheim-Geschwind Theory

    ERIC Educational Resources Information Center

    Weems, Scott A.; Reggia, James A.

    2006-01-01

    The Wernicke-Lichtheim-Geschwind (WLG) theory of the neurobiological basis of language is of great historical importance, and it continues to exert a substantial influence on most contemporary theories of language in spite of its widely recognized limitations. Here, we suggest that neurobiologically grounded computational models based on the WLG

  1. Motivational Measure of the Instruction Compared: Instruction Based on the ARCS Motivation Theory vs Traditional Instruction in Blended Courses

    ERIC Educational Resources Information Center

    Colakoglu, Ozgur M.; Akdemir, Omur

    2012-01-01

    The ARCS Motivation Theory was proposed to guide instructional designers and teachers who develop their own instruction to integrate motivational design strategies into the instruction. There is a lack of literature supporting the idea that instruction for blended courses if designed based on the ARCS Motivation Theory provides different…

  2. Evaluating Art Studio Courses at Sultan Qaboos University in Light of the Discipline Based Art Education Theory

    ERIC Educational Resources Information Center

    Al-Amri, Mohammed

    2010-01-01

    Discipline-Based Art Education (DBAE), a theory developed in the USA, has been influential but also used in Art Education institutions world-wide. One of its stated goals was to develop the quality of teaching art education. Today, it is used as a theory for identifying and assessing good practices in the field of Art Education. The purpose of…

  3. How Does an Activity Theory Model Help to Know Better about Teaching with Electronic-Exercise-Bases?

    ERIC Educational Resources Information Center

    Abboud-Blanchard, Maha; Cazes, Claire

    2012-01-01

    The research presented in this paper relies on Activity Theory and particularly on Engestrom's model, to better understand the use of Electronic-Exercise-Bases (EEB) by mathematics teachers. This theory provides a holistic approach to illustrate the complexity of the EEB integration. The results highlight reasons and ways of using EEB and show…

  4. The Effect of Training in Gifted Education on Elementary Classroom Teachers' Theory-Based Reasoning about the Concept of Giftedness

    ERIC Educational Resources Information Center

    Miller, Erin Morris

    2009-01-01

    Classroom teachers play an important role in the identification of gifted students through teacher recommendations and referrals. This study is an investigation of teachers' theories of giftedness using methods adapted from those used to study theory-based reasoning in categorization research. In general, the teachers in this study focused on…

  6. The Conceptual Mechanism for Viable Organizational Learning Based on Complex System Theory and the Viable System Model

    ERIC Educational Resources Information Center

    Sung, Dia; You, Yeongmahn; Song, Ji Hoon

    2008-01-01

    The purpose of this research is to explore the possibility of viable learning organizations based on identifying viable organizational learning mechanisms. Two theoretical foundations, complex system theory and viable system theory, have been integrated to provide the rationale for building the sustainable organizational learning mechanism. The…

  7. An Alienation-Based Framework for Student Experience in Higher Education: New Interpretations of Past Observations in Student Learning Theory

    ERIC Educational Resources Information Center

    Barnhardt, Bradford; Ginns, Paul

    2014-01-01

    This article orients a recently proposed alienation-based framework for student learning theory (SLT) to the empirical basis of the approaches to learning perspective. The proposed framework makes new macro-level interpretations of an established micro-level theory, across three levels of interpretation: (1) a context-free psychological state…

  8. Volume Averaging Theory (VAT) based modeling and closure evaluation for fin-and-tube heat exchangers

    NASA Astrophysics Data System (ADS)

    Zhou, Feng; Catton, Ivan

    2012-10-01

    A fin-and-tube heat exchanger was modeled based on Volume Averaging Theory (VAT) in such a way that the details of the original structure were replaced by their averaged counterparts, so that the VAT-based governing equations can be efficiently solved for a wide range of parameters. To complete the VAT-based model, proper closure is needed, which is related to a local friction factor and a heat transfer coefficient of a Representative Elementary Volume (REV). The terms in the closure expressions are complex, and relating experimental data to the closure terms is sometimes difficult. In this work we use CFD to evaluate the rigorously derived closure terms over one of the selected REVs. The objective is to show how heat exchangers can be modeled as porous media and how CFD can be used in place of a detailed, often formidable, experimental effort to obtain closure for the model.

  9. Coal bed 3D modeling based on set theory and unparallel ATP

    NASA Astrophysics Data System (ADS)

    Zhu, Qingwei

    2008-10-01

    Although spatial objects in our world have an intrinsically three-dimensional (3D) nature, 3D data modeling and 3D data management have so far been neglected in spatial database systems and Geographical Information Systems, which map geometric data mainly to two-dimensional abstractions. Increasingly, however, the third dimension is becoming relevant to application domains, and large volumes of 3D data require treatment in a database context so that they can be represented, queried, and manipulated efficiently. After detailed research on the mechanism of modeling the geological body, a new composite data model is put forward based on the combination of set theory (ST) and Unparallel Analogical Triangular Prisms (UATP). In this model, a spatial entity is decomposed into five fundamental data types: 3D point (3DP), 3D line (3DL), 3D sample surface (3DSS), 3D surface (3DS), and 3D volume (3DV). Nine associated data structures are put forward: node, TIN edge, side edge, arc edge, TIN surface, sample surface, quadrangle, UATP, and 3DSVC. On this basis, a system for modeling and simulating spatial entities is designed, with a fault and a mining roadway presented as examples. The paper investigates the complex inherent features of 3D data and presents an abstract, formal data model, called ST, comprising a set of three-dimensional spatial data types together with a collection of geometric set operations. The results show that the data model based on set theory and UATP improves the speed and accuracy of the modeling process. The main point of this paper is thus the reconstruction of 3D geological models; other questions, such as topological relations and data volumes, remain key topics for further study.

  10. A Feature Extraction Method Based on Information Theory for Fault Diagnosis of Reciprocating Machinery

    PubMed Central

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021
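    The Hilbert-transform envelope detection that the paper benchmarks against can be sketched in a few lines; the synthetic amplitude-modulated signal below is an invented stand-in for a vibration measurement, not the paper's data.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    # Synthetic amplitude-modulated signal standing in for a vibration measurement:
    # a slow modulation (the "fault signature") riding on a fast carrier.
    fs = 1000.0
    t = np.arange(0, 1.0, 1 / fs)
    modulation = 1.0 + 0.5 * np.sin(2 * np.pi * 5 * t)
    signal = modulation * np.sin(2 * np.pi * 100 * t)

    # Envelope detection via the analytic signal (Hilbert transform).
    envelope = np.abs(hilbert(signal))

    # Away from the edges, the envelope should track the modulation closely.
    core = slice(50, -50)
    max_err = np.max(np.abs(envelope[core] - modulation[core]))
    print(f"max envelope error: {max_err:.3f}")
    ```

    The proposed information-wave method is designed to succeed precisely where this baseline struggles, e.g. for weak defect signatures buried in noise.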

  11. Genetic-program-based data mining for hybrid decision-theoretic algorithms and theories

    NASA Astrophysics Data System (ADS)

    Smith, James F., III

    2005-03-01

    A genetic program (GP) based data mining (DM) procedure has been developed that automatically creates decision theoretic algorithms. A GP is an algorithm that uses the theory of evolution to automatically evolve other computer programs or mathematical expressions. The output of the GP is a computer program or mathematical expression that is optimal in the sense that it maximizes a fitness function. The decision theoretic algorithms created by the DM algorithm are typically designed for making real-time decisions about the behavior of systems. The database that is mined by the DM typically consists of many scenarios characterized by sensor output and labeled by experts as to the status of the scenario. The DM procedure will call a GP as a data mining function. The GP incorporates the database and expert's rules into its fitness function to evolve an optimal decision theoretic algorithm. A decision theoretic algorithm created through this process will be discussed as well as validation efforts showing the utility of the decision theoretic algorithm created by the DM process. GP based data mining to determine equations related to scientific theories and automatic simplification methods based on computer algebra will also be discussed.
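    The evolutionary loop behind GP-based data mining can be illustrated with a minimal mutation-only sketch; the primitive set, target data, and parameters below are invented for the example and are much simpler than a full GP with crossover.

    ```python
    import random
    import operator

    random.seed(0)

    # Tiny illustration of evolving an expression tree against a fitness
    # function, in the spirit of GP-based data mining. The target data
    # (y = x^2 + x) and primitive set are invented for the example.
    OPS = [(operator.add, '+'), (operator.mul, '*'), (operator.sub, '-')]
    XS = [x * 0.5 for x in range(-8, 9)]
    TARGET = [x * x + x for x in XS]

    def random_tree(depth=3):
        if depth == 0 or random.random() < 0.3:
            return 'x' if random.random() < 0.7 else random.uniform(-2, 2)
        return (random.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

    def evaluate(tree, x):
        if tree == 'x':
            return x
        if isinstance(tree, float):
            return tree
        op, left, right = tree
        return op[0](evaluate(left, x), evaluate(right, x))

    def fitness(tree):
        # Negative mean squared error: higher is fitter.
        return -sum((evaluate(tree, x) - t) ** 2 for x, t in zip(XS, TARGET)) / len(XS)

    def mutate(tree):
        if random.random() < 0.3 or not isinstance(tree, tuple):
            return random_tree(2)
        op, left, right = tree
        if random.random() < 0.5:
            return (op, mutate(left), right)
        return (op, left, mutate(right))

    # (mu + lambda)-style evolutionary loop with elitism and mutation only.
    population = [random_tree() for _ in range(30)]
    initial_best = max(fitness(t) for t in population)
    for _ in range(40):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]
        population = parents + [mutate(random.choice(parents)) for _ in range(20)]
    final_best = max(fitness(t) for t in population)
    print(f"best fitness: {initial_best:.3f} -> {final_best:.3f}")
    ```

    Because the elite parents survive each generation, the best fitness can only improve over the run.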

  12. Allport's Intergroup Contact Theory as a theoretical base for impacting student attitudes in interprofessional education.

    PubMed

    Bridges, Diane R; Tomkowiak, John

    2010-01-01

    Interprofessional education has been defined as "members or students of two or more professions associated with health or social care, engaged in learning with, from and about each other". Ideally, students trained using interprofessional education paradigms become interprofessional team members who gain respect for and improve their attitudes about each other, and ultimately improve patient outcomes. However, it has been stated that before interprofessional education can claim its importance and successes, its impact must be critically evaluated. What theory can explain the impact that interprofessional education seems to have on changing students' attitudes toward other professionals and positively affecting their performance as interprofessional healthcare team members? The authors of this paper suggest that conditions identified in Gordon Allport's Contact Theory may be used as a theoretical base in interprofessional education to positively impact attitudinal change of students towards working as interprofessional team members. For the purpose of this paper, equal status and common goals will be the two conditions highlighted as a theoretical base in interprofessional education. The premise to be explored in this paper is that utilizing a sound theoretical base in interprofessional education may positively impact students' attitudes towards working in interprofessional teams. PMID:20216998

  13. A feature extraction method based on information theory for fault diagnosis of reciprocating machinery.

    PubMed

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021

  14. Color prediction for print based on Kubelka-Munk theory and under ink penetration

    NASA Astrophysics Data System (ADS)

    Shi, Guoyun; Dong, Na; Zhang, Yixin

    2009-01-01

    In 1931, Kubelka and Munk introduced the two-flux Kubelka-Munk model based on radiative transfer theory [1], but the model is subject to many strict assumptions. In 1942, Saunderson introduced a correction for the multiple internal reflections that occur at the interface of a print, which increase the apparent density of the ink layer and degrade prediction accuracy. Dot gain, which includes both physical and optical gain, has always been a difficult problem in printing; the Kubelka-Munk model does not account for dot gain, especially optical gain. Many of the methods for calculating dot gain are based on the point spread function principle [5]. Recently, Yang Li corrected the scattering coefficient S and the absorption coefficient K in the Kubelka-Munk model based on statistical physics theory [2][3][4], which gives the model much wider applicability. Taking the ink layer, ink penetration layer, and paper layer into account separately, and considering multiple reflection and optical dot gain, this article builds a new halftone color prediction model.
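    The underlying Kubelka-Munk relation for an opaque layer is simple to state and invert; a minimal sketch using the standard formulas, without the Saunderson or dot-gain corrections discussed above:

    ```python
    import math

    # Classic Kubelka-Munk relation between the absorption/scattering ratio K/S
    # and the reflectance R_inf of an opaque layer; the starting point that the
    # corrections discussed above (Saunderson, dot gain) build on.
    def reflectance(k_over_s_ratio: float) -> float:
        """R_inf = 1 + K/S - sqrt((K/S)^2 + 2*K/S)."""
        q = k_over_s_ratio
        return 1.0 + q - math.sqrt(q * q + 2.0 * q)

    def k_over_s(r_inf: float) -> float:
        """Inverse relation: K/S = (1 - R_inf)^2 / (2 * R_inf)."""
        return (1.0 - r_inf) ** 2 / (2.0 * r_inf)

    r = reflectance(0.5)
    print(f"R_inf = {r:.4f}, recovered K/S = {k_over_s(r):.4f}")
    ```

    The two functions are exact inverses, which is easy to verify numerically for any positive K/S.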

  15. A simple laminate theory using the orthotropic viscoplasticity theory based on overstress. I - In-plane stress-strain relationships for metal matrix composites

    NASA Technical Reports Server (NTRS)

    Krempl, Erhard; Hong, Bor Zen

    1989-01-01

    A macromechanics analysis is presented for the in-plane, anisotropic time-dependent behavior of metal matrix laminates. The small deformation, orthotropic viscoplasticity theory based on overstress represents lamina behavior in a modified simple laminate theory. Material functions and constants can be identified in principle from experiments with laminae. Orthotropic invariants can be repositories for tension-compression asymmetry and for linear elasticity in one direction while the other directions behave in a viscoplastic manner. Computer programs are generated and tested for either unidirectional or symmetric laminates under in-plane loading. Correlations with the experimental results on metal matrix composites are presented.

  16. Superior coexistence: systematically regulating land subsidence based on set pair theory

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Gong, S.-L.

    2015-11-01

    Anthropogenic land subsidence is an environmental side effect of exploring and using natural resources in the process of economic development. The key points of the system for controlling land subsidence include cooperation and superior coexistence while the economy develops, exploring and using natural resources, and geological environmental safety. Using the theory and method of set pair analysis (SPA), this article anatomises the factors, effects, and transformation of land subsidence. Based on the principle of superior coexistence, this paper promotes a technical approach to the system for controlling land subsidence, in order to improve the prevention and control of geological hazards.
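    The connection degree at the core of set pair analysis can be sketched directly; the indicator counts below are hypothetical, and the discrepancy coefficient i is left at its neutral value of 0:

    ```python
    # Set pair analysis expresses the relation of two sets as a connection degree
    # mu = a + b*i + c*j, where a, b, c are the fractions of identical,
    # discrepant, and contrary features (a + b + c = 1), j = -1, and i in [-1, 1].
    def connection_degree(identical: int, discrepant: int, contrary: int,
                          i: float = 0.0, j: float = -1.0) -> float:
        total = identical + discrepant + contrary
        a, b, c = identical / total, discrepant / total, contrary / total
        assert abs(a + b + c - 1.0) < 1e-12
        return a + b * i + c * j

    # Hypothetical assessment: 6 indicators agree, 3 are uncertain, 1 conflicts.
    mu = connection_degree(6, 3, 1)
    print(f"connection degree: {mu:.2f}")
    ```

    A positive mu indicates that identity dominates contrariness; how i is resolved (e.g. by expert judgment) drives the transformation analysis described in the abstract.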

  17. Evaluation of a preschool nutrition education program based on the theory of multiple intelligences.

    PubMed

    Cason, K L

    2001-01-01

    This report describes the evaluation of a preschool nutrition education program based on the theory of multiple intelligences. Forty-six nutrition educators provided a series of 12 lessons to 6102 preschool-age children. The program was evaluated using a pretest/post-test design to assess differences in fruit and vegetable identification, healthy snack choices, willingness to taste foods, and eating behaviors. Subjects showed significant improvement in food identification and recognition, healthy snack identification, willingness to taste foods, and frequency of fruit, vegetable, meat, and dairy consumption. The evaluation indicates that the program was an effective approach for educating preschool children about nutrition. PMID:11953232

  18. A description of the mechanical behavior of composite solid propellants based on molecular theory

    NASA Technical Reports Server (NTRS)

    Landel, R. F.

    1976-01-01

    Both the investigation and the representation of the stress-strain response (including rupture) of gum and filled elastomers can be based on a simple functional statement. Internally consistent experiments are used to sort out the effects of time, temperature, strain and crosslink density on gum rubbers. All effects are readily correlated and shown to be essentially independent of the elastomer when considered in terms of non-dimensionalized stress, strain and time. A semiquantitative molecular theory is developed to explain this result. The introduction of fillers modifies the response, but, guided by the framework thus provided, their effects can be readily accounted for.

  19. Unique laminar-flow stability limit based on shallow-water theory

    USGS Publications Warehouse

    Chen, Cheng-lung

    1993-01-01

    Two approaches are generally taken in deriving the stability limit of the Froude number (Fs) for laminar sheet flow. The first approach uses the Orr-Sommerfeld equation, while the second uses the cross-section-averaged equations of continuity and motion. Because both approaches are based on shallow-water theory, the values of Fs obtained from them should be identical, yet in the literature they are not. This suggests that a defect exists in at least one of the two approaches. After examining the governing equations used in both approaches, one finds that the existing cross-section-averaged equation of motion is dependent on the frame of reference.

  20. An ISAR imaging algorithm for the space satellite based on empirical mode decomposition theory

    NASA Astrophysics Data System (ADS)

    Zhao, Tao; Dong, Chun-zhu

    2014-11-01

    Currently, high resolution imaging of space satellites is a popular topic in the field of radar technology. In contrast with regular targets, a satellite target moves along its trajectory while its solar panel substrate changes direction toward the sun to obtain energy. Aiming at this imaging problem, a signal separation and imaging approach based on empirical mode decomposition (EMD) theory is proposed; the approach separates the signals of the two parts of the satellite target, the main body and the solar panel substrate, and forms images of the target. Simulation experiments demonstrate the validity of the proposed method.

  1. The Experimental Research on E-Learning Instructional Design Model Based on Cognitive Flexibility Theory

    NASA Astrophysics Data System (ADS)

    Cao, Xianzhong; Wang, Feng; Zheng, Zhongmei

    The paper reports an educational experiment on an e-Learning instructional design model based on Cognitive Flexibility Theory, conducted to explore the feasibility and effectiveness of the model in promoting learning quality in ill-structured domains. The experiment compared two groups of students: one group learned through a system designed with the model, and the other learned by the traditional method. The results indicate that e-Learning designed through the model helps promote students' intrinsic motivation, learning quality in ill-structured domains, ability to solve ill-structured problems, and creative thinking ability.

  2. A classification framework of online learning activities: based on grounded theory

    NASA Astrophysics Data System (ADS)

    Zhan, Zehui

    2011-12-01

    The purpose of this paper is to set up a classification framework of online learning activities. Fifty-nine online learning activity cases were collected from seven disciplines. Open coding, axial coding, and selective coding were conducted according to Grounded Theory. After step-by-step validation, a classification framework consisting of six core categories (Argumentation, Resource Sharing, Collaboration, Inquiry, Evaluation, and Social Network) was set up. Further study is needed to gain more insight into each category and establish effective activity-based instruction models.

  3. Design optical antenna and fiber coupling system based on the vector theory of reflection and refraction.

    PubMed

    Jiang, Ping; Yang, Huajun; Mao, Shengqian

    2015-10-01

    A Cassegrain antenna system and an optical fiber coupling system consisting of a plano-concave lens and a plano-convex lens are designed based on the vector theory of reflection and refraction, so as to improve the transmission performance of the optical antenna and fiber coupling system. Three-dimensional ray tracing simulations are performed, and the results of the optical aberration calculations and the experimental tests show that the aberrations caused by on-axis defocusing, off-axis defocusing, and deflection of the receiving antenna can be well corrected by the optical fiber coupling system. PMID:26480125
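    The vector laws of reflection and refraction on which such ray tracing rests can be sketched as follows; this is the textbook form, not the paper's implementation:

    ```python
    import numpy as np

    # Vector forms of the laws of reflection and refraction used in 3D ray
    # tracing; d is the incident direction, n the unit surface normal pointing
    # against d, and eta = n1/n2 the refractive-index ratio.
    def reflect(d, n):
        d, n = np.asarray(d, float), np.asarray(n, float)
        return d - 2.0 * np.dot(d, n) * n

    def refract(d, n, eta):
        d, n = np.asarray(d, float), np.asarray(n, float)
        cos_i = -np.dot(d, n)
        sin2_t = eta ** 2 * (1.0 - cos_i ** 2)
        if sin2_t > 1.0:
            return None  # total internal reflection
        return eta * d + (eta * cos_i - np.sqrt(1.0 - sin2_t)) * n

    r = reflect([1.0, -1.0, 0.0], [0.0, 1.0, 0.0])
    print(r)  # a 45-degree ray bounces to [1, 1, 0]
    ```

    Tracing a ray through the Cassegrain mirrors and the two coupling lenses amounts to applying these two maps at each surface.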

  4. The Study of Relationship and Strategy Between New Energy and Economic Development Based on Decoupling Theory

    NASA Astrophysics Data System (ADS)

    Liu, Jun; Xu, Hui; Liu, Yaping; Xu, Yang

    With increasing pressure for energy conservation and emissions reduction, a new energy revolution in China is imminent. The implementation of electric energy substitution and cleaner alternatives is an important way to resolve the contradiction among economic growth, energy saving, and emission reduction. Based on decoupling theory, this article demonstrates that China is in the second stage, in which energy consumption and GDP grow together while energy consumption intensity falls. At the same time, the new energy revolution needs to be realized by increasing carbon productivity and the proportion of new energy.

  5. Kinetic theory based wave-particle splitting scheme for Euler equations

    NASA Astrophysics Data System (ADS)

    Rao, S. V. R.; Deshpande, S. M.

    1992-11-01

    A new upwind wave-particle splitting scheme is developed, based on the connection between the kinetic theory of gases and the Euler equations and using the concept of thermal velocity. The new upwind method is applied to the standard one-dimensional shock tube problem and to the problem of two-dimensional shock reflection from a flat plate. Results for the two-dimensional problem show that the new scheme is much less dissipative than the kinetic flux vector splitting scheme of Deshpande (1986) and Mandal (1989).

  6. Density functional theory based simulations of silicon nanowire field effect transistors

    NASA Astrophysics Data System (ADS)

    Shin, Mincheol; Jeong, Woo Jin; Lee, Jaehyun

    2016-04-01

    First-principles density functional theory (DFT) based, atomistic, self-consistent device simulations are performed for realistically sized Si nanowire field effect transistors (NW FETs) having tens of thousands of atoms. Through mode space transformation, DFT Hamiltonian and overlap matrices are reduced in size from a few thousands to around one hundred. Ultra-efficient quantum-mechanical transport calculations in the non-equilibrium Green's function formalism in a non-orthogonal basis are therefore made possible. The n-type and p-type Si NW FETs are simulated and found to exhibit similar device performance in the nanoscale regime.

  7. Constraints on Neutron Star Radii Based on Chiral Effective Field Theory Interactions

    SciTech Connect

    Hebeler, K.; Lattimer, J. M.; Pethick, C. J.; Schwenk, A.

    2010-10-15

    We show that microscopic calculations based on chiral effective field theory interactions constrain the properties of neutron-rich matter below nuclear densities to a much higher degree than is reflected in commonly used equations of state. Combined with observed neutron star masses, our results lead to a radius R = 9.7-13.9 km for a 1.4 M☉ star, where the theoretical range is due, in about equal amounts, to uncertainties in many-body forces and to the extrapolation to high densities.

  8. Development of new tip-loss corrections based on vortex theory and vortex methods

    NASA Astrophysics Data System (ADS)

    Branlard, Emmanuel; Gaunaa, Mac

    2014-12-01

    A new analytical formulation of the tip-loss factor is established based on helical vortex filament solutions. The derived tip-loss factor can be applied to wind turbines, propellers, or other rotary wings. Similar numerical formulations are used to assess the influence of wake expansion on tip-losses. Theodorsen's theory is successfully applied for the first time to assess the wake expansion behind a wind turbine. The tip-loss corrections obtained are compared with those of Prandtl and Glauert and implemented within a new Blade Element Momentum (BEM) code. Wake expansion is seen to reduce tip-losses and to have a greater influence than wake distortion.
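    Prandtl's tip-loss factor, the baseline correction mentioned above, is compact enough to sketch; the blade count and inflow angle below are arbitrary example values:

    ```python
    import math

    # Prandtl's tip-loss factor F = (2/pi) * acos(exp(-f)), with
    # f = (B/2) * (1 - r/R) / ((r/R) * sin(phi)), where B is the blade count,
    # r/R the radial position, and phi the local inflow angle.
    def prandtl_tip_loss(B: int, r_over_R: float, phi: float) -> float:
        f = (B / 2.0) * (1.0 - r_over_R) / (r_over_R * math.sin(phi))
        return (2.0 / math.pi) * math.acos(math.exp(-f))

    phi = math.radians(7.0)
    for r in (0.5, 0.9, 0.99):
        print(f"r/R = {r:4.2f}  F = {prandtl_tip_loss(3, r, phi):.3f}")
    ```

    F is close to 1 inboard and falls toward 0 at the tip, which is the loss behavior the new helical-filament formulation refines.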

  9. The Stability Analysis for an Extended Car Following Model Based on Control Theory

    NASA Astrophysics Data System (ADS)

    Ge, Hong-Xia; Meng, Xiang-Pei; Zhu, Ke-Qiang; Cheng, Rong-Jun

    2014-08-01

    A new method is proposed to study the stability of the car-following model considering traffic interruption probability. The stability condition for the extended car-following model is obtained by using the Lyapunov function and the condition for no traffic jam is also given based on the control theory. Numerical simulations are conducted to demonstrate and verify the analytical results. Moreover, numerical simulations show that the traffic interruption probability has an influence on driving behavior and confirm the effectiveness of the method on the stability of traffic flow.
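    A generic optimal-velocity car-following simulation illustrates the kind of model whose stability such analyses address; the OV function and parameters below are textbook-style choices, not the paper's interruption-probability model:

    ```python
    import math

    # Minimal optimal-velocity (OV) car-following model on a ring road.
    # In the linearly stable regime, a small perturbation of the uniform
    # flow decays and all cars relax to the same speed V(h*).
    N, L = 20, 200.0           # number of cars, ring length
    a = 2.0                    # driver sensitivity

    def V(h):                  # optimal velocity for headway h
        return 0.5 * (math.tanh(h - 2.0) + math.tanh(2.0))

    dt = 0.05
    x = [i * L / N + (0.5 if i == 0 else 0.0) for i in range(N)]  # perturb car 0
    v = [V(L / N)] * N
    for _ in range(4000):
        h = [(x[(i + 1) % N] - x[i]) % L for i in range(N)]
        acc = [a * (V(h[i]) - v[i]) for i in range(N)]
        v = [v[i] + acc[i] * dt for i in range(N)]
        x = [(x[i] + v[i] * dt) % L for i in range(N)]

    spread = max(v) - min(v)
    print(f"velocity spread after relaxation: {spread:.2e}")
    ```

    With these parameters the stability condition is comfortably satisfied, so the perturbation dies out; reducing the sensitivity a below the critical value would instead let the disturbance grow into a jam.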

  10. Risk evaluation of bogie system based on extension theory and entropy weight method.

    PubMed

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159
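    The entropy weight step of the method can be sketched on an invented decision matrix (rows are samples, columns are indicators; all entries positive):

    ```python
    import math

    # Entropy weight method on a hypothetical decision matrix: indicators
    # whose values vary more across samples carry more information and
    # therefore receive larger weights.
    X = [
        [0.6, 0.8, 0.3],
        [0.9, 0.4, 0.5],
        [0.7, 0.7, 0.9],
        [0.4, 0.6, 0.6],
    ]
    m, n = len(X), len(X[0])

    # Normalize each column to proportions p_ij.
    col_sums = [sum(row[j] for row in X) for j in range(n)]
    P = [[X[i][j] / col_sums[j] for j in range(n)] for i in range(m)]

    # Entropy of each indicator: e_j = -(1/ln m) * sum_i p_ij * ln p_ij.
    k = 1.0 / math.log(m)
    e = [-k * sum(P[i][j] * math.log(P[i][j]) for i in range(m)) for j in range(n)]

    # Weights from the degree of divergence d_j = 1 - e_j.
    d = [1.0 - ej for ej in e]
    w = [dj / sum(d) for dj in d]
    print("entropy weights:", [round(wj, 3) for wj in w])
    ```

    In the paper these weights are combined with the extension-theory correlation degrees to rank the risk state of each bogie sample.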

  11. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159

  12. Optimal control of ICU patient discharge: from theory to implementation.

    PubMed

    Mallor, Fermín; Azcárate, Cristina; Barado, Julio

    2015-09-01

    This paper deals with the management of scarce health care resources. We consider a control problem in which the objective is to minimize the rate of patient rejection due to service saturation. The scope of decisions is limited, in terms both of the amount of resources to be used, which are supposed to be fixed, and of the patient arrival pattern, which is assumed to be uncontrollable. This means that the only potential areas of control are speed or completeness of service. By means of queuing theory and optimization techniques, we provide a theoretical solution expressed in terms of service rates. In order to make this theoretical analysis useful for the effective control of the healthcare system, however, further steps in the analysis of the solution are required: physicians need flexible and medically-meaningful operative rules for shortening patient length of service to the degree needed to give the service rates dictated by the theoretical analysis. The main contribution of this paper is to discuss how the theoretical solutions can be transformed into effective management rules to guide doctors' decisions. The study examines three types of rules based on intuitive interpretations of the theoretical solution. Rules are evaluated through implementation in a simulation model. We compare the service rates provided by the different policies with those dictated by the theoretical solution. Probabilistic analysis is also included to support rule validity. An Intensive Care Unit is used to illustrate this control problem. The study focuses on the Markovian case before moving on to consider more realistic LoS distributions (Weibull, Lognormal and Phase-type distribution). PMID:25763761
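    The rejection-due-to-saturation objective is closely related to the classic Erlang loss (M/M/c/c) setting, in which arrivals that find all c beds busy are rejected; assuming that simplification, the blocking probability can be computed with a numerically stable recursion (the bed count and offered load below are hypothetical):

    ```python
    # Erlang-B blocking probability for an M/M/c/c loss system via the
    # standard recursion B(0) = 1, B(k) = a*B(k-1) / (k + a*B(k-1)),
    # where a = lambda/mu is the offered load.
    def erlang_b(c: int, a: float) -> float:
        b = 1.0
        for k in range(1, c + 1):
            b = a * b / (k + a * b)
        return b

    # Hypothetical ICU: 10 beds, 8.5 patients' worth of offered load.
    print(f"rejection probability: {erlang_b(10, 8.5):.3f}")
    ```

    Raising the service rate lowers the offered load a and hence the rejection probability, which is the lever the paper's management rules aim to operate on.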

  13. Einstein Critical-Slowing-Down is Siegel CyberWar Denial-of-Access Queuing/Pinning/ Jamming/Aikido Via Siegel DIGIT-Physics BEC ``Intersection''-BECOME-UNION Barabasi Network/GRAPH-Physics BEC: Strutt/Rayleigh-Siegel Percolation GLOBALITY-to-LOCALITY Phase-Transition Critical-Phenomenon

    NASA Astrophysics Data System (ADS)

    Buick, Otto; Falcon, Pat; Alexander, G.; Siegel, Edward Carl-Ludwig

    2013-03-01

    Einstein[Dover(03)] critical-slowing-down(CSD)[Pais, Subtle in The Lord; Life & Sci. of Albert Einstein(81)] is Siegel CyberWar denial-of-access(DOA) operations-research queuing theory/pinning/jamming/.../Read [Aikido, Aikibojitsu & Natural-Law(90)]/Aikido(!!!) phase-transition critical-phenomenon via Siegel DIGIT-Physics (Newcomb[Am.J.Math. 4,39(1881)]-{Planck[(1901)]-Einstein[(1905)])-Poincare[Calcul Probabilités(12)-p.313]-Weyl [Goett.Nachr.(14); Math.Ann.77,313 (16)]-{Bose[(24)-Einstein[(25)]-Fermi[(27)]-Dirac[(1927)]}-``Benford''[Proc.Am.Phil.Soc. 78,4,551 (38)]-Kac[Maths.Stat.-Reasoning(55)]-Raimi[Sci.Am. 221,109 (69)...]-Jech[preprint, PSU(95)]-Hill[Proc.AMS 123,3,887(95)]-Browne[NYT(8/98)]-Antonoff-Smith-Siegel[AMS Joint-Mtg.,S.-D.(02)] algebraic-inversion to yield ONLY BOSE-EINSTEIN QUANTUM-statistics (BEQS) with ZERO-digit Bose-Einstein CONDENSATION(BEC) ``INTERSECTION''-BECOME-UNION to Barabasi[PRL 876,5632(01); Rev.Mod.Phys.74,47(02)...] Network /Net/GRAPH(!!!)-physics BEC: Strutt/Rayleigh(1881)-Polya(21)-``Anderson''(58)-Siegel[J.Non-crystalline-Sol.40,453(80)

  14. A Monte Carlo exploration of threefold base geometries for 4d F-theory vacua

    NASA Astrophysics Data System (ADS)

    Taylor, Washington; Wang, Yi-Nan

    2016-01-01

    We use Monte Carlo methods to explore the set of toric threefold bases that support elliptic Calabi-Yau fourfolds for F-theory compactifications to four dimensions, and study the distribution of geometrically non-Higgsable gauge groups, matter, and quiver structure. We estimate the number of distinct threefold bases in the connected set studied to be 1048. The distribution of bases peaks around h 1,1 82. All bases encountered after "thermalization" have some geometric non-Higgsable structure. We find that the number of non-Higgsable gauge group factors grows roughly linearly in h 1,1 of the threefold base. Typical bases have 6 isolated gauge factors as well as several larger connected clusters of gauge factors with jointly charged matter. Approximately 76% of the bases sampled contain connected two-factor gauge group products of the form SU(3) SU(2), which may act as the non-Abelian part of the standard model gauge group. SU(3) SU(2) is the third most common connected two-factor product group, following SU(2) SU(2) and G 2 SU(2), which arise more frequently.

  15. Philosophy of the Spike: Rate-Based vs. Spike-Based Theories of the Brain

    PubMed Central

    Brette, Romain

    2015-01-01

    Does the brain use a firing rate code or a spike timing code? Considering this controversial question from an epistemological perspective, I argue that progress has been hampered by its problematic phrasing. It takes the perspective of an external observer looking at whether those two observables vary with stimuli, and thereby misses the relevant question: which one has a causal role in neural activity? When rephrased in a more meaningful way, the rate-based view appears as an ad hoc methodological postulate, one that is practical but with virtually no empirical or theoretical support. PMID:26617496

  16. A theory-based evaluation of a community-based funding scheme in a disadvantaged suburban city area.

    PubMed

    Hickey, Gráinne; McGilloway, Sinead; O'Brien, Morgan; Leckey, Yvonne; Devlin, Maurice

    2015-10-01

    Community-driven development (CDD) initiatives frequently involve funding schemes which are aimed at channelling financial investment into local need and fostering community participation and engagement. This exploratory study examined, through a program theory approach, the design and implementation of a small-scale, community-based fund in Ireland. Observations, documentary analysis, interviews and group discussions with 19 participants were utilized to develop a detailed understanding of the program mechanisms, activities and processes, as well as the experiences of key stakeholders engaged with the funding scheme and its implementation. The findings showed that there were positive perceptions of the scheme and its function within the community. Overall, the availability of funding was perceived by key stakeholders as being beneficial. However, there were concerns over the accessibility of the scheme for more marginalized members of the community, as well as dissatisfaction with the openness and transparency surrounding funding eligibility. Lessons for the implementation of small-scale CDD funds are elaborated and the utility of program theory approaches for evaluators and planners working with programs that fund community-based initiatives is outlined. PMID:25933408

  17. A curvature-dependent interfacial energy-based interface stress theory and its applications to nano-structured materials: (I) General theory

    NASA Astrophysics Data System (ADS)

    Gao, Xiang; Huang, Zhuping; Qu, Jianmin; Fang, Daining

    2014-05-01

    Experimental observations have shown the size-dependent residual surface stresses on spherical nanoparticles and their influence on the effective modulus of heterogeneous nanostructures. Based on these experimental findings, this paper proposes a new interface stress theory that considers the curvature effect on the interfacial energy. To investigate this curvature-dependent interfacial energy, we use the Green elasticity theory to describe the nonlinear constitutive relation of the interface at finite deformation, thus explicitly demonstrating the curvature-dependent nature of the interface stress and bending moment. By introducing a fictitious stress-free configuration, we then propose a new energy functional for heterogeneous hyperelastic solids with interfaces. For the first time, both the Lagrangian and Eulerian descriptions of the generalized Young-Laplace equation, which describes the intrinsic flexural resistance of the interface, are derived from the newly developed energy functional. This new interface stress theory is then used to investigate the residual elastic field in a heterogeneous hyperelastic solid containing interfaces. The present theory differs from the existing theories in that it takes fully into account both the curvature-dependence of the interfacial energy and the interfacial energy-induced residual elastic field in the bulk solid. Furthermore, the fundamental equations of the interface are given in Lagrangian description, which are preferable when considering the effects of residual interface stress, residual interface bending moment and interface elasticity. Finally, two examples are presented to shed light on the significance of this new interface stress theory. A more detailed analysis and applications of the new theory will be presented in Part (II) of this paper.
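    For orientation, the classical, curvature-independent generalized Young-Laplace relation of interface stress theory (the standard Gurtin-Murdoch form, not the paper's Lagrangian statement) can be sketched as follows; the paper extends this by adding an interface bending moment arising from curvature-dependent interfacial energy:

```latex
% Traction jump across an interface carrying interface stress \sigma_s
% (Gurtin-Murdoch form; n is the interface normal):
\llbracket \boldsymbol{\sigma} \rrbracket \, \mathbf{n}
    = -\nabla_{\!s} \cdot \boldsymbol{\sigma}_s ,
\qquad
\mathbf{n} \cdot \llbracket \boldsymbol{\sigma} \rrbracket \, \mathbf{n}
    = -\boldsymbol{\sigma}_s : \boldsymbol{\kappa}
% where \kappa is the curvature tensor of the interface.
```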

  18. Quorum-Sensing Synchronization of Synthetic Toggle Switches: A Design Based on Monotone Dynamical Systems Theory

    PubMed Central

    Nikolaev, Evgeni V.

    2016-01-01

    Synthetic constructs in biotechnology, biocomputing, and modern gene therapy interventions are often based on plasmids or transfected circuits which implement some form of “on-off” switch. For example, the expression of a protein used for therapeutic purposes might be triggered by the recognition of a specific combination of inducers (e.g., antigens), and memory of this event should be maintained across a cell population until a specific stimulus commands a coordinated shut-off. The robustness of such a design is hampered by molecular (“intrinsic”) or environmental (“extrinsic”) noise, which may lead to spontaneous changes of state in a subset of the population and is reflected in the bimodality of protein expression, as measured for example using flow cytometry. In this context, a “majority-vote” correction circuit, which brings deviant cells back into the required state, is highly desirable, and quorum-sensing has been suggested as a way for cells to broadcast their states to the population as a whole so as to facilitate consensus. In this paper, we propose what we believe is the first such design with mathematically guaranteed properties of stability and auto-correction under certain conditions. Our approach is guided by concepts and theory from the field of “monotone” dynamical systems developed by M. Hirsch, H. Smith, and others. We benchmark our design by comparing it to an existing design which has been the subject of experimental and theoretical studies, illustrating its superiority in stability and self-correction of synchronization errors. Our stability analysis, based on dynamical systems theory, guarantees global convergence to steady states, ruling out unpredictable (“chaotic”) behaviors and even sustained oscillations in the limit of convergence. These results hold regardless of parameter values and are based only on the wiring diagram. The theory is complemented by extensive computational bifurcation analysis, performed for a biochemically detailed and biologically relevant model that we developed. Another novel feature of our approach is that our theorems on exponential stability of steady states for homogeneous or mixed populations are valid independently of the number N of cells in the population, which is usually very large (N ≫ 1) and unknown. We prove that the exponential stability depends only on the relative proportions of each type of state. While monotone systems theory has been used previously for systems biology analysis, the current work illustrates its power for synthetic biology design, and thus has wider significance well beyond the application to the important problem of coordination of toggle switches. PMID:27128344

  19. Quorum-Sensing Synchronization of Synthetic Toggle Switches: A Design Based on Monotone Dynamical Systems Theory.

    PubMed

    Nikolaev, Evgeni V; Sontag, Eduardo D

    2016-04-01

    Synthetic constructs in biotechnology, biocomputing, and modern gene therapy interventions are often based on plasmids or transfected circuits which implement some form of "on-off" switch. For example, the expression of a protein used for therapeutic purposes might be triggered by the recognition of a specific combination of inducers (e.g., antigens), and memory of this event should be maintained across a cell population until a specific stimulus commands a coordinated shut-off. The robustness of such a design is hampered by molecular ("intrinsic") or environmental ("extrinsic") noise, which may lead to spontaneous changes of state in a subset of the population and is reflected in the bimodality of protein expression, as measured for example using flow cytometry. In this context, a "majority-vote" correction circuit, which brings deviant cells back into the required state, is highly desirable, and quorum-sensing has been suggested as a way for cells to broadcast their states to the population as a whole so as to facilitate consensus. In this paper, we propose what we believe is the first such design with mathematically guaranteed properties of stability and auto-correction under certain conditions. Our approach is guided by concepts and theory from the field of "monotone" dynamical systems developed by M. Hirsch, H. Smith, and others. We benchmark our design by comparing it to an existing design which has been the subject of experimental and theoretical studies, illustrating its superiority in stability and self-correction of synchronization errors. Our stability analysis, based on dynamical systems theory, guarantees global convergence to steady states, ruling out unpredictable ("chaotic") behaviors and even sustained oscillations in the limit of convergence. These results hold regardless of parameter values and are based only on the wiring diagram. The theory is complemented by extensive computational bifurcation analysis, performed for a biochemically detailed and biologically relevant model that we developed. Another novel feature of our approach is that our theorems on exponential stability of steady states for homogeneous or mixed populations are valid independently of the number N of cells in the population, which is usually very large (N ≫ 1) and unknown. We prove that the exponential stability depends only on the relative proportions of each type of state. While monotone systems theory has been used previously for systems biology analysis, the current work illustrates its power for synthetic biology design, and thus has wider significance well beyond the application to the important problem of coordination of toggle switches. PMID:27128344
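    The toggle switches being synchronized are variants of the classic two-repressor design. As a minimal illustration of the bistability that the quorum-sensing circuit must coordinate, here is the standard Gardner-style toggle model with illustrative parameters, not the paper's biochemically detailed model:

```python
# Minimal sketch of a single two-repressor toggle switch (Gardner-style),
# integrated with explicit Euler. Parameters are illustrative only.

def simulate_toggle(u0, v0, alpha=10.0, beta=2.0, dt=0.01, steps=5000):
    """Integrate du/dt = alpha/(1+v^beta) - u, dv/dt = alpha/(1+u^beta) - v."""
    u, v = u0, v0
    for _ in range(steps):
        du = alpha / (1.0 + v**beta) - u
        dv = alpha / (1.0 + u**beta) - v
        u += dt * du
        v += dt * dv
    return u, v

# Different initial conditions settle into opposite stable states,
# which is exactly the memory that noise can corrupt and that a
# majority-vote circuit must restore.
high_u, low_v = simulate_toggle(2.0, 0.0)
low_u, high_v = simulate_toggle(0.0, 2.0)
```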

  20. A general theory of evolution based on energy efficiency: its implications for diseases.

    PubMed

    Yun, Anthony J; Lee, Patrick Y; Doux, John D; Conley, Buford R

    2006-01-01

    We propose a general theory of evolution based on energy efficiency. Life represents an emergent property of energy. The earth receives energy from cosmic sources such as the sun. Biologic life can be characterized by the conversion of available energy into complex systems. Direct energy converters such as photosynthetic microorganisms and plants transform light energy into high-energy phosphate bonds that fuel biochemical work. Indirect converters such as herbivores and carnivores predominantly feed off the food chain supplied by these direct converters. Improving energy efficiency confers competitive advantage in the contest among organisms for energy. We introduce a term, return on energy (ROE), as a measure of energy efficiency. We define ROE as a ratio of the amount of energy acquired by a system to the amount of energy consumed to generate that gain. Life-death cycling represents a tactic to sample the environment for innovations that allow increases in ROE to develop over generations rather than an individual lifespan. However, the variation-selection stratagem of Darwinian evolution may define a particular tactic rather than an overarching biological paradigm. A theory of evolution based on competition for energy and driven by improvements in ROE both encompasses prior notions of evolution and portends post-Darwinian mechanisms. Such processes may involve the exchange of non-genetic traits that improve ROE, as exemplified by cognitive adaptations or memes. Under these circumstances, indefinite persistence may become favored over life-death cycling, as increases in ROE may then occur more efficiently within a single lifespan rather than over multiple generations. The key to this transition may involve novel methods to address the promotion of health and cognitive plasticity. We describe the implications of this theory for human diseases. PMID:16122878
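    The ROE ratio defined in the abstract is straightforward to compute; a minimal sketch with made-up energy figures:

```python
def return_on_energy(energy_acquired, energy_consumed):
    """ROE as defined above: energy gained per unit of energy
    spent acquiring it."""
    return energy_acquired / energy_consumed

# Hypothetical example: an organism spends 10 units of energy foraging
# and obtains 35 units of food energy.
roe = return_on_energy(35.0, 10.0)  # -> 3.5
```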

  1. Theory of chemical kinetics and charge transfer based on nonequilibrium thermodynamics.

    PubMed

    Bazant, Martin Z

    2013-05-21

    Advances in the fields of catalysis and electrochemical energy conversion often involve nanoparticles, which can have kinetics surprisingly different from the bulk material. Classical theories of chemical kinetics assume independent reactions in dilute solutions, whose rates are determined by mean concentrations. In condensed matter, strong interactions alter chemical activities and create variations that can dramatically affect the reaction rate. The extreme case is that of a reaction coupled to a phase transformation, whose kinetics must depend not only on the order parameter but also on its gradients at phase boundaries. Reaction-driven phase transformations are common in electrochemistry, when charge transfer is accompanied by ion intercalation or deposition in a solid phase. Examples abound in Li-ion, metal-air, and lead-acid batteries, as well as metal electrodeposition-dissolution. Despite complex thermodynamics, however, the standard kinetic model is the Butler-Volmer equation, based on a dilute solution approximation. The Marcus theory of charge transfer likewise considers isolated reactants and neglects elastic stress, configurational entropy, and other nonidealities in condensed phases. The limitations of existing theories recently became apparent for the Li-ion battery material LixFePO4 (LFP). It has a strong tendency to separate into Li-rich and Li-poor solid phases, which scientists believe limits its performance. Chemists first modeled phase separation in LFP as an isotropic "shrinking core" within each particle, but experiments later revealed striped phase boundaries on the active crystal facet. This raised the question: What is the reaction rate at a surface undergoing a phase transformation? Meanwhile, dramatic rate enhancement was attained with LFP nanoparticles, and classical battery models could not predict the roles of phase separation and surface modification. In this Account, I present a general theory of chemical kinetics, developed over the past 7 years, which is capable of answering these questions. The reaction rate is a nonlinear function of the thermodynamic driving force, the free energy of reaction, expressed in terms of variational chemical potentials. The theory unifies and extends the Cahn-Hilliard and Allen-Cahn equations through a master equation for nonequilibrium chemical thermodynamics. For electrochemistry, I have also generalized both Marcus and Butler-Volmer kinetics for concentrated solutions and ionic solids. This new theory provides a quantitative description of LFP phase behavior. Concentration gradients and elastic coherency strain enhance the intercalation rate. At low currents, the charge-transfer rate is focused on exposed phase boundaries, which propagate as "intercalation waves", nucleated by surface wetting. Unexpectedly, homogeneous reactions are favored above a critical current and below a critical size, which helps to explain the rate capability of LFP nanoparticles. Contrary to other mechanisms, elevated temperatures and currents may enhance battery performance and lifetime by suppressing phase separation. The theory has also been extended to porous electrodes and could be used for battery engineering with multiphase active materials. More broadly, the theory describes nonequilibrium chemical systems at mesoscopic length and time scales, beyond the reach of molecular simulations and bulk continuum models. The reaction rate is consistently defined for inhomogeneous, nonequilibrium states, for example, with phase separation, large electric fields, or mechanical stresses. This research is also potentially applicable to fluid extraction from nanoporous solids, pattern formation in electrophoretic deposition, and electrochemical dynamics in biological cells. PMID:23520980
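    For reference, the classical Butler-Volmer equation named above, the dilute-solution kinetic model that this Account generalizes, relates the current I to the overpotential eta:

```latex
I = I_0 \left[ \exp\!\left( \frac{\alpha_a F \eta}{RT} \right)
             - \exp\!\left( -\frac{\alpha_c F \eta}{RT} \right) \right]
```

    Here I_0 is the exchange current, alpha_a and alpha_c are the anodic and cathodic transfer coefficients, F the Faraday constant, R the gas constant, and T the temperature. In the generalized theory, the driving force is instead expressed through variational chemical potentials.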

  2. Statistical synthesis of contextual knowledge to increase the effectiveness of theory-based behaviour change interventions.

    PubMed

    Hanbury, Andria; Thompson, Carl; Mannion, Russell

    2011-07-01

    Tailored implementation strategies targeting health professionals' adoption of evidence-based recommendations are currently being developed. Research has focused on how to select an appropriate theoretical base, how to use that theoretical base to explore the local context, and how to translate theoretical constructs associated with the key factors found to influence innovation adoption into feasible and tailored implementation strategies. The reasons why an intervention is thought not to have worked are often cited as being: inappropriate choice of theoretical base; unsystematic development of the implementation strategies; and a poor evidence base to guide the process. One area of implementation research that is commonly overlooked is how to synthesize the data collected in a local context in order to identify what factors to target with the implementation strategies. This is suggested to be a critical process in the development of a theory-based intervention. The potential of multilevel modelling techniques to synthesize data collected at different hierarchical levels, for example, individual attitudes and team level variables, is discussed. Future research is needed to explore further the potential of multilevel modelling for synthesizing contextual data in implementation studies, as well as techniques for synthesizing qualitative and quantitative data. PMID:21543383

  3. Dissemination of a theory-based online bone health program: Two intervention approaches.

    PubMed

    Nahm, Eun-Shim; Resnick, Barbara; Bellantoni, Michele; Zhu, Shijun; Brown, Clayton; Brennan, Patricia F; Charters, Kathleen; Brown, Jeanine; Rietschel, Matthew; Pinna, Joanne; An, Minjeong; Park, Bu Kyung; Plummer, Lisa

    2015-06-01

    With the increasing nationwide emphasis on eHealth, there has been a rapid growth in the use of the Internet to deliver health promotion interventions. Although there has been a great deal of research in this field, little information is available regarding the methodologies to develop and implement effective online interventions. This article describes two social cognitive theory-based online health behavior interventions used in a large-scale dissemination study (N = 866), their implementation processes, and the lessons learned during the implementation processes. The two interventions were a short-term (8-week) intensive online Bone Power program and a longer term (12-month) Bone Power Plus program, including the Bone Power program followed by a 10-month online booster intervention (biweekly eHealth newsletters). This study used a small-group approach (32 intervention groups), and to effectively manage those groups, an eLearning management program was used as an upper layer of the Web intervention. Both interventions were implemented successfully with high retention rates (80.7% at 18 months). The theory-based approaches and the online infrastructure used in this study showed a promising potential as an effective platform for online behavior studies. Further replication studies with different samples and settings are needed to validate the utility of this intervention structure. PMID:26021668

  4. Adapting evidence-based interventions using a common theory, practices, and principles.

    PubMed

    Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D

    2014-01-01

    Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed. PMID:24079747

  5. Looking to the future of new media in health marketing: deriving propositions based on traditional theories.

    PubMed

    Della, Lindsay J; Eroglu, Dogan; Bernhardt, Jay M; Edgerton, Erin; Nall, Janice

    2008-01-01

    Market trend data show that the media marketplace continues to rapidly evolve. Recent research shows that substantial portions of the U.S. media population are "new media" users. Today, more than ever before, media consumers are exposed to multiple media at the same point in time, encouraged to participate in media content generation, and challenged to learn, access, and use the new media that are continually entering the market. These media trends have strong implications for how consumers of health information access, process, and retain health-related knowledge. In this article we review traditional information processing models and theories of interpersonal and mass media access and consumption. We make several theory-based propositions for how traditional information processing and media consumption concepts will function as new media usage continues to increase. These propositions are supported by new media usage data from the Centers for Disease Control and Prevention's entry into the new media market (e.g., podcasting, virtual events, blogging, and webinars). Based on these propositions, we conclude by presenting both opportunities and challenges that public health communicators and marketers will face in the future. PMID:18935883

  6. Coding theory based models for protein translation initiation in prokaryotic organisms.

    SciTech Connect

    May, Elebeoba Eni; Bitzer, Donald L. (North Carolina State University, Raleigh, NC); Rosnick, David I. (North Carolina State University, Raleigh, NC); Vouk, Mladen A.

    2003-03-01

    Our research explores the feasibility of using communication theory, error control (EC) coding theory specifically, for quantitatively modeling the protein translation initiation mechanism. The messenger RNA (mRNA) of Escherichia coli K-12 is modeled as a noisy (errored) encoded signal and the ribosome as a minimum Hamming distance decoder, where the 16S ribosomal RNA (rRNA) serves as a template for generating a set of valid codewords (the codebook). We tested the E. coli based coding models on 5' untranslated leader sequences of prokaryotic organisms of varying taxonomical relation to E. coli, including Salmonella typhimurium LT2, Bacillus subtilis, and Staphylococcus aureus Mu50. The model identified regions on the 5' untranslated leader where the minimum Hamming distance values of translated mRNA sub-sequences and non-translated genomic sequences differ the most. These regions correspond to the Shine-Dalgarno domain and the non-random domain. Applying the EC coding-based models to B. subtilis and S. aureus Mu50 yielded results similar to those for E. coli K-12. Contrary to our expectations, the behavior of S. typhimurium LT2, the species most closely related taxonomically to E. coli, resembled that of the non-translated sequence group.
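    The decoding step described above reduces to computing the minimum Hamming distance between sliding windows of the mRNA and a codebook. A minimal sketch with toy codewords (in the actual model the codebook is generated from the 16S rRNA template, not hand-written):

```python
def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

def min_hamming_distance(window, codebook):
    """Distance from an mRNA subsequence to the nearest valid codeword."""
    return min(hamming(window, cw) for cw in codebook)

def scan(sequence, codebook, k):
    """Slide a k-base window along the sequence, recording the minimum
    Hamming distance to the codebook at each position."""
    return [min_hamming_distance(sequence[i:i + k], codebook)
            for i in range(len(sequence) - k + 1)]

# Toy codebook loosely evocative of Shine-Dalgarno-like motifs:
codebook = ["AGGAGG", "GGAGGT"]
profile = scan("TTAGGAGGTTAA", codebook, 6)
# Low-distance positions mark regions that "decode" well, analogous to
# the translated-sequence signal the model looks for.
```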

  7. Massive Yang-Mills theory based on the nonlinearly realized gauge group

    SciTech Connect

    Bettinelli, D.; Ferrari, R.; Quadri, A.

    2008-02-15

    We propose a subtraction scheme for a massive Yang-Mills theory realized via a nonlinear representation of the gauge group [here SU(2)]. It is based on the subtraction of the poles in D-4 of the amplitudes, in dimensional regularization, after a suitable normalization has been performed. Perturbation theory is in the number of loops, and the procedure is stable under iterative subtraction of the poles. The unphysical Goldstone bosons, the Faddeev-Popov ghosts, and the unphysical mode of the gauge field are expected to cancel out in the unitarity equation. The spontaneous symmetry breaking parameter is not a physical variable. We use the tools already tested in the nonlinear sigma model: hierarchy in the number of Goldstone boson legs and weak-power-counting property (finite number of independent divergent amplitudes at each order). It is intriguing that the model is naturally based on the symmetry SU(2){sub L} local x SU(2){sub R} global. By construction the physical amplitudes depend on the mass and on the self-coupling constant of the gauge particle and moreover on the scale parameter of the radiative corrections. The Feynman rules are in the Landau gauge.

  8. Improving breast cancer control among Latinas: evaluation of a theory-based educational program.

    PubMed

    Mishra, S I; Chavez, L R; Magaña, J R; Nava, P; Burciaga Valdez, R; Hubbell, F A

    1998-10-01

    The study evaluated a theory-based breast cancer control program specially developed for less acculturated Latinas. The authors used a quasi-experimental design with random assignment of Latinas into experimental (n = 51) or control (n = 37) groups that completed one pretest and two posttest surveys. The experimental group received the educational program, which was based on Bandura's self-efficacy theory and Freire's empowerment pedagogy. Outcome measures included knowledge, perceived self-efficacy, attitudes, breast self-examination (BSE) skills, and mammogram use. At posttest 1, controlling for pretest scores, the experimental group was significantly more likely than the control group to have more medically recognized knowledge (sum of square [SS] = 17.0, F = 6.58, p < .01), have less medically recognized knowledge (SS = 128.8, F = 39.24, p < .001), greater sense of perceived self-efficacy (SS = 316.5, F = 9.63, p < .01), and greater adeptness in the conduct of BSE (SS = 234.8, F = 153.33, p < .001). Cancer control programs designed for less acculturated women should use informal and interactive educational methods that incorporate skill-enhancing and empowering techniques. PMID:9768384

  9. Credibility theory based dynamic control bound optimization for reservoir flood limited water level

    NASA Astrophysics Data System (ADS)

    Jiang, Zhiqiang; Sun, Ping; Ji, Changming; Zhou, Jianzhong

    2015-10-01

    The dynamic control operation of the reservoir flood limited water level (FLWL) can reconcile the competing demands of reservoir flood control and beneficial operation, and it is an important measure for ensuring flood control security while realizing flood utilization. The dynamic control bound of the FLWL is a fundamental element for implementing reservoir dynamic control operation. In order to optimize the dynamic control bound of the FLWL while accounting for flood forecasting error, this paper treated the forecasting error as a fuzzy variable and described it with credibility theory, which has emerged in recent years. By combining this with a flood forecasting error quantification model, a credibility-based fuzzy chance constrained model for optimizing the dynamic control bound was proposed, and fuzzy simulation technology was used to solve the model. The FENGTAN reservoir in China was selected as a case study, and the results show that, compared with the original operation water level, the initial operation water level (IOWL) of the FENGTAN reservoir can be raised by 4 m, 2 m and 5.5 m, respectively, in the three division stages of the flood season, without increasing flood control risk. In addition, the rationality and feasibility of the proposed forecasting error quantification model and the credibility-based dynamic control bound optimization model are verified by calculations based on extreme risk theory.
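    In credibility theory, the credibility of a fuzzy event is the average of its possibility and necessity. A minimal sketch for a forecasting error modeled as a triangular fuzzy variable (a, b, c); the parameters below are illustrative, not the paper's fitted error model:

```python
def credibility_leq(x, a, b, c):
    """Cr{xi <= x} for a triangular fuzzy variable (a, b, c), a <= b <= c,
    computed as the average of possibility and necessity."""
    # Possibility of {xi <= x}
    if x <= a:
        pos = 0.0
    elif x < b:
        pos = (x - a) / (b - a)
    else:
        pos = 1.0
    # Necessity of {xi <= x} = 1 - possibility of {xi > x}
    if x <= b:
        nec = 0.0
    elif x < c:
        nec = (x - b) / (c - b)
    else:
        nec = 1.0
    return 0.5 * (pos + nec)

# A chance constraint "Cr{error <= tolerance} >= 0.8" can then be checked
# pointwise; fuzzy simulation generalizes this to functions of several
# fuzzy variables.
cr = credibility_leq(1.5, 0.0, 1.0, 2.0)  # -> 0.75
```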

  10. A Theory-Based Approach to Teaching Young Children about Health: A Recipe for Understanding

    ERIC Educational Resources Information Center

    Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley

    2011-01-01

    The theory-theory account of conceptual development posits that children's concepts are integrated into theories. Concept-learning studies have documented the central role that theories play in children's learning of experimenter-defined categories but have yet to extensively examine complex, real-world concepts, such as health. The present study…

  11. Combinatorial density functional theory-based screening of surface alloys for the oxygen reduction reaction.

    SciTech Connect

    Greeley, J.; Norskov, J.; Center for Nanoscale Materials; Technical Univ. of Denmark

    2009-03-26

    A density functional theory (DFT)-based combinatorial search for improved oxygen reduction reaction (ORR) catalysts is presented. A descriptor-based approach to estimate the ORR activity of binary surface alloys, wherein alloying occurs only in the surface layer, is described, and rigorous, potential-dependent computational tests of the stability of these alloys in aqueous, acidic environments are presented. These activity and stability criteria are applied to a database of DFT calculations on nearly 750 binary transition metal surface alloys; of these, many are predicted to be active for the ORR but, with few exceptions, they are found to be thermodynamically unstable in the acidic environments typical of low-temperature fuel cells. The results suggest that, absent other thermodynamic or kinetic mechanisms to stabilize the alloys, surface alloys are unlikely to serve as useful ORR catalysts over extended periods of operation.
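    The screening logic described above, an activity window on a binding-energy descriptor combined with a potential-dependent stability cut, can be sketched as follows. Alloy names and numbers are hypothetical placeholders, not values from the study's DFT database:

```python
# Hypothetical descriptor database: oxygen binding energy relative to Pt
# (dE_O, eV) as the activity descriptor, and a dissolution potential (V)
# as the stability criterion. All values are made up for illustration.
candidates = {
    "M3N-skin-1": {"dE_O": 0.2, "diss_potential": 0.8},
    "M3N-skin-2": {"dE_O": 0.1, "diss_potential": 0.4},
    "M3N-skin-3": {"dE_O": 0.9, "diss_potential": 1.1},
}

def screen(db, activity_window=(0.0, 0.4), min_diss_potential=0.9):
    """Keep alloys whose descriptor lies in the activity window AND whose
    dissolution potential exceeds the operating potential."""
    return [name for name, d in db.items()
            if activity_window[0] <= d["dE_O"] <= activity_window[1]
            and d["diss_potential"] >= min_diss_potential]

survivors = screen(candidates)
# With these toy numbers nothing survives both cuts, mirroring the
# abstract's finding: many active alloys, few stable ones.
```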

  12. Buckling Analysis for Stiffened Anisotropic Circular Cylinders Based on Sanders Nonlinear Shell Theory

    NASA Technical Reports Server (NTRS)

    Nemeth, Michael P.

    2014-01-01

    Nonlinear and bifurcation buckling equations for elastic, stiffened, geometrically perfect, right-circular cylindrical, anisotropic shells subjected to combined loads are presented that are based on Sanders' shell theory. Based on these equations, a three-parameter approximate Rayleigh-Ritz solution and a classical solution to the buckling problem are presented for cylinders with simply supported edges. Extensive comparisons of results obtained from these solutions with published results are also presented for a wide range of cylinder constructions. These comparisons include laminated-composite cylinders with a wide variety of shell-wall orthotropies and anisotropies. Numerous results are also given that show the discrepancies between the results obtained by using Donnell's equations and variants of Sanders' equations. For some cases, nondimensional parameters are identified and "master" curves are presented that facilitate the concise representation of results.

  13. Multi-source remote sensing image fusion classification based on DS evidence theory

    NASA Astrophysics Data System (ADS)

    Liu, Chunping; Ma, Xiaohu; Cui, Zhiming

    2007-11-01

    A new adaptive remote sensing image fusion classification based on the Dempster-Shafer theory of evidence is presented. The method uses a limited number of prototypes as items of evidence, which are automatically generated by a modified Fuzzy Kohonen Clustering Network (FKCN). The class fuzzy membership of each prototype is also determined using a reference pattern set. For each input vector, a basic probability assignment (BPA) function is computed based on the distances to these prototypes and on the degree of membership of each prototype to each class. Lastly, this evidence is combined using Dempster's rule. The proposed method can be implemented in a modified FKCN with a specific architecture consisting of an input layer, a prototype layer, a BPA layer, a combination and output layer, and a decision layer. The experimental results show the excellent performance of the classification as compared to existing FKCN and basic DS fusion techniques.
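    The final combination step uses Dempster's rule. A minimal sketch combining two basic probability assignments over two classes (toy masses, not the FKCN-derived BPAs):

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments, given as dicts mapping
    frozenset hypotheses to mass, using Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for h1, v1 in m1.items():
        for h2, v2 in m2.items():
            inter = h1 & h2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2  # mass assigned to the empty set
    k = 1.0 - conflict  # normalization factor
    return {h: v / k for h, v in combined.items()}

A, B = frozenset("A"), frozenset("B")
m1 = {A: 0.6, B: 0.4}  # evidence from one source (e.g., one prototype set)
m2 = {A: 0.7, B: 0.3}  # evidence from another source
fused = dempster_combine(m1, m2)
# Agreeing evidence for A reinforces it: fused mass on A exceeds both inputs.
```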

  14. Study on the salary system for IT enterprise based on double factor motivation theory

    NASA Astrophysics Data System (ADS)

    Zhuang, Chen; Qian, Wu

    2005-12-01

    To address the fact that IT enterprises' salary and compensation systems often fail to motivate staff efficiently, a salary system based on Herzberg's two-factor motivation theory and on enterprise characteristics is presented. The salary system includes a salary model, an assessment model and a performance model. The system ties a cash incentive to each staff member's performance and emphasizes that salary alone is not a motivating factor; health care, for example, may also play a positive motivational role. Based on this system, a scientific and reasonable salary and compensation management system was established and applied in an IT enterprise, where it was found to promote the enterprise's overall performance and competitive power.

  15. Geometrically nonlinear isogeometric analysis of laminated composite plates based on higher-order shear deformation theory

    NASA Astrophysics Data System (ADS)

    Tran, Loc V.; Lee, Jaehong; Nguyen-Van, H.; Nguyen-Xuan, H.; Wahab, M. Abdel

    2015-06-01

    In this paper, we present an effective numerical approach based on isogeometric analysis (IGA) and higher-order shear deformation theory (HSDT) for the geometrically nonlinear analysis of laminated composite plates. The HSDT approximates the displacement field so that a realistic shear strain energy is obtained without shear correction factors. IGA, which uses B-spline or non-uniform rational B-spline (NURBS) basis functions, easily satisfies the stringent continuity requirements of the HSDT model without any additional variables. The nonlinearity of the plates is formulated in a total Lagrangian approach based on the von Kármán strain assumptions. Numerous numerical validations for isotropic, orthotropic, cross-ply, and angle-ply laminated plates demonstrate the effectiveness of the proposed method.

  16. Optimization of a photovoltaic pumping system based on the optimal control theory

    SciTech Connect

    Betka, A.; Attali, A.

    2010-07-15

    This paper shows how optimal operation of a photovoltaic pumping system based on an induction motor driving a centrifugal pump can be realized. The optimization problem consists in maximizing the daily pumped water quantity by optimizing motor efficiency at every operating point. The proposed structure simultaneously achieves minimization of machine losses, field-oriented control, and maximum power point tracking of the photovoltaic array. This is attained using multi-input, multi-output optimal regulator theory. The effectiveness of the proposed algorithm is demonstrated by simulation, and the results are compared to those of a system operating with a constant air-gap flux. (author)

  17. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    PubMed

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/. PMID:26313379

  18. Unit Template Synchronous Reference Frame Theory Based Control Algorithm for DSTATCOM

    NASA Astrophysics Data System (ADS)

    Bangarraju, J.; Rajagopal, V.; Jayalaxmi, A.

    2014-04-01

    This article proposes new, simplified unit templates in place of the standard phase-locked loop (PLL) for the synchronous reference frame theory (SRFT) control algorithm. Extracting the synchronizing components (sinθ and cosθ) for the Park and inverse Park transformations with a standard PLL takes more execution time, which delays the generation of the reference source currents. The standard PLL not only increases execution time but also increases the reactive power burden on the distribution static compensator (DSTATCOM). This work proposes a unit-template-based SRFT control algorithm for a four-leg, insulated-gate-bipolar-transistor-based voltage source converter used as a DSTATCOM in distribution systems, reducing both the execution time and the reactive power burden. The proposed DSTATCOM suppresses harmonics and regulates the terminal voltage along with neutral current compensation. The DSTATCOM with the proposed control algorithm is modeled and simulated in MATLAB using the Simulink and SimPowerSystems toolboxes.

  19. Re-Examining of Moffitt’s Theory of Delinquency through Agent Based Modeling

    PubMed Central

    Leaw, Jia Ning; Ang, Rebecca P.; Huan, Vivien S.; Chan, Wei Teng; Cheong, Siew Ann

    2015-01-01

    Moffitt’s theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, and that social interactions between these two groups become important during the adolescent years. We built an agent-based model on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves. The two groups indeed emerge in our simulations. Moreover, through an intervention simulation in which we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping life-course outcomes. PMID:26062022

  20. Flexural sensitivity and resonance of cantilever micro-sensors based on nonlocal elasticity theory

    NASA Astrophysics Data System (ADS)

    Gheshlaghi, Behnam; Mirzaei, Yaser

    2012-06-01

    Sensors based on microcantilevers, especially those with uniform structure, have ultrahigh sensitivities. The normalized natural frequencies and the lateral-vibration sensitivity of an elastic microcantilever sensor in contact with a surface are derived analytically from Euler-Bernoulli beam theory, taking the small-scale effect into account. The interaction of the sensor with the surface is modeled by linear springs, which restricts the results to experiments involving low-amplitude excitations. The results show that the normalized natural frequencies of the nonlocal microcantilever are smaller than those of its local counterpart, especially for larger values of the small-scale parameter. Each mode also has a different sensitivity to variations in surface stiffness, with the highest sensitivity observed in the first vibration mode. The natural frequencies and sensitivities of the microcantilever in contact with the surface are compared with those obtained in a previous study that did not consider the nonlocal effect.

  1. Conductance of three-terminal molecular bridge based on tight-binding theory

    NASA Astrophysics Data System (ADS)

    Wang, Li-Guang; Li, Yong; Yu, Ding-Wen; Katsunori, Tagami; Masaru, Tsukada

    2005-05-01

    The quantum transmission characteristics of a three-benzene-ring nano-molecular bridge are investigated theoretically using the Green's function approach, based on tight-binding theory with a single π orbital per carbon site. The probabilities that electrons are transmitted through the molecular bridge from one terminal to the other two terminals are obtained. The electronic current distributions inside the molecular bridge are calculated with the current density method based on the Fisher-Lee formula and are plotted at the energy points E = ±0.42, ±1.06, and ±1.5, where peaks appear in the transmission spectra. We find that the transmission spectra depend strongly on the incident electron energy and the molecular levels, and that the current distributions agree well with Kirchhoff's current conservation law.

  2. Predictive models based on sensitivity theory and their application to practical shielding problems

    SciTech Connect

    Bhuiyan, S.I.; Roussin, R.W.; Lucius, J.L.; Bartine, D.E.

    1983-01-01

    Two new calculational models based on the use of cross-section sensitivity coefficients have been devised for calculating radiation transport in relatively simple shields. The two models, one an exponential model and the other a power model, have been applied, together with the traditional linear model, to 1- and 2-m-thick concrete-slab problems in which the water content, reinforcing-steel content, or composition of the concrete was varied. Comparing the results obtained with the three models with those obtained from exact one-dimensional discrete-ordinates transport calculations indicates that the exponential model, named the BEST model (for basic exponential shielding trend), is a particularly promising predictive tool for shielding problems dominated by exponential attenuation. When applied to a deep-penetration sodium problem, the BEST model also yields better results than do calculations based on second-order sensitivity theory.
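
    The abstract names linear, exponential, and power predictive models but does not write them out; the sketch below shows the standard first-order sensitivity constructions such models are commonly built from (an assumed reading, not the authors' exact BEST formulas). Given a relative sensitivity coefficient S = (p/R)(dR/dp) and a fractional parameter change f = Δp/p, each form predicts the perturbed response R:

    ```python
    import math

    # Standard sensitivity-based predictive forms (hypothetical reading of
    # the linear / exponential / power models named in the abstract):
    def linear_model(r0, s, f):
        """First-order Taylor prediction: R = R0 * (1 + S*f)."""
        return r0 * (1.0 + s * f)

    def exponential_model(r0, s, f):
        """Exponential-trend prediction: R = R0 * exp(S*f)."""
        return r0 * math.exp(s * f)

    def power_model(r0, s, f):
        """Power-law prediction: R = R0 * (1 + f)**S."""
        return r0 * (1.0 + f) ** s

    # For small parameter changes the three forms agree to first order:
    r0, s, f = 1.0, -2.0, 0.01
    predictions = (linear_model(r0, s, f),
                   exponential_model(r0, s, f),
                   power_model(r0, s, f))
    ```

    The three forms diverge for large parameter changes, which is where the abstract reports the exponential trend performing best for attenuation-dominated shields.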

  3. Theory and practical application of blood-based renal replacement therapy.

    PubMed

    Murray, J S; Hinchliffe, W T; Kanagasundaram, N S

    2009-12-01

    The term renal replacement therapy incorporates three modalities that control or correct biochemical and fluid disturbances of renal failure. Peritoneal dialysis and renal transplantation are two forms of renal replacement therapy that are outside the remit of this article. This review focuses upon the third group which are blood-based and involve direct treatment of a patient's blood in a closed, extracorporeal circuit. They provide renal replacement for end-stage renal failure and during periods of severe acute kidney injury, and also for non-renal indications such as the management of drug overdoses. Blood-based renal replacement therapies are often loosely referred to as 'haemodialysis', although this is only one of a range of treatments. This article outlines the theory and practical applications of these treatments. PMID:20081630

  4. Can functionalized cucurbituril bind actinyl cations efficiently? A density functional theory based investigation.

    PubMed

    Sundararajan, Mahesh; Sinha, Vivek; Bandyopadhyay, Tusar; Ghosh, Swapan K

    2012-05-01

    The feasibility of using the cucurbituril host molecule as a candidate actinyl cation binder is investigated through density functional theory based calculations. Various possible binding sites of the cucurbit[5]uril host molecule toward uranyl are analyzed and, based on binding energy evaluations, μ(5)-binding is predicted to be favored. For this coordination, the structures, vibrational spectra, and binding energies are evaluated for the binding of three actinyls, in the hexavalent and pentavalent oxidation states, to functionalized cucurbiturils. Functionalizing cucurbituril with methyl and cyclohexyl groups increases the binding affinities of the actinyls, whereas fluorination decreases them compared to the native host molecule. Surprisingly, hydroxylation of the host molecule does not distinguish the oxidation states of the three actinyls. PMID:22471316
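
    The "binding energy evaluations" above follow the standard supermolecule definition, E_bind = E(complex) - E(host) - E(guest). A minimal sketch with invented total energies (the hartree values are hypothetical, chosen only to mirror the reported trend that methylation strengthens binding):

    ```python
    # Supermolecule binding energy; all energies are total electronic
    # energies from (for example) DFT calculations, in the same units.
    def binding_energy(e_complex, e_host, e_guest):
        """E_bind = E(complex) - E(host) - E(guest); negative = bound."""
        return e_complex - e_host - e_guest

    # Hypothetical totals (hartree) for a native vs a methylated host
    # binding a uranyl cation -- illustration only, not the paper's data:
    e_uranyl = -28640.512
    native = binding_energy(-30871.930, -2231.250, e_uranyl)      # ~ -0.168
    methylated = binding_energy(-30950.640, -2309.940, e_uranyl)  # ~ -0.188
    stronger = methylated < native  # methylation increases affinity
    ```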

  5. Evaluation of Transverse Thermal Stresses in Composite Plates Based on First-Order Shear Deformation Theory

    NASA Technical Reports Server (NTRS)

    Rolfes, R.; Noor, A. K.; Sparr, H.

    1998-01-01

    A postprocessing procedure is presented for the evaluation of the transverse thermal stresses in laminated plates. The analytical formulation is based on the first-order shear deformation theory and the plate is discretized by using a single-field displacement finite element model. The procedure is based on neglecting the derivatives of the in-plane forces and the twisting moments, as well as the mixed derivatives of the bending moments, with respect to the in-plane coordinates. The calculated transverse shear stiffnesses reflect the actual stacking sequence of the composite plate. The distributions of the transverse stresses through-the-thickness are evaluated by using only the transverse shear forces and the thermal effects resulting from the finite element analysis. The procedure is implemented into a postprocessing routine which can be easily incorporated into existing commercial finite element codes. Numerical results are presented for four- and ten-layer cross-ply laminates subjected to mechanical and thermal loads.

  6. Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    1998-01-01

    A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and its accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on gas-kinetic theory is presented. The flux construction strategy may shed some light on possible modifications of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as on the development of new schemes for non-strictly hyperbolic systems.

  7. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies

    PubMed Central

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/. PMID:26313379

  8. A Neurosemantic Theory of Concrete Noun Representation Based on the Underlying Brain Codes

    PubMed Central

    Just, Marcel Adam; Cherkassky, Vladimir L.; Aryal, Sandesh; Mitchell, Tom M.

    2010-01-01

    This article describes the discovery of a set of biologically-driven semantic dimensions underlying the neural representation of concrete nouns, and then demonstrates how a resulting theory of noun representation can be used to identify simple thoughts through their fMRI patterns. We use factor analysis of fMRI brain imaging data to reveal the biological representation of individual concrete nouns like apple, in the absence of any pictorial stimuli. From this analysis emerge three main semantic factors underpinning the neural representation of nouns naming physical objects, which we label manipulation, shelter, and eating. Each factor is neurally represented in 3–4 different brain locations that correspond to a cortical network that co-activates in non-linguistic tasks, such as tool use pantomime for the manipulation factor. Several converging methods, such as the use of behavioral ratings of word meaning and text corpus characteristics, provide independent evidence of the centrality of these factors to the representations. The factors are then used with machine learning classifier techniques to show that the fMRI-measured brain representation of an individual concrete noun like apple can be identified with good accuracy from among 60 candidate words, using only the fMRI activity in the 16 locations associated with these factors. To further demonstrate the generativity of the proposed account, a theory-based model is developed to predict the brain activation patterns for words to which the algorithm has not been previously exposed. The methods, findings, and theory constitute a new approach of using brain activity for understanding how object concepts are represented in the mind. PMID:20084104
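
    The identification step described above, matching an observed activation pattern against model-predicted patterns for each candidate word, can be sketched as a nearest-match search. The similarity measure and the 4-voxel toy patterns below are hypothetical (the abstract does not specify the classifier's internals):

    ```python
    import math

    def cosine(u, v):
        """Cosine similarity between two equal-length activation vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)

    def identify(observed, predicted_by_word):
        """Return the candidate word whose predicted pattern best matches."""
        return max(predicted_by_word,
                   key=lambda w: cosine(observed, predicted_by_word[w]))

    predicted = {  # hypothetical 4-voxel patterns for three candidate words
        "apple":  [0.9, 0.1, 0.8, 0.0],
        "hammer": [0.1, 0.9, 0.0, 0.7],
        "house":  [0.5, 0.2, 0.1, 0.9],
    }
    word = identify([0.8, 0.2, 0.7, 0.1], predicted)  # -> "apple"
    ```

    In the study the candidate set had 60 words and the vectors covered activity in the 16 factor-associated locations; the toy example only shows the matching logic.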

  9. Contextualization and standardization of the supportive leadership behavior questionnaire based on socio-cognitive theory in Iran

    PubMed Central

    Shirazi, Mandana; Emami, Amir Hosein; Mirmoosavi, Seyed Jamal; Alavinia, Seyed Mohammad; Zamanian, Hadi; Fathollahbeigi, Faezeh; Masiello, Italo

    2014-01-01

    Background: Effective leadership is of prime importance in any organization, and leadership practice evolves in line with accepted health promotion and behavior change theory. Although there are many leadership styles, transformational leadership, which emphasizes supportive leadership behaviors, seems appropriate in many settings, particularly in the health care and educational sectors, which are pressured by high turnover and safety demands. Iran has been moving forward rapidly, and its authorities have recognized the importance of matching leadership styles with effective and competent care for success in health care organizations. This study aimed to develop the Supportive Leadership Behaviors Scale based on accepted health and educational theories and to test it psychometrically in the Iranian context. Methods: The instrument was based on items from established questionnaires. A pilot study validated the instrument, which was also cross-validated via re-translation. After validation, 731 participants answered the questionnaire. Results: Exploratory factor analysis produced a final 20-item questionnaire with four factors explaining supportive leadership behaviors: support for development, integrity, sincerity, and recognition (all loadings above 0.6). Mapping these four measures of leadership behavior can help determine whether effective leadership supports innovation and improvement in medical education and health care organizations at the national level. Reliability, measured as Cronbach’s alpha, was 0.84. Conclusion: The new instrument yielded four factors, support for development, integrity, sincerity, and recognition, that explain supportive leadership behaviors; it is applicable in health and educational settings and helpful in improving self-efficacy among health and academic staff. PMID:25679004
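
    The reported reliability statistic, Cronbach's alpha, is computed from the item variances and the variance of the total score: alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)). A self-contained sketch on an invented 5-respondent, 4-item response matrix (not the study's data):

    ```python
    def variance(xs):
        """Sample variance (n - 1 denominator)."""
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    def cronbach_alpha(responses):
        """responses: list of respondents, each a list of k item scores."""
        k = len(responses[0])
        items = [[row[j] for row in responses] for j in range(k)]
        totals = [sum(row) for row in responses]
        item_var_sum = sum(variance(col) for col in items)
        return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

    data = [  # synthetic Likert-style scores, 5 respondents x 4 items
        [4, 4, 3, 4],
        [2, 2, 2, 1],
        [5, 4, 5, 5],
        [3, 3, 2, 3],
        [4, 5, 4, 4],
    ]
    alpha = cronbach_alpha(data)  # high internal consistency for this toy set
    ```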

  10. Programmatic assessment of competency-based workplace learning: when theory meets practice

    PubMed Central

    2013-01-01

    Background In competency-based medical education emphasis has shifted towards outcomes, capabilities, and learner-centeredness. Together with a focus on sustained evidence of professional competence this calls for new methods of teaching and assessment. Recently, medical educators advocated the use of a holistic, programmatic approach towards assessment. Besides maximum facilitation of learning it should improve the validity and reliability of measurements and documentation of competence development. We explored how, in a competency-based curriculum, current theories on programmatic assessment interacted with educational practice. Methods In a development study including evaluation, we investigated the implementation of a theory-based programme of assessment. Between April 2011 and May 2012 quantitative evaluation data were collected and used to guide group interviews that explored the experiences of students and clinical supervisors with the assessment programme. We coded the transcripts and emerging topics were organised into a list of lessons learned. Results The programme mainly focuses on the integration of learning and assessment by motivating and supporting students to seek and accumulate feedback. The assessment instruments were aligned to cover predefined competencies to enable aggregation of information in a structured and meaningful way. Assessments that were designed as formative learning experiences were increasingly perceived as summative by students. Peer feedback was experienced as a valuable method for formative feedback. Social interaction and external guidance seemed to be of crucial importance to scaffold self-directed learning. Aggregating data from individual assessments into a holistic portfolio judgement required expertise and extensive training and supervision of judges. 
Conclusions A programme of assessment with low-stakes assessments simultaneously providing formative feedback and input for summative decisions proved not easy to implement. Careful preparation and guidance of the implementation process were crucial. Assessment for learning requires meaningful feedback with each assessment, and special attention should be paid to the quality of feedback at individual assessment moments. Comprehensive attention to faculty development and student training is essential for the successful implementation of an assessment programme. PMID:24020944

  11. Revisiting Street Intersections Using Slot-Based Systems

    PubMed Central

    Tachet, Remi; Santi, Paolo; Sobolevsky, Stanislav; Reyes-Castro, Luis Ignacio; Frazzoli, Emilio; Helbing, Dirk; Ratti, Carlo

    2016-01-01

    Since their appearance at the end of the 19th century, traffic lights have been the primary mode of granting access to road intersections. Today, this centuries-old technology is challenged by advances in intelligent transportation, which are opening the way to new solutions built upon slot-based systems similar to those commonly used in aerial traffic: what we call Slot-based Intersections (SIs). Despite simulation-based evidence of the potential benefits of SIs, a comprehensive, analytical framework to compare their relative performance with traffic lights is still lacking. Here, we develop such a framework. We approach the problem in a novel way, by generalizing classical queuing theory. Having defined safety conditions, we characterize capacity and delay of SIs. In the 2-road crossing configuration, we provide a capacity-optimal SI management system. For arbitrary intersection configurations, near-optimal solutions are developed. Results theoretically show that transitioning from a traffic light system to SI has the potential of doubling capacity and significantly reducing delays. This suggests a reduction of non-linear dynamics induced by intersection bottlenecks, with positive impact on the road network. Such findings can provide transportation engineers and planners with crucial insights as they prepare to manage the transition towards a more intelligent transportation infrastructure in cities. PMID:26982532
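
    The non-linear delay dynamics the abstract mentions are already visible in the simplest classical queuing result the authors generalize: in an M/M/1 queue, the mean time in system is W = 1/(μ − λ), which blows up as the arrival rate λ approaches the service rate μ. A toy illustration (not the paper's slot-based intersection model):

    ```python
    def mm1_mean_time_in_system(lam, mu):
        """Mean time in an M/M/1 system, W = 1/(mu - lam); needs lam < mu."""
        if lam >= mu:
            raise ValueError("unstable queue: arrival rate must be below mu")
        return 1.0 / (mu - lam)

    mu = 1.0  # service (crossing) rate, vehicles per time unit
    delays = {lam: mm1_mean_time_in_system(lam, mu)
              for lam in (0.5, 0.9, 0.99)}
    # delays grow non-linearly toward capacity: 2.0, 10.0, 100.0;
    # doubling capacity (mu -> 2*mu) at lam = 0.9 cuts W from 10 to ~0.91
    ```

    This is why the paper's result, that slot-based control can roughly double capacity, translates into a disproportionately large reduction in delay near saturation.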

  12. Revisiting Street Intersections Using Slot-Based Systems.

    PubMed

    Tachet, Remi; Santi, Paolo; Sobolevsky, Stanislav; Reyes-Castro, Luis Ignacio; Frazzoli, Emilio; Helbing, Dirk; Ratti, Carlo

    2016-01-01

    Since their appearance at the end of the 19th century, traffic lights have been the primary mode of granting access to road intersections. Today, this centuries-old technology is challenged by advances in intelligent transportation, which are opening the way to new solutions built upon slot-based systems similar to those commonly used in aerial traffic: what we call Slot-based Intersections (SIs). Despite simulation-based evidence of the potential benefits of SIs, a comprehensive, analytical framework to compare their relative performance with traffic lights is still lacking. Here, we develop such a framework. We approach the problem in a novel way, by generalizing classical queuing theory. Having defined safety conditions, we characterize capacity and delay of SIs. In the 2-road crossing configuration, we provide a capacity-optimal SI management system. For arbitrary intersection configurations, near-optimal solutions are developed. Results theoretically show that transitioning from a traffic light system to SI has the potential of doubling capacity and significantly reducing delays. This suggests a reduction of non-linear dynamics induced by intersection bottlenecks, with positive impact on the road network. Such findings can provide transportation engineers and planners with crucial insights as they prepare to manage the transition towards a more intelligent transportation infrastructure in cities. PMID:26982532

  13. Motivational cues predict the defensive system in team handball: A model based on regulatory focus theory.

    PubMed

    Debanne, T; Laffaye, G

    2015-08-01

    This study was based on the naturalistic decision-making paradigm and regulatory focus theory. Its aim was to model coaches' decision-making processes for handball teams' defensive systems based on relevant cues from the reward structure, and to determine the weight of each of these cues. We collected raw data by video-recording 41 games that were selected using a simple random method. We considered the defensive strategy (DEF: aligned or staged) to be the dependent variable, and the three independent variables were (a) numerical difference between the teams; (b) score difference between the teams; and (c) game period. We used a logistic regression design (logit model) and a multivariate logistic model to explain the link between DEF and the three categorical independent variables. Each factor was weighted differently in the decision-making process used to select the defensive system, and combining the variables increased their impact; for instance, a staged defense was 43 times more likely to be chosen during the final period in an unfavorable score situation with a man advantage. Finally, this shows that the coach's decision-making process can rest on a simple match or can require a diagnosis of the situation based on the relevant cues. PMID:25262855
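
    A logit model of this kind maps cue values to the probability of choosing a staged defense, and exponentiated coefficients give multiplicative odds ratios like the "43 times" figure. The coefficients below are invented for illustration, not the study's estimates:

    ```python
    import math

    # Hypothetical coefficients for (intercept, man advantage, trailing
    # score, final period) -- illustration only, not the fitted values.
    BETA = (-2.0, 1.2, 0.9, 1.6)

    def staged_defense_probability(man_advantage, trailing, final_period):
        """Logit model: p = 1 / (1 + exp(-(b0 + b1*x1 + b2*x2 + b3*x3)))."""
        b0, b1, b2, b3 = BETA
        z = b0 + b1 * man_advantage + b2 * trailing + b3 * final_period
        return 1.0 / (1.0 + math.exp(-z))

    # Combined odds ratio when all three cues are present vs none:
    combined_odds_ratio = math.exp(BETA[1] + BETA[2] + BETA[3])
    p_all_cues = staged_defense_probability(1, 1, 1)
    ```

    With these toy coefficients the combined odds ratio is about 40, the same order as the reported effect; the fitted model would carry its own estimated betas.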

  14. Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.

    PubMed

    Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  15. Securing Mobile Ad Hoc Networks Using Danger Theory-Based Artificial Immune Algorithm

    PubMed Central

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001

  16. Informational Theory of Aging: The Life Extension Method Based on the Bone Marrow Transplantation

    PubMed Central

    Karnaukhov, Alexey V.; Karnaukhova, Elena V.; Sergievich, Larisa A.; Karnaukhova, Natalia A.; Bogdanenko, Elena V.; Manokhina, Irina A.; Karnaukhov, Valery N.

    2015-01-01

    A method of lifespan extension that is a practical application of the informational theory of aging is proposed. In this theory, degradation (error accumulation) of the genetic information in cells is considered the main cause of aging. Accordingly, our method is based on the transplantation of genetically identical (or similar) stem cells, with a lower number of genomic errors, into old recipients. For humans and large mammals, this method can be realized by cryopreserving their own stem cells, taken at a young age, for later autologous transplantation in old age. To test the method experimentally, we chose a laboratory animal with a relatively short lifespan (the mouse). Because it is difficult to isolate the required amount of stem cells (e.g., bone marrow) without significant damage to the animal, we used bone marrow transplanted from sacrificed inbred young donors. We show that the lifespan extension of recipients depends on their level of genetic similarity (syngeneity) with the donors. We achieved a 34% increase in the lifespan of the experimental mice when bone marrow transplantation with a high level of genetic similarity was used. PMID:26491435

  17. Investigations into Generalization of Constraint-Based Scheduling Theories with Applications to Space Telescope Observation Scheduling

    NASA Astrophysics Data System (ADS)

    Muscettola, Nicola; Smith, Steven S.

    1996-09-01

    This final report summarizes research performed under NASA contract NCC 2-531 toward the generalization of constraint-based scheduling theories and techniques for application to space telescope observation scheduling problems. Our work on theories and techniques for solving this class of problems has led to the development of the Heuristic Scheduling Testbed System (HSTS), a software system for integrated planning and scheduling. Within HSTS, planning and scheduling are treated as two complementary aspects of the more general process of constructing a feasible set of behaviors of a target system. We have validated the HSTS approach by applying it to the generation of observation schedules for the Hubble Space Telescope. This report summarizes the HSTS framework and its application to the Hubble Space Telescope domain. First, the HSTS software architecture is described, indicating (1) how the structure and dynamics of a system are modeled in HSTS, (2) how schedules are represented at multiple levels of abstraction, and (3) the problem-solving machinery that is provided. Next, the specific scheduler developed within this software architecture for the detailed management of Hubble Space Telescope operations is presented. Finally, experimental performance results are given that confirm the utility and practicality of the approach.

  18. General Formalism of Decision Making Based on Theory of Open Quantum Systems

    NASA Astrophysics Data System (ADS)

    Asano, M.; Ohya, M.; Basieva, I.; Khrennikov, A.

    2013-01-01

    We present a general formalism of decision making based on the theory of open quantum systems. A person (decision maker), say Alice, is considered a quantum-like system, i.e., a system whose information processing follows the laws of quantum information theory. To make a decision, Alice interacts with a huge mental bath. Depending on the context of decision making, this bath can include her social environment, mass media (TV, newspapers, the Internet), and memory. The dynamics of an ensemble of such Alices is described by the Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) equation. We speculate that in the course of evolution, biosystems (especially human beings) developed "mental Hamiltonians" and GKSL operators such that any solution of the corresponding GKSL equation stabilizes to a density operator that is diagonal in the basis of decision making. This limiting density operator describes a population in which all superpositions of possible decisions have already been resolved. In principle, this approach can be used to predict the distribution of possible decisions in human populations.
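
    For reference, the GKSL (Lindblad) equation governing the ensemble dynamics has the standard form (with ħ = 1; here H is the "mental Hamiltonian" and the operators L_k describe coupling to the mental bath):

    ```latex
    \frac{d\rho}{dt} = -\,i\,[H,\rho]
      + \sum_{k}\gamma_{k}\Bigl(L_{k}\,\rho\,L_{k}^{\dagger}
      - \tfrac{1}{2}\bigl\{L_{k}^{\dagger}L_{k},\,\rho\bigr\}\Bigr)
    ```

    The dissipative bath terms are what drive the density operator ρ toward a diagonal form in the decision basis, resolving the superpositions of possible decisions as the abstract describes.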

  19. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.
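    As a rough illustration of the through-the-thickness eigenvalue problem (the coefficients below are invented, and the actual plate operator is more involved), the eigenfunctions of a Sturm-Liouville problem with a piecewise-constant coefficient, modelling a two-ply laminate, can be computed by finite differences:

```python
import numpy as np

# Sketch: -(p(z) u')' = lam * u, u(0) = u(1) = 0, with p jumping at the
# ply interface z = 0.5 (hypothetical stiffness contrast 1:4).
n = 200
z = np.linspace(0, 1, n + 2)
h = z[1] - z[0]
p = np.where(z < 0.5, 1.0, 4.0)      # stiffness jumps at the ply interface

pm = 0.5 * (p[:-1] + p[1:])          # p at the midpoints z_{k+1/2}
main = (pm[:-1] + pm[1:]) / h**2     # standard 3-point conservative scheme
off = -pm[1:-1] / h**2
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

lam, U = np.linalg.eigh(A)           # eigenvalues ascending, columns are modes
print(lam[:3])                       # lowest through-the-thickness eigenvalues
```

    The eigenvectors automatically carry the kink in slope at the interface, which is the discontinuity in the displacement derivative the abstract refers to; for constant p the eigenvalues reduce to (k*pi)**2.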

  20. Simulation of Two-Phase Flow Based on a Thermodynamically Constrained Averaging Theory Flow Model

    NASA Astrophysics Data System (ADS)

    Weigand, T. M.; Dye, A. L.; McClure, J. E.; Farthing, M. W.; Gray, W. G.; Miller, C. T.

    2014-12-01

    The thermodynamically constrained averaging theory (TCAT) has been used to formulate general classes of porous medium models, including new models for two-fluid-phase flow. The TCAT approach provides advantages that include a firm connection between the microscale, or pore scale, and the macroscale; a thermodynamically consistent basis; explicit inclusion of factors such as interfacial areas, contact angles, interfacial tension, and curvatures; and dynamics of interface movement and relaxation to an equilibrium state. In order to render the TCAT model solvable, certain closure relations are needed to relate fluid pressure, interfacial areas, curvatures, and relaxation rates. In this work, we formulate and solve a TCAT-based two-fluid-phase flow model. We detail the formulation of the model, which is a specific instance from a hierarchy of two-fluid-phase flow models that emerge from the theory. We show the closure problem that must be solved. Using recent results from high-resolution microscale simulations, we advance a set of closure relations that produce a closed model. Lastly, we use locally conservative spatial discretization and higher order temporal discretization methods to approximate the solution to this new model and compare the solution to the traditional model.

  1. A system model for ultrasonic NDT based on the Physical Theory of Diffraction (PTD).

    PubMed

    Darmon, M; Dorval, V; Kamta Djakou, A; Fradkin, L; Chatillon, S

    2016-01-01

    Simulation of ultrasonic Non Destructive Testing (NDT) is helpful for evaluating the performance of inspection techniques and requires modelling of the waves scattered by defects. Two classical scattering models have usually been employed for the inspection of planar defects: the Kirchhoff approximation (KA) for simulating reflection and the Geometrical Theory of Diffraction (GTD) for simulating diffraction. The Physical Theory of Diffraction (PTD), initially developed in electromagnetism and recently extended to elastodynamics, combines the two so as to retain the advantages of both. In this paper a PTD-based system model is proposed for simulating the ultrasonic response of crack-like defects. It is also extended to provide a good description of the regions surrounding critical rays, where the shear diffracted waves and head waves interfere. Both numerical and experimental validations of the PTD model are carried out in various practical NDT configurations, such as pulse-echo and Time of Flight Diffraction (TOFD), involving both crack-tip and corner echoes. The numerical validation compares the model with KA and GTD as well as with the Finite-Element Method (FEM). PMID:26323548

  2. An integrated finite element simulation of cardiomyocyte function based on triphasic theory

    PubMed Central

    Hatano, Asuka; Okada, Jun-Ichi; Washio, Takumi; Hisada, Toshiaki; Sugiura, Seiryo

    2015-01-01

    In numerical simulations of cardiac excitation-contraction coupling, the intracellular potential distribution and mobility of cytosol and ions have been mostly ignored. Although the intracellular potential gradient is small, during depolarization it can be a significant driving force for ion movement, and is comparable to diffusion in terms of net flux. Furthermore, fluid in the t-tubules is thought to advect ions to facilitate their exchange with the extracellular space. We extend our previous finite element model that was based on triphasic theory to examine the significance of these factors in cardiac physiology. Triphasic theory allows us to study the behavior of solids (proteins), fluids (cytosol) and ions governed by mechanics and electrochemistry in detailed subcellular structures, including myofibrils, mitochondria, the sarcoplasmic reticulum, membranes, and t-tubules. Our simulation results predicted an electrical potential gradient inside the t-tubules at the onset of depolarization, which corresponded to the Na+ channel distribution therein. Ejection and suction of fluid between the t-tubules and the extracellular compartment during isometric contraction were observed. We also examined the influence of t-tubule morphology and mitochondrial location on the electrophysiology and mechanics of the cardiomyocyte. Our results confirm that the t-tubule structure is important for synchrony of Ca2+ release, and suggest that mitochondria in the sub-sarcolemmal region might serve to cancel Ca2+ inflow through surface sarcolemma, thereby maintaining the intracellular Ca2+ environment in equilibrium. PMID:26539124

  3. Competitive effects between stationary chemical reaction centres: a theory based on off-center monopoles.

    PubMed

    Biello, Joseph A; Samson, René

    2015-03-01

    The subject of this paper is competitive effects between multiple reaction sinks. A theory based on off-center monopoles is developed for the steady-state diffusion equation and for the convection-diffusion equation with a constant flow field. The dipolar approximation for the diffusion equation with two equal reaction centres is compared with the exact solution. The former turns out to be remarkably accurate, even for two touching spheres. Numerical evidence is presented to show that the same holds for larger clusters (with more than two spheres). The theory is extended to the convection-diffusion equation with a constant flow field. As one increases the convective velocity, the competitive effects between the reactive centres gradually become less significant. This is demonstrated for a number of cluster configurations. At high flow velocities, the current methodology breaks down. Fixing this problem will be the subject of future research. The current method is useful as an easy-to-use tool for the calibration of other more complicated models in mass and/or heat transfer. PMID:25747063
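    The centered-monopole starting point (without the paper's off-center refinement) can be sketched as follows: each perfectly absorbing sphere of radius a carries a monopole strength q_i, and the zero-concentration boundary condition on each sphere gives the linear system q_i + sum over j != i of (a / r_ij) q_j = 1, where q_i = 1 recovers an isolated Smoluchowski sink. The geometry below is illustrative.

```python
import numpy as np

def monopole_rates(centers, a):
    """Reaction rate of each sink relative to an isolated sink (monopole approx.)."""
    centers = np.asarray(centers, float)
    n = len(centers)
    A = np.eye(n)
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i, j] = a / np.linalg.norm(centers[i] - centers[j])
    return np.linalg.solve(A, np.ones(n))

# Two equal sinks: competition reduces each rate below 1,
# and the reduction vanishes as the separation grows.
q_close = monopole_rates([[0, 0, 0], [3, 0, 0]], a=1.0)
q_far = monopole_rates([[0, 0, 0], [100, 0, 0]], a=1.0)
print(q_close, q_far)  # [0.75 0.75] and values near 1
```

    For two equal sinks at separation r the system collapses to q = 1 / (1 + a/r), which makes the screening effect explicit.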

  4. The use of theory based semistructured elicitation questionnaires: formative research for CDC's Prevention Marketing Initiative.

    PubMed Central

    Middlestadt, S E; Bhattacharyya, K; Rosenbaum, J; Fishbein, M; Shepherd, M

    1996-01-01

    Through one of its many HIV prevention programs, the Prevention Marketing Initiative, the Centers for Disease Control and Prevention promotes a multifaceted strategy for preventing the sexual transmission of HIV/AIDS among people less than 25 years of age. The Prevention Marketing Initiative is an application of marketing and consumer-oriented technologies that rely heavily on behavioral research and behavior change theories to bring the behavioral and social sciences to bear on practical program planning decisions. One objective of the Prevention Marketing Initiative is to encourage consistent and correct condom use among sexually active young adults. Qualitative formative research is being conducted in several segments of the population of heterosexually active, unmarried young adults between 18 and 25 using a semistructured elicitation procedure to identify and understand underlying behavioral determinants of consistent condom use. The purpose of this paper is to illustrate the use of this type of qualitative research methodology in designing effective theory-based behavior change interventions. Issues of research design and data collection and analysis are discussed. To illustrate the methodology, results of content analyses of selected responses to open-ended questions on consistent condom use are presented by gender (male, female), ethnic group (white, African American), and consistency of condom use (always, sometimes). This type of formative research can be applied immediately to designing programs and is invaluable for valid and relevant larger-scale quantitative research. PMID:8862153

  5. New Approach to Optimize the Apfs Placement Based on Instantaneous Reactive Power Theory by Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Hashemi-Dezaki, Hamed; Mohammadalizadeh-Shabestary, Masoud; Askarian-Abyaneh, Hossein; Rezaei-Jegarluei, Mohammad

    2014-01-01

    In electrical distribution systems, a great amount of power is wasted along the lines, and the power factors, voltage profiles and total harmonic distortions (THDs) of most loads are far from desirable. These parameters play a major role in wasted money and energy, and both consumers and sources suffer from high distortion levels and even instabilities. Active power filters (APFs) are an effective remedy for these problems and have recently been designed using the instantaneous reactive power theory. In this paper, a novel method is proposed to optimize the allocation of APFs. The method is based on the instantaneous reactive power theory in vectorial representation, which makes it possible to assess different compensation strategies. Proper placement of APFs in the system is crucial both for reducing loss costs and for improving power quality. To optimize the APF placement, a new objective function is defined on the basis of five terms: total losses, power factor, voltage profile, THD and cost. A genetic algorithm is used to solve the optimization problem. The results of applying this method to a distribution network illustrate its advantages.
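    The optimization loop can be sketched with a toy genetic algorithm (all network data and weights below are invented; the paper's five-term objective is reduced to two terms, residual THD and filter cost, purely for illustration):

```python
import random

random.seed(1)
N_BUSES = 8
BUS_THD = [0.02, 0.08, 0.05, 0.12, 0.03, 0.09, 0.04, 0.11]  # assumed per-bus THD
APF_COST = 1.0

def objective(placement):
    # An APF at bus i is assumed to cancel that bus's THD contribution.
    residual_thd = sum(t for t, p in zip(BUS_THD, placement) if not p)
    cost = APF_COST * sum(placement)
    return 10.0 * residual_thd + 0.5 * cost   # weighted sum, weights assumed

def ga(pop_size=30, generations=60, mut=0.1):
    pop = [[random.randint(0, 1) for _ in range(N_BUSES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)               # rank by fitness
        elite = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_BUSES)          # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < mut else g for g in child])
        pop = elite + children
    return min(pop, key=objective)

best = ga()
print(best, objective(best))
```

    With these numbers an APF is worth installing only where the saved THD penalty exceeds the filter cost, and the GA converges to exactly that placement pattern.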

  6. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    SciTech Connect

    Gao, Kai; Chung, Eric T.; Gibson, Richard L.; Fu, Shubin; Efendiev, Yalchin

    2015-06-05

    The development of reliable methods for upscaling fine scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that is similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity where the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.
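    For the finely layered benchmark mentioned above, the classical analytic comparison is the Backus average; a minimal sketch (layer moduli invented) of the effective VTI stiffnesses for a stack of thin isotropic layers:

```python
def backus(layers):
    """layers: list of (thickness, lam, mu). Returns effective (C11, C13, C33, C44, C66)."""
    H = sum(h for h, _, _ in layers)
    avg = lambda f: sum(h * f(l, m) for h, l, m in layers) / H  # thickness-weighted mean
    C33 = 1.0 / avg(lambda l, m: 1.0 / (l + 2 * m))
    C44 = 1.0 / avg(lambda l, m: 1.0 / m)
    r = avg(lambda l, m: l / (l + 2 * m))
    C13 = r * C33
    C11 = avg(lambda l, m: 4 * m * (l + m) / (l + 2 * m)) + r * r * C33
    C66 = avg(lambda l, m: m)
    return C11, C13, C33, C44, C66

# A two-layer periodic medium (hypothetical Lame moduli, GPa):
C11, C13, C33, C44, C66 = backus([(0.5, 10.0, 5.0), (0.5, 30.0, 15.0)])
print(C11, C33, C44)  # C11 != C33: the layering induces anisotropy
```

    A single uniform layer recovers the isotropic constants exactly, which is the kind of sanity check the paper performs against its numerical homogenization.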

  8. Cartographic generalization of urban street networks based on gravitational field theory

    NASA Astrophysics Data System (ADS)

    Liu, Gang; Li, Yongshu; Li, Zheng; Guo, Jiawei

    2014-05-01

    The automatic generalization of urban street networks is a constant and important problem in geographical information science. Previous studies show that a dual graph of street-street relationships reflects the overall morphological properties and importance of streets more accurately than other representations. In this study, we construct a dual graph to represent street-street relationships and propose an approach to generalizing street networks based on gravitational field theory. We retain the global structural properties and topological connectivity of the original street network and borrow from gravitational field theory to define a gravitational force between nodes. The concept of multi-order neighbors is introduced, and the gravitational force is taken as the measure of the importance contribution between nodes: the importance of a node is the result of the interaction between that node and its multi-order neighbors. Degree distribution is used to evaluate how well the global structure and topological characteristics of the street network are maintained and to illustrate the efficiency of the suggested method. Experimental results indicate that the proposed approach can generalize street networks while retaining their density characteristics, connectivity and global structure.
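    The importance measure can be sketched on a toy dual graph (the graph, the choice of degree as "mass", and the cutoff order are all illustrative): the force a neighbor exerts is F_ij = m_i * m_j / d_ij**2, with topological distance d_ij, and the constant G drops out when ranking.

```python
from collections import deque

graph = {           # toy dual graph: nodes are streets, edges are intersections
    "A": ["B", "C", "D"],
    "B": ["A", "C"],
    "C": ["A", "B", "D"],
    "D": ["A", "C", "E"],
    "E": ["D"],
}

def bfs_distances(graph, src):
    """Topological distance from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def importance(graph, node, max_order=2):
    """Summed gravitational force from all neighbors up to max_order."""
    m = {u: len(vs) for u, vs in graph.items()}   # mass = degree in the dual graph
    dist = bfs_distances(graph, node)
    return sum(m[node] * m[v] / d**2
               for v, d in dist.items() if 0 < d <= max_order)

ranking = sorted(graph, key=lambda u: importance(graph, u), reverse=True)
print(ranking)  # highly connected streets rank first; generalization drops the tail
```

    Generalization then amounts to pruning the lowest-ranked streets while monitoring the degree distribution of what remains.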

  9. Investigations into Generalization of Constraint-Based Scheduling Theories with Applications to Space Telescope Observation Scheduling

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Smith, Steven S.

    1996-01-01

    This final report summarizes research performed under NASA contract NCC 2-531 toward generalization of constraint-based scheduling theories and techniques for application to space telescope observation scheduling problems. Our work into theories and techniques for solution of this class of problems has led to the development of the Heuristic Scheduling Testbed System (HSTS), a software system for integrated planning and scheduling. Within HSTS, planning and scheduling are treated as two complementary aspects of the more general process of constructing a feasible set of behaviors of a target system. We have validated the HSTS approach by applying it to the generation of observation schedules for the Hubble Space Telescope. This report summarizes the HSTS framework and its application to the Hubble Space Telescope domain. First, the HSTS software architecture is described, indicating (1) how the structure and dynamics of a system is modeled in HSTS, (2) how schedules are represented at multiple levels of abstraction, and (3) the problem solving machinery that is provided. Next, the specific scheduler developed within this software architecture for detailed management of Hubble Space Telescope operations is presented. Finally, experimental performance results are given that confirm the utility and practicality of the approach.

  10. Evaluation of a social cognitive theory-based yoga intervention to reduce anxiety.

    PubMed

    Mehta, Purvi; Sharma, Manoj

    Yoga is often viewed as a form of alternative and complementary medicine, as it strives to achieve an equilibrium between body and mind that aids healing. Studies have shown the beneficial role of yoga in anxiety reduction. The purpose of this study was to design and evaluate a 10-week social cognitive theory-based yoga intervention to reduce anxiety. The yoga intervention utilized the constructs of behavioral capability, expectations, and self-efficacy for yoga from social cognitive theory, and included asanas (postures), pranayama (breathing techniques), shava asana (relaxation), and dhyana (meditation). A one-between and one-within group, quasi-experimental design was utilized for evaluation. Scales measuring expectations from yoga and self-efficacy for yoga, and Spielberger's State-Trait Anxiety Inventory, were administered before and after the intervention. Repeated measures analyses of variance (ANOVA) were performed to compare pre-test and post-test scores in the two groups. Yoga as an approach shows promising results for anxiety reduction. PMID:23353562

  11. Effective meson masses in nuclear matter based on a cutoff field theory

    SciTech Connect

    Nakano, M.; Noda, N.; Mitsumori, T.; Koide, K.; Kouno, H.; Hasegawa, A.

    1997-02-01

    Effective masses of σ, ω, π, and ρ mesons in nuclear matter are calculated based on a cutoff field theory. Instead of the traditional density-Feynman representation, we adopt the particle-hole-antiparticle representation for nuclear propagators so that unphysical components are not included in the meson self-energies. For an estimation of the contribution from the divergent particle-antiparticle excitations, i.e., vacuum polarization in nuclear matter, the idea of the renormalization group method is adopted. In this cutoff field theory, all the counterterms are finite and calculated numerically. It is shown that the predicted meson masses converge even if the cutoff Λ is changed, as long as Λ is sufficiently large, and that the prescription works well also for so-called nonrenormalized mesons such as π and ρ. According to this method, it is concluded that meson masses in nuclear matter have a weak dependence on the baryon density. © 1997 The American Physical Society

  12. Gradient projection for reliability-based design optimization using evidence theory

    NASA Astrophysics Data System (ADS)

    Alyanak, Edward; Grandhi, Ramana; Bae, Ha-Rok

    2008-10-01

    Uncertainty quantification and risk assessment in the optimal design of structural systems has always been a critical consideration for engineers. When new technologies are developed or implemented and budgets are limited for full-scale testing, the result is insufficient datasets for construction of probability distributions. Making assumptions about these probability distributions can potentially introduce more uncertainty to the system than it quantifies. Evidence theory represents a method to handle epistemic uncertainty that represents a lack of knowledge or information in the numerical optimization process. Therefore, it is a natural tool to use for uncertainty quantification and risk assessment especially in the optimization design cycle for future aerospace structures where new technologies are being applied. For evidence theory to be recognized as a useful tool, it must be efficiently applied in a robust design optimization scheme. This article demonstrates a new method for projecting the reliability gradient, based on the measures of belief and plausibility, without gathering any excess information other than what is required to determine these measures. This represents a huge saving in computational time over other methods available in the current literature. The technique developed in this article is demonstrated with three optimization examples.
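    The two evidence-theory measures underlying the reliability gradient are straightforward to compute from a basic probability assignment (BPA); a minimal sketch with illustrative numbers:

```python
def belief(bpa, event):
    """Bel(A): total mass of focal elements entirely inside the event."""
    return sum(m for focal, m in bpa.items() if focal <= event)

def plausibility(bpa, event):
    """Pl(A): total mass of focal elements that intersect the event."""
    return sum(m for focal, m in bpa.items() if focal & event)

# BPA over a discretized uncertain parameter (intervals labelled 1..4, masses assumed):
bpa = {
    frozenset({1}): 0.3,
    frozenset({1, 2}): 0.3,
    frozenset({2, 3}): 0.2,
    frozenset({3, 4}): 0.2,
}

safe = frozenset({1, 2})        # hypothetical "safe" region of the parameter
print(belief(bpa, safe), plausibility(bpa, safe))  # Bel <= Pl always
```

    Belief and plausibility bracket the unknown probability of the safe region, and the duality Pl(A) = 1 - Bel(not A) holds by construction; the paper's contribution is projecting the gradient of these measures without extra function evaluations.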

  13. Nonlinear free vibration of a microscale beam based on modified couple stress theory

    NASA Astrophysics Data System (ADS)

    Wang, Yong-Gang; Lin, Wen-Hui; Liu, Ning

    2013-01-01

    This paper presents a nonlinear free vibration analysis of microbeams based on the modified couple stress Euler-Bernoulli beam theory and von Kármán geometrically nonlinear theory. The governing differential equations are established in variational form from Hamilton's principle, with a material length scale parameter to capture the size effect. These partial differential equations are reduced to corresponding ordinary ones by eliminating the time variable with the Kantorovich method under an assumed harmonic time mode. The resulting equations, which form a nonlinear two-point boundary value problem in the spatial variable, are then solved numerically by the shooting method, and the size-dependent characteristic relations of nonlinear vibration frequency vs. central amplitude of the microbeams are obtained. Comparisons with available published results show that the current approach and algorithm are practicable. A parametric study is conducted on the dependency of the frequency on the length scale parameter and Poisson's ratio, which shows that the nonlinear vibration frequency predicted by the current model is higher than that of the classical one.
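    The shooting idea can be illustrated on a simplified linear analogue (the microbeam problem itself is nonlinear and higher order): for y'' + lam*y = 0 with y(0) = y(1) = 0, guess the eigenvalue, integrate across the span, and adjust the guess until the far-end boundary condition is met; the first eigenvalue is pi**2.

```python
import math

def shoot(lam, n=2000):
    """RK4-integrate from x=0 with y(0)=0, y'(0)=1; return the residual y(1)."""
    h = 1.0 / n
    y, v = 0.0, 1.0
    f = lambda y, v: (v, -lam * y)
    for _ in range(n):
        k1 = f(y, v)
        k2 = f(y + 0.5 * h * k1[0], v + 0.5 * h * k1[1])
        k3 = f(y + 0.5 * h * k2[0], v + 0.5 * h * k2[1])
        k4 = f(y + h * k3[0], v + h * k3[1])
        y += h * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
        v += h * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
    return y

lo, hi = 1.0, 20.0           # bracket of the first eigenvalue (sign change of y(1))
for _ in range(60):          # bisection on the shooting residual
    mid = 0.5 * (lo + hi)
    if shoot(lo) * shoot(mid) <= 0:
        hi = mid
    else:
        lo = mid

print(0.5 * (lo + hi))  # ~ 9.8696 = pi**2
```

    In the paper the same loop runs on the nonlinear beam ODE, with the amplitude fixed and the frequency as the shooting parameter, which is how the frequency-amplitude curves are traced.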

  14. Where are family theories in family-based obesity treatment?: conceptualizing the study of families in pediatric weight management

    PubMed Central

    Skelton, JA; Buehler, C; Irby, MB; Grzywacz, JG

    2014-01-01

    Family-based approaches to pediatric obesity treatment are considered the ‘gold-standard,’ and are recommended for facilitating behavior change to improve child weight status and health. If family-based approaches are to be truly rooted in the family, clinicians and researchers must consider family process and function in designing effective interventions. To bring a better understanding of family complexities to family-based treatment, two relevant reviews were conducted and are presented: (1) a review of prominent and established theories of the family that may provide a more comprehensive and in-depth approach for addressing pediatric obesity; and (2) a systematic review of the literature to identify the use of prominent family theories in pediatric obesity research, which found little use of theories in intervention studies. Overlapping concepts across theories include: families are a system, with interdependence of units; the idea that families are goal-directed and seek balance; and the physical and social environment imposes demands on families. Family-focused theories provide valuable insight into the complexities of families. Increased use of these theories in both research and practice may identify key leverage points in family process and function to prevent the development of or more effectively treat obesity. The field of family studies provides an innovative approach to the difficult problem of pediatric obesity, building on the long-established approach of family-based treatment. PMID:22531090

  15. Pulsed photothermal deflection and diffraction effects: numerical modeling based on Fresnel diffraction theory

    NASA Astrophysics Data System (ADS)

    Han, Yue; Wu, Z. L.; Rosenshein, Joseph S.; Thomsen, Marshall; Zhao, Qiang; Moncur, Kent

    1999-12-01

    We present a comprehensive theoretical model suitable for treating the effect of pulsed collinear photothermal deflection spectroscopy (PDS). The work is an extension of the theoretical model previously developed for the mirage effect, which can take into account both photothermal deflection and photothermal diffraction effects based on the Fresnel diffraction theory. With the diffraction model, both the collinear PDS and the photothermal lensing spectroscopy techniques can be treated in a unified manner. The model provides a detailed analysis of the laser-induced optical diffraction effect and can be used to optimize experimental parameters. The modeled results are presented in detail, with an emphasis on the advantages of using a near-field detection scheme for achieving the best sensitivity to local temperature change and better experimental stability against environmental noise.

  16. Evaluation of social vulnerability to floods in Huaihe River basin: a methodology based on catastrophe theory

    NASA Astrophysics Data System (ADS)

    You, W. J.; Zhang, Y. L.

    2015-08-01

    The Huaihe River is one of the seven largest rivers in China, and floods occur frequently in its basin. These disasters have caused heavy casualties and property losses, making the basin notorious for its high social vulnerability to floods. Based on the latest socio-economic data, an index system of social vulnerability to floods was constructed, and the catastrophe theory method was used in the assessment process. The results show that social vulnerability, as a basic attribute of the urban environment, varies significantly from city to city across the Huaihe River basin, with different distribution characteristics in population, economic and flood-prevention vulnerability. Further study of social vulnerability is important and will play a positive role in disaster prevention and in improving the comprehensive ability to respond to disasters.

  17. Study on corporate social responsibility evaluation system based on stakeholder theory

    NASA Astrophysics Data System (ADS)

    Ma, J.; Deng, Liming

    2011-10-01

    The issue of Corporate Social Responsibility (CSR) has attracted attention from many disciplines, such as economics, management, law, sociology and philosophy, since the last century. The purpose of this study is to explore the impact of CSR on performance and to develop a CSR evaluation system. Building on the definition of CSR and stakeholder theory, the article constructs a path-relationship model of CSR and business operation performance. It also constructs a CSR evaluation system based on the KLD index, the GRJ report, CSR accounting accounts, SA8000, ISO 14000, etc. The research provides a basis for future studies on the relationship between CSR and business performance and sheds some light on the evaluation of CSR practices.

  18. Dissolved oxygen prediction using a possibility-theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, U. T.; Valeo, C.

    2015-11-01

    A new fuzzy neural network method to predict minimum dissolved oxygen (DO) concentration in a highly urbanised riverine environment (in Calgary, Canada) is proposed. The method uses abiotic factors (non-living physical and chemical attributes) as model inputs, since the physical mechanisms governing DO in the river are largely unknown. A new two-step method to construct fuzzy numbers from observations is proposed. An existing fuzzy neural network is then modified to accept fuzzy number inputs and to use possibility-theory based intervals to train the network. Results demonstrate that the method is particularly well suited to predicting low DO events in the Bow River. The model output and a defuzzification technique are used to estimate the risk of low DO so that water resource managers can implement strategies to prevent its occurrence.
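    The machinery of fuzzy numbers and possibility intervals can be sketched as follows (the paper's actual two-step construction is not reproduced; the triangular construction, the sample readings, and the median-as-core choice are all assumptions for illustration):

```python
def triangular_from_obs(obs):
    """Assumed construction: support = data range, core = sample median."""
    s = sorted(obs)
    a, c = s[0], s[-1]
    b = s[len(s) // 2] if len(s) % 2 else 0.5 * (s[len(s) // 2 - 1] + s[len(s) // 2])
    return a, b, c

def alpha_cut(tri, alpha):
    """Interval where membership >= alpha for a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

do_obs = [6.1, 7.3, 6.8, 8.0, 5.9, 7.1, 6.5]   # hypothetical DO readings, mg/L
tri = triangular_from_obs(do_obs)
print(tri)                  # (5.9, 6.8, 8.0)
print(alpha_cut(tri, 0.5))  # possibility interval at membership level 0.5
```

    Alpha-cuts like these are the possibility-theory based intervals a network can be trained on: alpha = 1 collapses to the core value, and lower alpha widens the interval to reflect greater uncertainty.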

  19. Detection and control of combustion instability based on the concept of dynamical system theory.

    PubMed

    Gotoda, Hiroshi; Shinoda, Yuta; Kobayashi, Masaki; Okuno, Yuta; Tachibana, Shigeru

    2014-02-01

    We propose an online method of detecting combustion instability based on the concept of dynamical system theory, including a characterization of the dynamic behavior of combustion instability. As an important case study relevant to combustion instability encountered in fundamental and practical combustion systems, we deal with the combustion dynamics close to lean blowout (LBO) in a premixed gas-turbine model combustor. The relatively regular pressure fluctuations generated by thermoacoustic oscillations transition to low-dimensional intermittent chaos owing to the intermittent appearance of bursts as the equivalence ratio decreases. The translation error, which quantifies the degree of parallelism of trajectories in the phase space, can be used as a control variable to prevent LBO. PMID:25353548
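    A simplified Wayland-style translation error can be sketched as follows (embedding parameters and test signals are illustrative, not the paper's): embed a time series by time delay, then measure how parallel the one-step translation vectors of neighbouring points are. Deterministic dynamics give nearly parallel vectors (small error); noise gives a larger error.

```python
import math, random

def embed(x, dim=3, tau=5):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    return [x[i:i + dim * tau:tau] for i in range(len(x) - dim * tau)]

def translation_error(x, dim=3, tau=5, k=4):
    pts = embed(x, dim, tau)
    ref = pts[0]
    # k nearest neighbours of the reference point (excluding itself and the last point)
    idx = sorted(range(1, len(pts) - 1),
                 key=lambda j: sum((a - b) ** 2 for a, b in zip(pts[j], ref)))[:k]
    vecs = [[pts[j + 1][d] - pts[j][d] for d in range(dim)] for j in idx]
    mean = [sum(v[d] for v in vecs) / k for d in range(dim)]
    num = sum(sum((v[d] - mean[d]) ** 2 for d in range(dim)) for v in vecs) / k
    den = sum(m ** 2 for m in mean)
    return num / den            # spread of translation vectors vs. their mean

random.seed(0)
sine = [math.sin(0.2 * t) for t in range(400)]
noise = [random.gauss(0, 1) for _ in range(400)]
print(translation_error(sine), translation_error(noise))  # regular << noisy
```

    Tracking such a statistic online over pressure fluctuations is what allows the intermittent, burst-dominated regime near LBO to be flagged before blowout occurs.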

  20. A review of return-stroke models based on transmission line theory

    NASA Astrophysics Data System (ADS)

    De Conti, Alberto; Silveira, Fernando H.; Visacro, Silvério; Cardoso, Thiago C. M.

    2015-12-01

    This paper presents a review of lightning return-stroke models based on transmission line theory. The reviewed models are classified in three different categories, namely discharge-type, lumped-excitation, and parameter-estimation models. An attempt is made to address the difficulties that some models experience in reproducing directly or indirectly observable features of lightning, such as current characteristics and remote electromagnetic fields. It is argued that most of these difficulties are related to a poor discretization of the lightning channel, to inconsistencies in the calculation of per-unit-length channel parameters, to uncertainties in the representation of the upper end of the channel, and to assuming an ideal switch to connect the channel to ground in the transition from leader to return stroke. Applications of transmission line return-stroke models are also outlined.
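    The building block shared by these models can be sketched with a 1D FDTD solution of the lossless telegrapher's equations (all line parameters, the ramp source standing in for the leader-to-return-stroke transition, and the grid sizes are illustrative; real channel models add nonuniform, lossy per-unit-length parameters):

```python
import math

NZ, NT = 200, 150
L, C = 2.5e-6, 1.0e-11           # per-unit-length inductance/capacitance (assumed)
dz = 10.0                        # spatial step, m
v_phase = 1.0 / math.sqrt(L * C) # wave speed on the line
dt = dz / v_phase                # "magic" time step, Courant number 1

v = [0.0] * (NZ + 1)             # node voltages
i = [0.0] * NZ                   # branch currents (staggered half-cell grid)

for n in range(NT):
    # ramp current injected at z = 0, a crude stand-in for the stroke current
    src = min(1.0, n / 20.0) * 10e3
    for k in range(NZ):          # di/dt = -(1/L) dv/dz
        i[k] -= dt / (L * dz) * (v[k + 1] - v[k])
    i[0] = src
    for k in range(1, NZ):       # dv/dt = -(1/C) di/dz
        v[k] -= dt / (C * dz) * (i[k] - i[k - 1])

# With Courant number 1 the current front travels one cell per step.
front = max(k for k in range(NZ) if abs(i[k]) > 1e3)
print(front, v_phase)
```

    The discretization choices this sketch glosses over (channel segmentation, the per-unit-length parameters, the upper termination, and the switch at the base) are precisely the points the review identifies as sources of difficulty in published models.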