Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time
Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.
2017-12-20
In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.
DOE Office of Scientific and Technical Information (OSTI.GOV)
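As a rough illustration of the logistic-regression approach named above, the sketch below fits pass/fail data collected over item age and reports Wald-style uncertainty bounds on reliability. The simulated data, the single age covariate, and the 95% level are illustrative assumptions, not the paper's numerical example.

```python
import numpy as np

def fit_logistic(t, y, iters=25):
    """Fit P(pass) = 1/(1+exp(-(b0 + b1*t))) by Newton-Raphson."""
    X = np.column_stack([np.ones_like(t, dtype=float), t.astype(float)])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        H = X.T @ (X * W[:, None])           # observed Fisher information
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta, np.linalg.inv(H)            # MLE and asymptotic covariance

def reliability_bounds(t0, beta, cov, z=1.96):
    """Wald 95% interval for reliability at age t0, built on the logit scale."""
    x0 = np.array([1.0, t0])
    eta = x0 @ beta
    se = np.sqrt(x0 @ cov @ x0)
    lo, mid, hi = [1.0 / (1.0 + np.exp(-(eta + d))) for d in (-z * se, 0.0, z * se)]
    return lo, mid, hi

rng = np.random.default_rng(1)
ages = rng.uniform(0, 10, 200)
true_p = 1.0 / (1.0 + np.exp(-(3.0 - 0.3 * ages)))   # reliability declines with age
passes = (rng.uniform(size=200) < true_p).astype(float)
beta, cov = fit_logistic(ages, passes)
lo, mid, hi = reliability_bounds(5.0, beta, cov)      # bounds at age 5
```

Building the interval on the logit scale and then transforming keeps the bounds inside (0, 1), which a naive interval on the probability scale does not guarantee.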
The Utility of Robust Means in Statistics
ERIC Educational Resources Information Center
Goodwyn, Fara
2012-01-01
Location estimates calculated from heuristic data were examined using traditional and robust statistical methods. The current paper demonstrates the impact outliers have on the sample mean and proposes robust methods to control for outliers in sample data. Traditional methods fail because they rely on the statistical assumptions of normality and…
ERIC Educational Resources Information Center
Stewart, Kelise K.; Carr, James E.; Brandt, Charles W.; McHenry, Meade M.
2007-01-01
The present study evaluated the effects of both a traditional lecture and the conservative dual-criterion (CDC) judgment aid on the ability of 6 university students to visually inspect AB-design line graphs. The traditional lecture reliably failed to improve visual inspection accuracy, whereas the CDC method substantially improved the performance…
Why is traditional accounting failing managers?
Cokins, G
1998-11-01
This article provides an account of activity-based costing. It presents a general overview of this costing method, lists benefits and key concerns, discusses some of the impediments to its spread, and predicts its increasing use.
Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F
2005-12-08
This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.
Williams, Gary E.; Wood, P.B.
2002-01-01
We used miniature infrared video cameras to monitor Wood Thrush (Hylocichla mustelina) nests during 1998–2000. We documented nest predators and examined whether evidence at nests can be used to predict predator identities and nest fates. Fifty-six nests were monitored; 26 failed, with 3 abandoned and 23 depredated. We predicted predator class (avian, mammalian, snake) prior to review of video footage and were incorrect 57% of the time. Birds and mammals were underrepresented whereas snakes were over-represented in our predictions. We documented ≥9 nest-predator species, with the southern flying squirrel (Glaucomys volans) taking the most nests (n = 8). During 2000, we predicted fate (fledge or fail) of 27 nests; 23 were classified correctly. Traditional methods of monitoring nests appear to be effective for classifying success or failure of nests, but ineffective at classifying nest predators.
New Choices: Career Planning in a Changing World.
ERIC Educational Resources Information Center
Borchard, David C.
1984-01-01
As society enters the post-industrial age, career choosers and changers have more options than ever before. Yet traditional methods used by career counselors fail to take into account the rapid changes affecting the world of work. (RM)
Vera, Angelina M; Russo, Michael; Mohsin, Adnan; Tsuda, Shawn
2014-12-01
Laparoscopic skills training has evolved over recent years. However, conveying a mentor's directions using conventional methods, without realistic on-screen visual cues, can be difficult and confusing. To facilitate laparoscopic skill transference, an augmented reality telementoring (ART) platform was designed to overlay the instruments of a mentor onto the trainee's laparoscopic monitor. The aim of this study was to compare the effectiveness of this new teaching modality with traditional methods in novices performing an intracorporeal suturing task. Nineteen pre-medical and medical students were randomized into traditional mentoring (n = 9) and ART (n = 10) groups for a laparoscopic suturing and knot-tying task. Subjects received either traditional mentoring or ART for 1 h on the validated fundamentals of laparoscopic surgery intracorporeal suturing task. Suturing tasks were recorded and scored for time and errors. Results were analyzed using means, standard deviations, power regression analysis, correlation coefficients, analysis of variance, and Student's t test. Using Wright's cumulative average model (Y = aX^b), the learning curve slope was significantly steeper, demonstrating faster skill acquisition, for the ART group (b = -0.567, r^2 = 0.92) than for the control group (b = -0.453, r^2 = 0.74). At the end of 10 repetitions or 1 h of practice, the ART group was faster than the traditional group (mean 167.4 vs. 242.4 s, p = 0.014). The ART group also had fewer failed attempts (8) than the traditional group (13). The ART platform may be a more effective technique for teaching laparoscopic skills to novices than traditional methods. ART conferred a shorter learning curve, which was most pronounced in the first 4 trials. ART reduced the number of failed attempts and resulted in faster suture times by the end of the training session. ART may be a more effective training tool for complex tasks in laparoscopic surgical training than traditional methods.
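The learning-curve analysis above rests on Wright's cumulative average model, Y = aX^b, which becomes linear after taking logarithms of both sides. The sketch below fits it by log-log least squares; the task times are made-up numbers chosen to follow an exact power law, not the study's data.

```python
import numpy as np

reps = np.arange(1, 11, dtype=float)    # trial number X
times = 420.0 * reps ** -0.55           # hypothetical cumulative-average times Y (seconds)

# log Y = log a + b log X, so an ordinary least-squares line recovers b and a
b_hat, log_a = np.polyfit(np.log(reps), np.log(times), 1)
a_hat = np.exp(log_a)
# A steeper (more negative) b means faster skill acquisition.
```

With noisy real data the fitted b would carry sampling error, which is why the study also reports r^2 for each group's curve.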
Monge, Paul
2006-01-01
Activity-based methods serve as a dynamic process that has allowed many other industries to reduce and control their costs, increase productivity, and streamline their processes while improving product quality and service. The method could serve the healthcare industry in an equally beneficial way. Activity-based methods encompass both activity-based costing (ABC) and activity-based management (ABM). ABC is a cost management approach that links resource consumption to activities that an enterprise performs, and then assigns those activities and their associated costs to customers, products, or product lines. ABM uses the resource assignments derived in ABC so that operations managers can improve their departmental processes and workflows. There are three fundamental problems with traditional cost systems. First, traditional systems fail to reflect the underlying diversity of work taking place within an enterprise. Second, they use allocations that are, for the most part, arbitrary: single-step allocations fail to reflect the real work, that is, the activities being performed and the resources actually consumed. Third, they provide only a cost number that, standing alone, offers no guidance on how to improve performance by lowering cost or enhancing throughput.
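The two-stage assignment described above, resources to activities and activities to products, can be sketched as a toy calculation. All names, shares, and figures below are hypothetical, chosen only to show the mechanics.

```python
# Stage 1: trace resource costs to activities via consumption shares.
resource_costs = {"salaries": 90000.0, "equipment": 30000.0}
activity_shares = {
    "order_processing": {"salaries": 0.4, "equipment": 0.2},
    "machine_setup":    {"salaries": 0.6, "equipment": 0.8},
}
# Stage 2: assign activity costs to products via cost-driver volumes.
driver_volumes = {
    "product_A": {"order_processing": 120, "machine_setup": 30},
    "product_B": {"order_processing": 80,  "machine_setup": 70},
}

activity_cost = {
    a: sum(resource_costs[r] * share for r, share in shares.items())
    for a, shares in activity_shares.items()
}
total_driver = {a: sum(p[a] for p in driver_volumes.values()) for a in activity_cost}
product_cost = {
    p: sum(activity_cost[a] * v / total_driver[a] for a, v in vols.items())
    for p, vols in driver_volumes.items()
}
```

Because every resource dollar flows through some activity to some product, the product costs sum back to the total resource spend, a useful sanity check on any ABC model.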
Improving Students' Critical Thinking, Creativity, and Communication Skills
ERIC Educational Resources Information Center
Geissler, Gary L.; Edison, Steve W.; Wayland, Jane P.
2012-01-01
Business professors continue to face the challenge of truly preparing their students for the workplace. College students often lack skills that are valued by employers, such as critical thinking, creativity, communication, conflict resolution, and teamwork skills. Traditional classroom methods, such as lectures, may fail to produce adequate…
A Holistic Approach to Evaluating Vocational Education: Traditional Chinese Physicians (TCP) Model.
ERIC Educational Resources Information Center
Lee, Lung-Sheng; Chang, Liang-Te
Conventional approaches to evaluating vocational education have often been criticized for failing to deal holistically with the institution or program being evaluated. Integrated quantitative and qualitative evaluation methods have documented benefits; therefore, it would be useful to consider the possibility of developing a model for evaluating…
Using Participatory Photo Novels to Teach Marketing
ERIC Educational Resources Information Center
Das, Kallol
2012-01-01
Teaching today's restless young generation of business students is not easy. Furthermore, the traditional lecture method has failed miserably to engage business students and deliver significant learning. The author presents a discussion of the photo novel as an attractive communication medium and the participatory photo novel as an…
Komissarov, Igor Alexeevich; Borisova, Natalia Alexandrovna; Komissarov, Michail Igorevich; Aleshin, Ivan Jurievich
2018-06-01
Dieulafoy disease can manifest itself with spontaneous massive recurrent gastrointestinal bleeding in children. We report a case of successful management of a 13-month-old child with Dieulafoy disease of the duodenum after traditional methods of examination and treatment had failed.
Teaching to Strengths: Engaging Young Boys in Learning
ERIC Educational Resources Information Center
Johnson, Cynthia; Gooliaff, Shauna
2013-01-01
Traditional teaching methods often fail to engage male students in learning. The purpose of this research was to increase student engagement in the story writing process and increase self-confidence in boys at risk. A qualitative approach included student surveys as well as teacher journaling and portfolios (including e-portfolios). The student…
Group Comparisons of Mathematics Performance from a Cognitive Diagnostic Perspective
ERIC Educational Resources Information Center
Chen, Yi-Hsin; Ferron, John M.; Thompson, Marilyn S.; Gorin, Joanna S.; Tatsuoka, Kikumi K.
2010-01-01
Traditional comparisons of test score means identify group differences in broad academic areas, but fail to provide substantive description of how the groups differ on the specific cognitive attributes required for success in the academic area. The rule space method (RSM) allows for group comparisons at the cognitive attribute level, which…
Study on process evaluation model of students' learning in practical course
NASA Astrophysics Data System (ADS)
Huang, Jie; Liang, Pei; Shen, Wei-min; Ye, Youxiang
2017-08-01
In practical course teaching based on the project object method, traditional evaluation methods, such as class attendance, assignments, and exams, fail to give undergraduate students incentives to learn innovatively and autonomously. In this paper, elements such as creative innovation, teamwork, and documentation and reporting were incorporated into process evaluation, and a process evaluation model was set up. Educational practice shows that the model makes process evaluation of students' learning more comprehensive, accurate, and fair.
Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures.
Costa, Madalena D; Peng, Chung-Kang; Goldberger, Ary L
2008-06-01
Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and non-equilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools, multiscale entropy and multiscale time irreversibility, are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs.
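A minimal sketch of the multiscale entropy idea mentioned above: coarse-grain the series at increasing scales, then compute sample entropy at each scale. The parameter choices (m = 2, tolerance r = 0.2 times the SD) follow common practice, and the naive O(n^2) implementation is an illustration, not the authors' code.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn = -ln(A/B): A and B count template-pair matches of length
    m+1 and m under a Chebyshev tolerance of r times the series SD."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(length):
        count = 0
        n_templates = len(x) - length + 1
        for i in range(n_templates):
            for j in range(i + 1, n_templates):
                if np.max(np.abs(x[i:i + length] - x[j:j + length])) < tol:
                    count += 1
        return count

    return -np.log(matches(m + 1) / matches(m))

def coarse_grain(x, scale):
    """Average non-overlapping windows of the given scale."""
    n = len(x) // scale
    return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]

rng = np.random.default_rng(0)
noise = rng.standard_normal(300)                 # irregular signal
sine = np.sin(2 * np.pi * np.arange(300) / 20.0) # highly regular signal
mse_noise = multiscale_entropy(noise)
```

A regular, predictable signal should score far lower sample entropy than white noise, which is the basic intuition behind using these measures on interbeat intervals.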
Multiscale Analysis of Heart Rate Dynamics: Entropy and Time Irreversibility Measures
Peng, Chung-Kang; Goldberger, Ary L.
2016-01-01
PMID: 18172763
The research on the mean shift algorithm for target tracking
NASA Astrophysics Data System (ADS)
CAO, Honghong
2017-06-01
The traditional mean shift algorithm for target tracking is effective and runs in real time, but it still has shortcomings: it easily falls into local optima during tracking, it loses effectiveness when the object moves quickly, and because the size of the tracking window never changes, the method fails when the size of the moving object changes. We therefore propose a new method that uses particle swarm optimization to optimize the mean shift algorithm for target tracking, while SIFT (scale-invariant feature transform) and an affine transformation make the size of the tracking window adaptive. Finally, we evaluate the method through comparative experiments. Experimental results indicate that the proposed method can effectively track the object while adapting the size of the tracking window.
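The core mean shift step the abstract builds on replaces the current estimate with a kernel-weighted mean of the samples, climbing toward a local density mode, and it is this local hill-climbing that makes the plain algorithm prone to local optima. A 1-D sketch with a Gaussian kernel (the data and bandwidth are illustrative):

```python
import numpy as np

def mean_shift_mode(samples, start, bandwidth=1.0, iters=50, tol=1e-6):
    """Iterate x <- weighted mean of samples near x (Gaussian kernel);
    the fixed point is a local mode of the kernel density estimate."""
    x = float(start)
    for _ in range(iters):
        w = np.exp(-0.5 * ((samples - x) / bandwidth) ** 2)
        x_new = np.sum(w * samples) / np.sum(w)
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x

rng = np.random.default_rng(0)
# Two well-separated clusters: modes near 0.0 and 5.0
data = np.concatenate([rng.normal(0.0, 0.5, 300), rng.normal(5.0, 0.5, 300)])
mode = mean_shift_mode(data, start=4.0, bandwidth=0.5)
```

Note that the converged mode depends on the starting point: initializing near one cluster finds that cluster's mode, which is exactly the local-optimum behavior the paper's particle swarm step is meant to mitigate.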
Simulation Study of Effects of the Blind Deconvolution on Ultrasound Image
NASA Astrophysics Data System (ADS)
He, Xingwu; You, Junchen
2018-03-01
Ultrasound image restoration is an essential topic in medical ultrasound imaging. However, without sufficient and precise system knowledge, traditional image restoration methods based on prior knowledge of the system often fail to improve image quality. In this paper, we use simulated ultrasound images to assess the effectiveness of blind deconvolution for ultrasound image restoration. Experimental results demonstrate that blind deconvolution can be applied to ultrasound image restoration and achieves satisfactory results without precise prior knowledge, in contrast to the traditional restoration method. Even with an inaccurate, small initial PSF, blind deconvolution improved the overall quality of the ultrasound images, yielding much better SNR and image resolution. We also report the time consumption of these methods; on a GPU platform, blind deconvolution showed no significant increase in processing time.
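One common blind deconvolution scheme, alternating Richardson-Lucy multiplicative updates for the image and the PSF, can be sketched as follows. This is a generic illustration on synthetic data with arbitrary iteration counts, not the paper's specific algorithm or simulation setup.

```python
import numpy as np
from scipy.signal import fftconvolve

def center_crop(a, shape):
    oy = (a.shape[0] - shape[0]) // 2
    ox = (a.shape[1] - shape[1]) // 2
    return a[oy:oy + shape[0], ox:ox + shape[1]]

def blind_richardson_lucy(observed, psf_size=5, outer=8, eps=1e-12):
    """Alternate RL updates: hold the image fixed to update the PSF,
    then hold the PSF fixed to update the image."""
    img = np.full_like(observed, observed.mean())
    psf = np.full((psf_size, psf_size), 1.0 / psf_size ** 2)  # flat initial guess
    for _ in range(outer):
        # PSF update (image fixed): correlate the ratio image with the estimate
        ratio = observed / (fftconvolve(img, psf, mode='same') + eps)
        corr = fftconvolve(ratio, img[::-1, ::-1], mode='same')
        psf = np.clip(psf * center_crop(corr, psf.shape), 0.0, None)
        psf /= psf.sum() + eps                 # keep the PSF normalized
        # Image update (PSF fixed): standard RL multiplicative step
        ratio = observed / (fftconvolve(img, psf, mode='same') + eps)
        img = np.clip(img * fftconvolve(ratio, psf[::-1, ::-1], mode='same'), 0.0, None)
    return img, psf

# Synthetic scene: two point sources blurred by a Gaussian PSF
yy, xx = np.mgrid[-2:3, -2:3]
true_psf = np.exp(-(xx ** 2 + yy ** 2) / 2.0)
true_psf /= true_psf.sum()
scene = np.zeros((32, 32))
scene[8, 8] = scene[20, 24] = 1.0
observed = fftconvolve(scene, true_psf, mode='same')
restored, est_psf = blind_richardson_lucy(observed)
```

The multiplicative form preserves non-negativity of both estimates, which matters for intensity images; in practice regularization and careful stopping rules are needed, since blind deconvolution is badly ill-posed.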
Real-Time Culture Change Improves Lean Success: Sequenced Culture Change Gets Failing Grades.
Kusy, Mitchell; Diamond, Marty; Vrchota, Scott
2015-01-01
Success with the Lean management system is rooted in a culture of stakeholder engagement and commitment. Unfortunately, many leaders view Lean as an "add-on" tool instead of one that requires a new way of thinking and approaching culture. This article addresses the "why, how, and what" to promote a Lean culture that works. We present a five-phased approach grounded in evidence-based practices of real-time culture change. We further help healthcare leaders understand the differences between traditional "sequenced" approaches to culture change and "real-time" methods--and why these real-time practices are more sustainable and ultimately more successful than traditional culture change methods.
Rationale and Suggestions for Emphasizing Afrocentricity in the Public Schools.
ERIC Educational Resources Information Center
Honeman, Bob
Public schools are facing the largest crisis in American educational history. More and more minority students are attending public schools, and traditional methods of instruction are failing to meet their needs. The back-to-basics and belt-tightening movements within public schooling are part of a last-ditch effort to support an elitist education…
Accelerated Stress-Corrosion Testing
NASA Technical Reports Server (NTRS)
1986-01-01
Test procedures for accelerated stress-corrosion testing of high-strength aluminum alloys are faster and provide more quantitative information than traditional pass/fail tests. The method uses data from specimen sets exposed to a corrosive environment at several levels of applied static tensile stress for selected exposure times and then tensile tested to failure. The method is potentially applicable to other degrading phenomena (such as fatigue, corrosion fatigue, fretting, wear, and creep) that promote the development and growth of cracklike flaws within a material.
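The quantitative payoff of tensile testing after exposure, rather than scoring pass/fail, is that residual strength can be fit as a trend against exposure time. A hedged sketch with made-up numbers; the linear-in-log-time decay model is an assumption for illustration, not the Tech Brief's model.

```python
import numpy as np

# Hypothetical residual tensile strengths (MPa) after corrosive exposure,
# generated from a noiseless synthetic decay law for illustration.
exposure_days = np.array([1.0, 3.0, 10.0, 30.0])
strength = 500.0 - 40.0 * np.log(exposure_days)

# Fit strength = intercept + slope * log(t), then invert the trend to
# estimate when strength degrades to a chosen 400 MPa threshold.
slope, intercept = np.polyfit(np.log(exposure_days), strength, 1)
t_critical = np.exp((400.0 - intercept) / slope)
```

With real specimens the fit would include scatter and per-stress-level curves, but the same inversion gives a time-to-threshold estimate that a pass/fail protocol cannot provide.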
Traditional Project Management and the Visual Workplace Environment to Improve Project Success
ERIC Educational Resources Information Center
Fichera, Christopher E.
2016-01-01
A majority of large IT projects fail to meet scheduled deadlines, are over budget and do not satisfy the end user. Many projects fail in spite of utilizing traditional project management techniques. Research of project management has not identified the use of a visual workspace as a feature affecting or influencing the success of a project during…
NASA Technical Reports Server (NTRS)
Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey
1993-01-01
Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultrareliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.
ERIC Educational Resources Information Center
Idowu, Olumuyiwa Ayodeji
2013-01-01
Over the past 2 years, almost 45% of the students attending a local suburban high school failed Algebra 2. The purpose of this study was to compare the impact of a cooperative instructional technique (student teams-achievement divisions [STAD]) to traditional instructional methods on performance in high school algebra. Motivational and cognitive…
Detecting spatial regimes in ecosystems
Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory based method, on both terrestrial and aquatic animal data (US Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps, and multivariate analysis such as nMDS (non-metric Multidimensional Scaling) and cluster analysis. We successfully detect spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
Significance of Social Applications on a Mobile Phone for English Task-Based Language Learning
ERIC Educational Resources Information Center
Ahmad, Anmol; Farrukh, Fizza
2015-01-01
The utter importance of knowing the English language cannot be denied today. Despite the existence of traditional methods for teaching a language in schools, a big number of children are left without the requisite knowledge of English as a result of which they fail to compete in the modern world. With English being a Lingua Franca, more efforts…
ERIC Educational Resources Information Center
Gökçe, Semirhan; Yenmez, Arzu Aydogan; Özpinar, Ilknur
2017-01-01
Recent developments in technology have changed the learner's profile and the learning outcomes. Today, with the emergence of higher-order thinking skills and computer literacy skills, teaching through traditional methods is likely to fail to achieve the learning outcomes. That is why teachers and teacher candidates are expected to have computer…
SDN solutions for switching dedicated long-haul connections: Measurements and comparative analysis
Rao, Nageswara S. V.
2016-01-01
We consider a scenario of two sites connected over a dedicated, long-haul connection that must quickly fail-over in response to degradations in host-to-host application performance. The traditional layer-2/3 hot stand-by fail-over solutions do not adequately address the variety of application degradations, and more recent single-controller Software Defined Networks (SDN) solutions are not effective for long-haul connections. We present two methods for such a path fail-over using OpenFlow-enabled switches: (a) a light-weight method that utilizes host scripts to monitor application performance and the dpctl API for switching, and (b) a generic method that uses two OpenDaylight (ODL) controllers and REST interfaces. For both methods, the restoration dynamics of applications contain significant statistical variations due to the complexities of controllers, northbound interfaces and switches; they, together with the wide variety of vendor implementations, complicate the choice among such solutions. We develop the impulse-response method based on regression functions of performance parameters to provide a rigorous and objective comparison of different solutions. We describe testing results of the two proposed methods, using TCP throughput and connection RTT as main parameters, over a testbed consisting of HP and Cisco switches connected over long-haul connections emulated in hardware by ANUE devices. Lastly, the combination of analytical and experimental results demonstrates that the dpctl method responds seconds faster than the ODL method on average, even though both methods eventually restore the original TCP throughput.
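The light-weight method's control flow can be sketched as below: a host script samples application throughput and, on degradation, emits a flow rewrite steering traffic out the backup port. The threshold, addresses, port numbers, and the dpctl invocation are illustrative assumptions; real dpctl flag syntax varies by version, so the sketch only builds the command rather than asserting its exact form works everywhere.

```python
THRESHOLD_GBPS = 5.0  # illustrative degradation threshold

def failover_command(switch_addr, in_port, backup_port):
    """Build a dpctl-style flow-mod that redirects traffic to the backup
    port. Hypothetical syntax modeled on common dpctl usage."""
    flow = f"in_port={in_port},actions=output:{backup_port}"
    return ["dpctl", "add-flow", switch_addr, flow]

def monitor_step(throughput_gbps, switch_addr="tcp:10.0.0.2:6634",
                 in_port=1, backup_port=3):
    """One monitoring tick: return the fail-over command to execute,
    or None if the primary path is still healthy."""
    if throughput_gbps < THRESHOLD_GBPS:
        return failover_command(switch_addr, in_port, backup_port)
    return None
```

In the paper's setup this decision loop would run on the end host (driven by measured TCP throughput) and hand the command to a subprocess call; keeping the decision logic separate from command execution makes it easy to dry-run and test.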
Personalized medicine: a confluence of traditional and contemporary medicine.
Jafari, Samineh; Abdollahi, Mohammad; Saeidnia, Soodabeh
2014-01-01
Traditional systems of medicine have attained great popularity among patients in recent years. The success of these systems in the treatment of disease warrants consideration, particularly in cases for which conventional medicine has been insufficient. This study investigates the similarities in principles and approaches of 3 traditional systems and explores whether conventional medicine is able to exploit the advantages of traditional systems. This study first identifies and explores the advantages of 3 well-known systems that are similar in their basic principles and methods: traditional Iranian medicine (TIM), Ayurveda, and traditional Chinese medicine (TCM). Second, it clarifies whether and how conventional medicine could exploit the advantages of traditional systems as it modernizes, to become more personalized. Finally, this study investigates the possibility that conventional medicine could benefit from traditional typology to improve its personalization. The acknowledgment of the unity of humans and nature, applying rational methods, and personalized approaches is fundamentally similar in the 3 systems. Additionally, they all promote the holistic view that health is harmony and disease is disharmony of the body. Other similarities include their recognition of the unique nature of every person and their categorization of people into different body types. Although conventional medicine has mostly failed to incorporate the advantages of traditional medicine, its integration with traditional medicine is achievable. For instance, exploiting traditional typologies in genomic and other studies may facilitate personalization of conventional medicine. From its review, the research team concludes that prospects are bright for the integration of traditional and conventional medicines and, consequently, for a dramatic improvement in health systems.
Image gathering and restoration - Information and visual quality
NASA Technical Reports Server (NTRS)
Mccormick, Judith A.; Alter-Gartenberg, Rachel; Huck, Friedrich O.
1989-01-01
A method is investigated for optimizing the end-to-end performance of image gathering and restoration for visual quality. To achieve this objective, one must inevitably confront the problems that the visual quality of restored images depends on perceptual rather than mathematical considerations and that these considerations vary with the target, the application, and the observer. The method adopted in this paper is to optimize image gathering informationally and to restore images interactively to obtain the visually preferred trade-off among fidelity, resolution, sharpness, and clarity. The results demonstrate that this method leads to significant improvements in visual quality over traditional digital processing methods. These traditional methods allow a significant loss of visual quality to occur because they treat the design of the image-gathering system and the formulation of the image-restoration algorithm as two separate tasks and fail to account for the transformations between the continuous and the discrete representations in image gathering and reconstruction.
Dressel, Anne; Schneider, Robert; DeNomie, Melissa; Kusch, Jennifer; Welch, Whitney; Sosa, Mirtha; Yeldell, Sally; Maida, Tatiana; Wineberg, Jessica; Holt, Keith; Bernstein, Rebecca
2017-09-01
Most low-income Americans fail to meet physical activity recommendations. Inactivity and poor diet contribute to obesity, a risk factor for multiple chronic diseases. Health promotion activities have the potential to improve health outcomes for low-income populations. Measuring the effectiveness of these activities, however, can be challenging in community settings. A "Biking for Health" study tested the impact of a bicycling intervention on overweight or obese low-income Latino and African American adults to reduce barriers to cycling and increase physical activity and fitness. A randomized controlled trial was conducted in Milwaukee, Wisconsin, in summer 2015. A 12-week bicycling intervention was implemented at two sites with low-income, overweight, or obese Latino and African American adults. We found that randomized controlled trial methodology was suboptimal for use in this small pilot study and that it negatively affected participation. More discussion is needed about the effectiveness of using traditional research methods in community settings to assess the effectiveness of health promotion interventions. Modifications or alternative methods may yield better results. The aim of this article is to discuss the effectiveness and feasibility of using traditional research methods to assess health promotion interventions in community-based settings.
A saltwater flotation technique to identify unincubated eggs
Devney, C.A.; Kondrad, S.L.; Stebbins, K.R.; Brittingham, K.D.; Hoffman, D.J.; Heinz, G.H.
2009-01-01
Field studies on nesting birds sometimes involve questions related to nest initiation dates, length of the incubation period, or changes in parental incubation behavior during various stages of incubation. Some of this information can be best assessed when a nest is discovered before the eggs have undergone any incubation, and this has traditionally been assessed by floating eggs in freshwater. However, because the freshwater method is not particularly accurate in identifying unincubated eggs, we developed a more reliable saltwater flotation method. The saltwater method involves diluting a saturated saltwater solution with freshwater until a salt concentration is reached where unincubated eggs sink to the bottom and incubated eggs float to the surface. For Laughing Gulls (Leucophaeus atricilla), floating eggs in freshwater failed to identify 39.0% (N = 251) of eggs that were subsequently found by candling to have undergone incubation prior to collection. By contrast, in a separate collection of gull eggs, no eggs that passed the saltwater test (N = 225) were found by a later candling to have been incubated prior to collection. For Double-crested Cormorants (Phalacrocorax auritus), floating eggs in freshwater failed to identify 15.6% (N = 250) of eggs that had undergone incubation prior to collection, whereas in a separate collection, none of the eggs that passed the saltwater test (N = 85) were found by a later candling to have been incubated prior to collection. Immersion of eggs in saltwater did not affect embryo survival. Although use of the saltwater method is likely limited to colonial species and requires calibrating a saltwater solution, it is a faster and more accurate method of identifying unincubated eggs than the traditional method of floating eggs in freshwater.
The Public and Private Domains: Intellectual Property Rights in Traditional Knowledge.
ERIC Educational Resources Information Center
Dutfield, Graham
2000-01-01
Defines traditional knowledge and its uses and argues that intellectual property law contains an in-built bias that protects the intangible assets of companies while failing to recognize traditional knowledge as protectable subject matter. Discusses the globalization of intellectual property rights and the unjust situation for indigenous people.…
Achieving real-time capsule endoscopy (CE) video visualization through panoramic imaging
NASA Astrophysics Data System (ADS)
Yi, Steven; Xie, Jean; Mui, Peter; Leighton, Jonathan A.
2013-02-01
In this paper, we present a novel, real-time capsule endoscopy (CE) video visualization concept based on panoramic imaging. Typical CE videos run about 8 hours and are manually reviewed by physicians to locate diseases such as bleeding and polyps. To date, there is no commercially available tool capable of providing stabilized and processed CE video that is easy to analyze in real time, which places a heavy burden on physicians' disease-finding efforts. In fact, since the CE camera sensor has a limited forward-looking view and low image frame rate (typically 2 frames per second), and captures very close-range imagery of the GI tract surface, it is no surprise that traditional visualization methods based on tracking and registration often fail. This paper presents a novel concept for real-time CE video stabilization and display. Instead of working directly on traditional forward-looking FOV (field of view) images, we work on panoramic images to bypass many problems facing traditional imaging modalities. Methods for panoramic image generation based on optical lens principles leading to real-time data visualization will be presented. In addition, non-rigid panoramic image registration methods will be discussed.
A Novel Approach to Rotorcraft Damage Tolerance
NASA Technical Reports Server (NTRS)
Forth, Scott C.; Everett, Richard A.; Newman, John A.
2002-01-01
Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage tolerance (DT) into high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. The rapid accumulation of cycles in an HCF component will result in a traditional DT design that either demands impractically frequent inspections or is too heavy to operate efficiently. Furthermore, once an HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing an HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.
NASA Astrophysics Data System (ADS)
Shaw, Stephen B.; Walter, M. Todd
2009-03-01
The Soil Conservation Service curve number (SCS-CN) method is widely used to predict storm runoff for hydraulic design purposes, such as sizing culverts and detention basins. As traditionally used, the probability of calculated runoff is equated to the probability of the causative rainfall event, an assumption that fails to account for the influence of variations in soil moisture on runoff generation. We propose a modification to the SCS-CN method that explicitly incorporates rainfall return periods and the frequency of different soil moisture states to quantify storm runoff risks. Soil moisture status is assumed to be correlated to stream base flow. Fundamentally, this approach treats runoff as the outcome of a bivariate process instead of dictating a 1:1 relationship between causative rainfall and resulting runoff volumes. Using data from the Fall Creek watershed in western New York and the headwaters of the French Broad River in the mountains of North Carolina, we show that our modified SCS-CN method improves frequency discharge predictions in medium-sized watersheds in the eastern United States in comparison to the traditional application of the method.
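The traditional SCS-CN calculation that the authors modify can be sketched as follows. This is a minimal illustration of the standard curve-number equations only, not the proposed bivariate modification; the storm and curve-number values are illustrative:

```python
# Traditional SCS-CN direct-runoff calculation (standard form; the paper's
# modification adds soil-moisture frequency on top of this).

def scs_cn_runoff(P, CN, ia_ratio=0.2):
    """Direct runoff Q (inches) from rainfall P (inches) and curve number CN.

    S  = potential maximum retention = 1000/CN - 10
    Ia = initial abstraction, traditionally 0.2 * S
    Q  = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0
    """
    S = 1000.0 / CN - 10.0
    Ia = ia_ratio * S
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

# Example: a 4-inch storm on a watershed with CN = 80
# S = 2.5, Ia = 0.5, Q = 3.5^2 / (3.5 + 2.5) ~= 2.04 inches
q = scs_cn_runoff(4.0, 80)
```

As traditionally applied, the return period of `q` would simply be equated to that of the 4-inch storm; the modification described above instead treats the antecedent-moisture-dependent retention as a second random variable.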
NASA Astrophysics Data System (ADS)
Lei, Sen; Zou, Zhengxia; Liu, Dunge; Xia, Zhenghuan; Shi, Zhenwei
2018-06-01
Sea-land segmentation is a key step in the information processing of ocean remote sensing images. Traditional sea-land segmentation algorithms ignore the local similarity prior of sea and land, and thus fail in complex scenarios. In this paper, we propose a new sea-land segmentation method for infrared remote sensing images that tackles the problem using superpixels and multi-scale features. Considering the connectivity and local similarity of sea or land, we interpret the sea-land segmentation task in terms of superpixels rather than pixels, so that similar pixels are clustered and the local similarity is exploited. Moreover, the multi-scale features are elaborately designed, comprising a gray histogram and multi-scale total variation. Experimental results on infrared bands of Landsat-8 satellite images demonstrate that the proposed method obtains more accurate and more robust sea-land segmentation results than the traditional algorithms.
Detecting spatial regimes in ecosystems
Sundstrom, Shana M.; Eason, Tarsha; Nelson, R. John; Angeler, David G.; Barichievy, Chris; Garmestani, Ahjond S.; Graham, Nicholas A.J.; Granholm, Dean; Gunderson, Lance; Knutson, Melinda; Nash, Kirsty L.; Spanbauer, Trisha; Stow, Craig A.; Allen, Craig R.
2017-01-01
Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory-based method, on both terrestrial and aquatic animal data (U.S. Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps and multivariate analyses such as nMDS and cluster analysis. We successfully detected spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
NASA Technical Reports Server (NTRS)
Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.
1984-01-01
A technique is demonstrated for accelerated stress corrosion testing of high-strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen's load-carrying ability due to environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress-corroded specimen by an effective flaw size calculated from the breaking stress and the material's strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
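The extreme-value analysis of breaking-load data can be sketched with a standard two-parameter Weibull fit by median-rank regression, a common textbook approach for strength data. The breaking loads below are synthetic and the fit is illustrative only, not the report's actual analysis:

```python
# Weibull median-rank regression on breaking loads (illustrative sketch).
import numpy as np

def weibull_fit(strengths):
    """Return (shape, scale) of a 2-parameter Weibull fitted by regression
    of ln(-ln(1-F)) on ln(strength), with Bernard's median-rank estimate."""
    x = np.sort(np.asarray(strengths, float))
    n = len(x)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's approximation
    shape, c = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
    scale = np.exp(-c / shape)                     # c = -shape * ln(scale)
    return shape, scale

def survival(stress, shape, scale):
    """Probability a specimen survives loading to the given stress."""
    return np.exp(-(stress / scale) ** shape)

# Synthetic breaking loads (ksi) for one exposure group
loads = [38.1, 41.5, 43.2, 44.0, 45.7, 46.3, 47.9, 49.2]
k, lam = weibull_fit(loads)
p30 = survival(30.0, k, lam)   # survival probability at a 30 ksi stress
```

A threshold stress for the test conditions could then be read off as the stress at which the fitted survival probability crosses a chosen level.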
NASA Astrophysics Data System (ADS)
Pacheco-Sanchez, Anibal; Claus, Martin; Mothes, Sven; Schröter, Michael
2016-11-01
Three different methods for the extraction of the contact resistance based on both the well-known transfer length method (TLM) and two variants of the Y-function method have been applied to simulation and experimental data of short- and long-channel CNTFETs. While for TLM special CNT test structures are mandatory, standard electrical device characteristics are sufficient for the Y-function methods. The methods have been applied to CNTFETs with low and high channel resistance. It turned out that the standard Y-function method fails to deliver the correct contact resistance in case of a relatively high channel resistance compared to the contact resistances. A physics-based validation is also given for the application of these methods based on applying traditional Si MOSFET theory to quasi-ballistic CNTFETs.
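The TLM extraction mentioned above amounts to a linear fit of total device resistance against channel length, with the intercept at zero length giving twice the (per-contact) contact resistance. A minimal sketch with synthetic, illustrative numbers (not data from the paper):

```python
# Transfer length method (TLM) sketch: extract contact resistance from
# total resistances measured on devices with different channel lengths.
import numpy as np

def tlm_contact_resistance(lengths_um, r_total_ohm):
    """Fit R_total = 2*Rc + r_channel_per_um * L; return (Rc, slope)."""
    slope, intercept = np.polyfit(lengths_um, r_total_ohm, 1)
    return intercept / 2.0, slope

# Synthetic devices: Rc = 5 kOhm per contact, channel resistance 2 kOhm/um
L = np.array([0.5, 1.0, 2.0, 4.0])
R = 2 * 5e3 + 2e3 * L
rc, r_per_um = tlm_contact_resistance(L, R)   # rc ~ 5000 Ohm
```

The Y-function variants avoid the dedicated multi-length test structures by working from a single device's transfer characteristics instead, which is why they fail when the channel resistance dominates the intercept.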
Solutions for Failing High Schools: Converging Visions and Promising Models.
ERIC Educational Resources Information Center
Legters, Nettie; Balfanz, Robert; McPartland, James
Promising solutions to the failings of traditional comprehensive high schools were reviewed to identify basic principles and strategies for improving high schools nationwide. Selected research studies, policy documents, and promising high school programs were reviewed. The review revealed the following principles for helping high schools better…
2012-05-01
…Muslims to liberate them from human rulers and their false laws, values, and traditions. Like other major religions, Islam has several sects. The… costs of failing to do so. They approach the development of strategy in their writings in a secular tone that sets religion aside (but does not take… and methods of engagement are in accord with operational realit[y]… that calls for the independent formation and activation of [global] resistance
Multi-Dimensional Asymptotically Stable 4th Order Accurate Schemes for the Diffusion Equation
NASA Technical Reports Server (NTRS)
Abarbanel, Saul; Ditkowski, Adi
1996-01-01
An algorithm is presented which solves the multi-dimensional diffusion equation on complex shapes to 4th-order accuracy and is asymptotically stable in time. This bounded-error result is achieved by constructing, on a rectangular grid, a differentiation matrix whose symmetric part is negative definite. The differentiation matrix accounts for the Dirichlet boundary condition by imposing penalty-like terms. Numerical examples in 2-D show that the method is effective even where standard schemes, stable by traditional definitions, fail.
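The negative-definiteness criterion can be checked numerically. The sketch below uses a simple second-order Dirichlet differentiation matrix in 1-D, not the 4th-order penalty construction of the paper, purely to illustrate the test on the symmetric part:

```python
# Check that the symmetric part of a (second-order, Dirichlet) diffusion
# differentiation matrix is negative definite -- the stability criterion
# described in the abstract. Illustrative 1-D example, not the 4th-order scheme.
import numpy as np

n, h = 50, 1.0 / 51
D = (np.diag(-2.0 * np.ones(n))
     + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2   # D2 with homogeneous Dirichlet BCs

sym = 0.5 * (D + D.T)                        # symmetric part (here D itself)
eigs = np.linalg.eigvalsh(sym)
assert eigs.max() < 0   # negative definite => bounded-error time integration
```

For the penalty-based 4th-order matrix the same check applies; negative-definiteness of the symmetric part bounds the semi-discrete energy for all time.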
ERIC Educational Resources Information Center
Dong, Xuan
2008-01-01
Seventeen textbooks are examined for the quantity and quality of their material pertaining to ideas in the symbolic interaction tradition. Most of the textbooks fail to discuss at least some of the ideas in this tradition. In the five exceptions, the texts only include material from the Chicago school of this tradition, with only a little inclusive…
Liu, Rui; Chen, Pei; Aihara, Kazuyuki; Chen, Luonan
2015-01-01
Identifying early-warning signals of a critical transition for a complex system is difficult, especially when the target system is constantly perturbed by big noise, which makes the traditional methods fail due to the strong fluctuations of the observed data. In this work, we show that the critical transition is not traditional state-transition but probability distribution-transition when the noise is not sufficiently small, which, however, is a ubiquitous case in real systems. We present a model-free computational method to detect the warning signals before such transitions. The key idea behind is a strategy: “making big noise smaller” by a distribution-embedding scheme, which transforms the data from the observed state-variables with big noise to their distribution-variables with small noise, and thus makes the traditional criteria effective because of the significantly reduced fluctuations. Specifically, increasing the dimension of the observed data by moment expansion that changes the system from state-dynamics to probability distribution-dynamics, we derive new data in a higher-dimensional space but with much smaller noise. Then, we develop a criterion based on the dynamical network marker (DNM) to signal the impending critical transition using the transformed higher-dimensional data. We also demonstrate the effectiveness of our method in biological, ecological and financial systems. PMID:26647650
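The "making big noise smaller" idea can be illustrated with a toy sketch: replacing noisy state observations by window-wise moments (here just the first moment) yields distribution-variables whose fluctuations shrink roughly as 1/sqrt(window size). This is only a schematic of the embedding step, not the full dynamical-network-marker criterion:

```python
# Toy sketch of the distribution-embedding idea: moments of windowed data
# fluctuate far less than the raw state observations.
import numpy as np

rng = np.random.default_rng(0)
x = 1.0 + 0.5 * rng.standard_normal(10_000)   # state-variable with big noise

w = 100
# first moment (mean) per non-overlapping window of size w
moments = x[: len(x) // w * w].reshape(-1, w).mean(axis=1)

# std of raw data is ~0.5; std of the moment series is ~0.5/sqrt(100) = 0.05
print(x.std(), moments.std())
```

Higher moments (variance, skewness, ...) extend the state space in the same way, so traditional early-warning indicators can then be applied to the much less noisy moment dynamics.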
Melting Frozen Droplets Using Photo-Thermal Traps
NASA Astrophysics Data System (ADS)
Dash, Susmita; de Ruiter, Jolet; Varanasi, Kripa
2017-11-01
Ice buildup is an operational and safety hazard in wind turbines, power lines, and airplanes. While traditional de-icing methods are energy-intensive or environmentally unfriendly, passive anti-icing approach using superhydrophobic surfaces fails under humid conditions, which necessitates development of passive deicing methods. Here, we investigate a passive technique for deicing using a multi-layer surface design that can efficiently absorb and convert the incident solar radiation to heat. The corresponding increase in substrate temperature allows for easy removal of frozen droplets from the surface. We demonstrate the deicing performance of the designed surface both at very low temperatures, and under frost and snow coverage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giarra, Matthew N.; Charonko, John J.; Vlachos, Pavlos P.
2015-02-05
Traditional particle image velocimetry (PIV) uses discrete Cartesian cross correlations (CCs) to estimate the displacements of groups of tracer particles within small subregions of sequentially captured images. However, these CCs fail in regions with large velocity gradients or high rates of rotation. In this paper, we propose a new PIV correlation method based on the Fourier–Mellin transformation (FMT) that enables direct measurement of the rotation and dilation of particle image patterns. In previously unresolvable regions of large rotation, our algorithm significantly improves the velocity estimates compared to traditional correlations by aligning the rotated and stretched particle patterns prior to performing Cartesian correlations to estimate their displacements. Furthermore, our algorithm, which we term Fourier–Mellin correlation (FMC), reliably measures particle pattern displacement between pairs of interrogation regions with up to ±180° of angular misalignment, compared to 6–8° for traditional correlations, and dilation/compression factors of 0.5–2.0, compared to 0.9–1.1 for a single iteration of traditional correlations.
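The rotation-recovery idea underlying FMC, namely that a rotation becomes a translation in polar coordinates and can be found by correlation, can be sketched in simplified form. The code below uses angular intensity profiles and circular correlation only; it is a toy illustration, not the authors' full Fourier–Mellin algorithm (which also handles dilation via the log-radial axis):

```python
# Simplified illustration: recover image rotation via circular correlation
# of angular intensity profiles (the rotation part of the polar idea).
import numpy as np

def angular_profile(img, n_theta=360):
    """Sum intensity along rays from the image center, one bin per degree."""
    n = img.shape[0]
    c = (n - 1) // 2
    radii = np.arange(1, c)
    prof = np.zeros(n_theta)
    for i in range(n_theta):
        t = np.deg2rad(i)
        xs = np.round(c + radii * np.cos(t)).astype(int)
        ys = np.round(c + radii * np.sin(t)).astype(int)
        prof[i] = img[ys, xs].sum()
    return prof

def estimate_rotation(img_a, img_b, n_theta=360):
    """Rotation (degrees) of img_b relative to img_a via circular correlation."""
    fa = np.fft.fft(angular_profile(img_a, n_theta))
    fb = np.fft.fft(angular_profile(img_b, n_theta))
    corr = np.fft.ifft(fa * np.conj(fb)).real
    return (n_theta - int(np.argmax(corr))) % n_theta

# Toy check: a radial streak drawn at 0 deg vs the same streak at 30 deg
def streak(angle_deg, n=65):
    img, c, t = np.zeros((n, n)), (n - 1) // 2, np.deg2rad(angle_deg)
    for r in range(5, 26):
        img[int(round(c + r * np.sin(t))), int(round(c + r * np.cos(t)))] = 1.0
    return img

est = estimate_rotation(streak(0), streak(30))
```

Once the rotation (and, in the full method, dilation) is known, the patterns can be counter-rotated and a standard Cartesian correlation applied, which is exactly the alignment step the abstract describes.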
Nodal Diffusion Burnable Poison Treatment for Prismatic Reactor Cores
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. M. Ougouag; R. M. Ferrer
2010-10-01
The prismatic block version of the High Temperature Reactor (HTR), considered as a candidate Very High Temperature Reactor (VHTR) design, may use burnable poison pins at some corners of the fuel blocks (i.e., assembly-equivalent structures). The presence of any highly absorbing materials, such as these burnable poisons, within fuel blocks of hexagonal-geometry, graphite-moderated High Temperature Reactors (HTRs) causes a local inter-block flux depression that most nodal diffusion-based methods have failed to properly model or otherwise represent. The location of these burnable poisons near vertices results in an asymmetry in the morphology of the assemblies (or blocks); hence the inadequacy of traditional homogenization methods, which "spread" the actually local effect of the burnable poisons throughout the assembly. Furthermore, the effect of the burnable poison is primarily local, with influence in its immediate vicinity, which happens to include a small region within the same assembly as well as similar regions in the adjacent assemblies. Traditional homogenization methods miss this artifact entirely. This paper presents a novel method for treating the local effect of the burnable poison explicitly in the context of a modern nodal method.
Bringing Culture into Parent Training with Latinos
ERIC Educational Resources Information Center
Calzada, Esther J.
2010-01-01
Traditional frameworks of parenting have failed to capture the distinctive nature of parenting in Latino families. Cultural values likely influence parenting practices. The study of cultural values may allow us to identify aspects of parenting that are unique to Latinos and which complement traditional frameworks of parenting. This paper presents…
Froneman, Salome; Kapp, Paul A
2017-10-13
The practice of traditional circumcision is associated with considerable morbidity and mortality, yet there is a paucity of literature that provides an understanding of the cultural values that influence men to choose traditional rather than medical circumcision. The aim of this study was to better understand the culture surrounding traditional circumcision, with a view to addressing the morbidity and mortality rates associated with the Xhosa male initiation rituals. We explored Xhosa men's perceptions regarding the need for traditional circumcision, its risks, the social pressure to undergo it, the impact of non-initiation or failed initiation, and the perceived barriers to obtaining medical help for the complications of traditional circumcisions. Individual in-depth interviews were conducted with 10 purposively sampled teenagers and adult men. The interviews were recorded, translated, transcribed and analysed using the framework method. Traditional circumcision was seen as essential to Xhosa culture. Participants rationalised many reasons for participating, including personal growth and development, family and peer pressure, independence and knowledge gained, a connection with ancestors and initiation into manhood. Despite publicity of the dangers of traditional circumcision and the hardships they have to endure, most young men still saw this process as necessary and worthwhile. Traditional initiation and circumcision are here to stay. The majority of boys still trust the elders and supernatural processes to guide them. However, some participants welcomed government initiatives to reduce human error causing unnecessary death and suffering. Current systems to prevent morbidity and mortality are insufficient and should be prioritised.
Artifon, Everson L.A.; Loureiro, Jarbas F.; Baron, Todd H.; Fernandes, Kaie; Kahaleh, Michel; Marson, Fernando P.
2015-01-01
Background and Objectives: Endoscopic retrograde cholangiopancreatography (ERCP) is the method of choice for drainage in patients with distal malignant biliary obstruction, but it fails in up to 10% of cases. Percutaneous transhepatic cholangiography (PTC) and surgical bypass are the traditional drainage alternatives. This study aimed to compare technical and clinical success, quality of life, and survival of surgical biliary bypass or hepaticojejunostomy (HJT) and endoscopic ultrasound (EUS)-guided choledochoduodenostomy (CDT) in patients with distal malignant bile duct obstruction and failed ERCP. Patients and Methods: A prospective, randomized trial was conducted. From March 2011 to September 2013, 32 patients with malignant distal biliary obstruction and failed ERCP were studied. The HJT group consisted of 15 patients and the CDT group consisted of 14 patients. Technical and clinical success, quality of life, and survival were assessed prospectively. Results: Technical success was 94% (15/16) in the HJT group and 88% (14/16) in the CDT group (P = 0.598). Clinical success occurred in 14 (93%) patients in the HJT group and in 10 (71%) patients in the CDT group (P = 0.169). During follow-up, a statistically significant difference was seen in mean functional capacity scores, physical health, pain, social functioning, and emotional and mental health aspects in both techniques (P < 0.05). The median survival time in both groups was the same (82 days). Conclusion: Data relating to technical and clinical success, quality of life, and survival were similar in patients who underwent HJT and CDT drainage after failed ERCP for malignant distal biliary obstruction. PMID:26374583
Blind restoration of retinal images degraded by space-variant blur with adaptive blur estimation
NASA Astrophysics Data System (ADS)
Marrugo, Andrés G.; Millán, María S.; Šorel, Michal; Šroubek, Filip
2013-11-01
Retinal images are often degraded by blur that varies across the field of view. Because traditional deblurring algorithms assume the blur to be space-invariant, they typically fail in the presence of space-variant blur. In this work we consider the blur to be both unknown and space-variant. To carry out the restoration, we assume that in small regions the space-variant blur can be approximated by a space-invariant point-spread function (PSF). However, instead of deblurring the image on a per-patch basis, we extend individual PSFs by linear interpolation and perform a global restoration. Because the blind estimation of local PSFs may fail, we propose a strategy for identifying valid local PSFs and perform interpolation to obtain the space-variant PSF. The method was tested on artificially and naturally degraded retinal images. Results show significant improvement in the visibility of subtle details like small blood vessels.
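The PSF-interpolation step can be sketched as follows: given local PSFs estimated at two patches, the space-variant PSF at any intermediate position is a linear blend of the two. This is a hypothetical minimal illustration (Gaussian kernels, interpolation along one axis only), not the paper's blind estimation procedure:

```python
# Minimal sketch of space-variant PSF construction by linear interpolation
# between two local (space-invariant) PSF estimates.
import numpy as np

def gaussian_psf(size, sigma):
    """Normalized isotropic Gaussian kernel as a stand-in for a local PSF."""
    ax = np.arange(size) - size // 2
    k = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def interpolated_psf(col, width, psf_left, psf_right):
    """Space-variant PSF at a given column: blend of the two local estimates."""
    w = col / (width - 1)            # 0 at the left edge, 1 at the right edge
    return (1 - w) * psf_left + w * psf_right

psf_l = gaussian_psf(9, 1.0)   # mild blur estimated on the left patch
psf_r = gaussian_psf(9, 3.0)   # stronger blur estimated on the right patch
mid = interpolated_psf(50, 101, psf_l, psf_r)   # 50/50 blend at the center
```

Because each local PSF is normalized, every interpolated kernel also sums to one, so the blend stays a valid PSF everywhere in the image.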
Zhang, Junhua; Wang, Yuanyuan; Shi, Xinling
2009-12-01
A modified graph cut algorithm was proposed under an elliptical shape constraint to segment cervical lymph nodes on sonograms, and its effect on the measurement of the short-axis to long-axis ratio (S/L) was investigated using the relative ultimate measurement accuracy (RUMA). Under the same user inputs, the proposed algorithm successfully segmented all 60 sonograms tested, while the traditional graph cut failed. The mean RUMA of the developed method was comparable to that of manual segmentation. Results indicated that utilizing the elliptical shape prior can appreciably improve graph cut segmentation of lymph nodes, and the proposed method satisfied the accuracy requirement of S/L measurement.
A suspended dive-net technique for catching territorial divers
Uher-Koch, Brian D.; Rizzolo, Daniel; Wright, Kenneth G.; Schmutz, Joel A.
2016-01-01
A variety of methods such as night-lighting and lift nets have been used to catch divers (Gaviidae), although 24-hour daylight in the Arctic summer and the remote nature of field sites can make the use of these traditional methods impossible. Our research required capture of adult divers at remote locations in northern Alaska. Here we describe a suspended dive-net technique that we used to safely capture territorial White-billed Divers Gavia adamsii and Pacific Divers G. pacifica and that is lightweight and easy to set up. We were also able to capture divers with chicks, as well as failed breeders, and suggest that this method could be used to catch other territorial aquatic diving birds, especially other diver species.
Whose Confession? Which Tradition? (A Preliminary Critique of Penny Thompson, 2004)
ERIC Educational Resources Information Center
Doble, Peter
2005-01-01
What does Penny Thompson really want? Reading her article in "BJRE" 26 (1) proved a baffling experience: it clearly wanted to say something, and to say it passionately, yet signally failed to do so. It fails largely because it lacks an argument; there seems also to be conceptual muddle at its heart. A fuller critique will need to attend…
Self-organizing biopsychosocial dynamics and the patient-healer relationship.
Pincus, David
2012-01-01
The patient-healer relationship has become an area of increasing interest for complementary and alternative medicine (CAM) researchers. This focus on the interpersonal context of treatment is not surprising, as dismantling studies, clinical trials and other linear research designs continually point toward the critical role of context and the broadband biopsychosocial nature of therapeutic responses to CAM. Unfortunately, the same traditional research models and methods that fail to find simple and specific treatment-outcome relations are similarly failing to find simple and specific mechanisms to explain how interpersonal processes influence patient outcomes. This paper presents an overview of some of the key models and methods from nonlinear dynamical systems that are better equipped for empirical testing of CAM outcomes on broadband biopsychosocial processes. Suggestions are made for CAM researchers to assist in modeling the interactions among key process dynamics interacting across biopsychosocial scales: empathy, intra-psychic conflict, physiological arousal, and leukocyte telomerase activity. Finally, some speculations are made regarding the possibility for deeper cross-scale information exchange involving quantum temporal nonlocality. Copyright © 2012 S. Karger AG, Basel.
Beyond Objectivity and Subjectivity: The Intersubjective Foundations of Psychological Science.
Mascolo, Michael F
2016-12-01
The question of whether psychology can properly be regarded as a science has long been debated (Smedslund in Integrative Psychological & Behavioral Science, 50, 185-195, 2016). Science is typically understood as a method for producing reliable knowledge by testing falsifiable claims against objective evidence. Psychological phenomena, however, are traditionally taken to be "subjective" and hidden from view. To the extent that science relies upon objective observation, is a scientific psychology possible? In this paper, I argue that scientific psychology does not so much fail to meet the requirements of objectivity as the concept of objectivity fails as a methodological principle for psychological science. The traditional notion of objectivity relies upon the distinction between a public, observable exterior and a private, subjective interior. There are good reasons, however, to reject this dichotomy. Scholarship suggests that psychological knowledge arises neither from the "inside out" (subjectively) nor from the "outside in" (objectively), but instead from intersubjective processes that occur between people. If this is so, then objectivist methodology may do more to obscure than illuminate our understanding of psychological functioning. From this view, we face a dilemma: do we, in the name of science, cling to an objective epistemology that cuts us off from the richness of psychological activity? Or do we seek to develop a rigorous intersubjective psychology that exploits the processes through which we gain psychological knowledge in the first place? If such a psychology can produce systematic, reliable and useful knowledge, then the question of whether its practices are "scientific" in the traditional sense would become irrelevant.
Grammar Is Back, but When Will We Start Cooking?
ERIC Educational Resources Information Center
Vavra, Ed
2003-01-01
Suggests that the current "return" to grammar will fail unless educators can come to terms with definitions of fundamental grammatical concepts. Considers how educators cannot go back to teaching the traditional, because the traditional no longer exists. Argues that pedagogical grammar currently has too many cooks, all trying to prepare the same…
The Legitimization of the Radical Tradition in France, 1789-1901.
ERIC Educational Resources Information Center
Gough, Hugh
1988-01-01
Traces the development of radicalism from the 1789 French Revolution to the present. States that radical philosophy has its roots in rationalism and Enlightenment thought and was linked to positivism during the nineteenth century. Despite the failings of radicalism and the Radical Party, the radical tradition set precedents for current political…
Critical Pedagogy, Cultural Politics, and the Discourse of Experience.
ERIC Educational Resources Information Center
Giroux, Henry A.
1985-01-01
Analyzes how traditional and liberal discourses treat the intersection of culture, power, and knowledge in fashioning a view of teaching and learning. Argues that both traditions fail as modes of critical pedagogy and that it is necessary to develop a critical discourse that embraces pedagogy as a form of cultural politics. (Author/GC)
We Must Face It: PDSs Have Failed to Innovate
ERIC Educational Resources Information Center
Waters, Richard
2017-01-01
A big picture perspective on the PDS movement reveals a failure to innovate in teacher learning. The vast majority of PDS schools are traditional schools of industrial age design which serve to induct teachers into the profession as traditional classroom teachers thereby neglecting the development of teacher agency, teacher collaboration, and new…
Tech versus the Human Touch: Teacher Affect Is More Effective.
ERIC Educational Resources Information Center
Perry, Alan
2003-01-01
An experimental group studied Macbeth in an independent, constructivist setting using multimedia; the control group studied traditionally. Eleven of 23 experimental students and 2 of 21 in the traditional class failed. In an experiment with Hamlet, the results were reversed. Students were most successful when the teacher was actively involved,…
McLean, Michelle
2004-01-01
To canvass perceptions and experiences of students who had failed Year 2 of a traditional medical program and who chose either to remain in the conventional program (n = 6) or to swap to Curriculum 2001 (C2001), a problem-based learning (PBL) curriculum (n = 14). A year after their decision regarding curriculum choice, students were canvassed (via a largely open-ended survey) about this decision and about their perceptions of their curricular experiences. C2001 students were positive about their PBL experiences. Overwhelmingly, their decision to swap streams had been a good one. They identified PBL features as supporting their learning. Repeating traditional-curriculum students were, however, more circumspect in their opinions. C2001 students had clearly embraced PBL. They were now medical students, largely because of PBL activities underpinned by a sound educational philosophy. This unique case study has provided additional evidence that PBL students are generally more content with their studies than their conventional-curriculum counterparts.
Simulation as a surgical teaching model.
Ruiz-Gómez, José Luis; Martín-Parra, José Ignacio; González-Noriega, Mónica; Redondo-Figuero, Carlos Godofredo; Manuel-Palazuelos, José Carlos
2018-01-01
Teaching of surgery has been affected by many factors in recent years, such as the reduction of working hours, the optimization of operating room use, and patient safety. Traditional teaching methodology fails to reduce the impact of these factors on surgeons' training. Simulation as a teaching model minimizes such impact, and is more effective than traditional teaching methods for integrating knowledge and clinical-surgical skills. Simulation complements clinical assistance with training, creating a safe learning environment where patient safety is not affected, and ethical or legal conflicts are avoided. Simulation uses learning methodologies that allow teaching individualization, adapting it to the learning needs of each student. It also allows training of all kinds of technical, cognitive or behavioural skills. Copyright © 2017 AEC. Published by Elsevier España, S.L.U. All rights reserved.
NASA Astrophysics Data System (ADS)
Mikeš, Daniel
2010-05-01
Theoretical geology. Present-day geology is mostly empirical in nature. I claim that geology is by nature complex and that the empirical approach is bound to fail. Let us consider the input to be the set of ambient conditions and the output to be the sedimentary rock record. I claim that the output can only be deduced from the input if the relation from input to output is known. The fundamental question is therefore the following: can one predict the output from the input, i.e., can one predict the behaviour of a sedimentary system? If one can, the empirical/deductive method has a chance; if one cannot, that method is bound to fail. The fundamental problem to solve is therefore the following: how does one predict the behaviour of a sedimentary system? It is interesting to observe that this question is never asked, yet many a study is conducted by the empirical/deductive method; it seems that the empirical method has been accepted as appropriate without question. It is, however, easy to argue that a sedimentary system is by nature complex, that several input parameters vary at the same time, and that they can create similar output in the rock record. It follows trivially from these first principles that in such a case the deductive solution cannot be unique. At the same time, several geological methods depart precisely from the assumption that one particular variable is the dictator/driver and that the others are constant, even though the data do not support such an assumption. The method of "sequence stratigraphy" is a typical example of such a dogma. It can easily be argued that all interpretation resulting from a method built on uncertain or wrong assumptions is erroneous. Still, this method has survived for many years, notwithstanding all the criticism it has received. This is just one example from the present-day geological world, and it is not unique.
Even the alternative methods criticising sequence stratigraphy actually depart from the same erroneous assumptions and do not solve the fundamental issue that lies at the base of the problem. This problem is straightforward and obvious: a sedimentary system is inherently four-dimensional (3 spatial dimensions + 1 temporal dimension). Any method using a smaller number of dimensions is bound to fail to describe the evolution of a sedimentary system. It is indicative of the present-day geological world that such fundamental issues are overlooked; the only explanation one can point to is the so-called "rationality" of today's society. Simple common sense leads us to the conclusion that in this case the empirical method is bound to fail and the only method that can solve the problem is the theoretical approach. This reasoning is completely trivial for the traditional exact sciences, such as physics and mathematics, and for applied sciences such as engineering. It is not, however, for geology, a science that was traditionally descriptive and jumped to empirical science, skipping the stage of theoretical science. I argue that the gap of theoretical geology has been left open and needs to be filled. Every discipline in geology lacks a theoretical base. This base can only be built by the theoretical/inductive approach and cannot be built by the empirical/deductive approach. Once a critical mass of geologists realises this flaw in today's geology, we can start solving the fundamental problems in geology.
Re-Imagining School Leadership Preparation to Restore a Failing School District: A Case Study
ERIC Educational Resources Information Center
Lightfoot, Jonathan; Thompson, Eustace
2014-01-01
This case study report will identify modifications made to a traditional leadership program's structures and the effects of the work on faculty perceptions of non-traditional doctoral programs. Union Free School District is the only school district to ever be taken over by the state. A nearby university's research-based educational leadership…
Success Rates of Online versus Traditional College Students
ERIC Educational Resources Information Center
Wilson, Dawn; Allen, David
2011-01-01
Are students setting themselves up for failure by taking online courses? Should students be restricted from taking online courses if they have not reached a certain GPA? Should students who fail or withdraw from an online course be required to take traditional courses for at least one semester? At one Historically Black College or University…
Extracting insights from the shape of complex data using topology
Lum, P. Y.; Singh, G.; Lehman, A.; Ishkanov, T.; Vejdemo-Johansson, M.; Alagappan, M.; Carlsson, J.; Carlsson, G.
2013-01-01
This paper applies topological methods to study complex high dimensional data sets by extracting shapes (patterns) and obtaining insights about them. Our method combines the best features of existing standard methodologies such as principal component and cluster analyses to provide a geometric representation of complex data sets. Through this hybrid method, we often find subgroups in data sets that traditional methodologies fail to find. Our method also permits the analysis of individual data sets as well as the analysis of relationships between related data sets. We illustrate the use of our method by applying it to three very different kinds of data, namely gene expression from breast tumors, voting data from the United States House of Representatives and player performance data from the NBA, in each case finding stratifications of the data which are more refined than those produced by standard methods. PMID:23393618
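The hybrid method Lum et al. describe builds on the Mapper construction from topological data analysis. The sketch below is a toy illustration of that idea only, not the authors' implementation: it covers the range of a filter function with overlapping intervals, clusters the points falling in each interval with a simple distance threshold, and links clusters that share points. All function names and parameters here are invented for illustration.

```python
import math
from collections import defaultdict

def mapper_graph(points, filter_vals, n_intervals=4, overlap=0.5, eps=1.0):
    """Toy Mapper sketch: nodes are clusters of points within overlapping
    filter intervals; edges join clusters that share points."""
    lo, hi = min(filter_vals), max(filter_vals)
    length = (hi - lo) / n_intervals
    nodes = []  # each node is a frozenset of point indices
    for i in range(n_intervals):
        a = lo + i * length - overlap * length
        b = lo + (i + 1) * length + overlap * length
        idx = [j for j, f in enumerate(filter_vals) if a <= f <= b]
        parent = {j: j for j in idx}  # union-find for threshold clustering

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        for p in idx:
            for q in idx:
                if p < q and math.dist(points[p], points[q]) <= eps:
                    parent[find(p)] = find(q)
        clusters = defaultdict(set)
        for j in idx:
            clusters[find(j)].add(j)
        nodes.extend(frozenset(c) for c in clusters.values())
    edges = {(m, n) for m in range(len(nodes))
             for n in range(m + 1, len(nodes)) if nodes[m] & nodes[n]}
    return nodes, edges
```

On well-separated data the output graph simply has one node per cluster and no edges; on data with loops or flares, points shared between overlapping intervals produce the connecting edges that reveal the shape.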
Extracting insights from the shape of complex data using topology.
Lum, P Y; Singh, G; Lehman, A; Ishkanov, T; Vejdemo-Johansson, M; Alagappan, M; Carlsson, J; Carlsson, G
2013-01-01
This paper applies topological methods to study complex high dimensional data sets by extracting shapes (patterns) and obtaining insights about them. Our method combines the best features of existing standard methodologies such as principal component and cluster analyses to provide a geometric representation of complex data sets. Through this hybrid method, we often find subgroups in data sets that traditional methodologies fail to find. Our method also permits the analysis of individual data sets as well as the analysis of relationships between related data sets. We illustrate the use of our method by applying it to three very different kinds of data, namely gene expression from breast tumors, voting data from the United States House of Representatives and player performance data from the NBA, in each case finding stratifications of the data which are more refined than those produced by standard methods.
Sové, Richard J; Drakos, Nicole E; Fraser, Graham M; Ellis, Christopher G
2018-05-25
Red blood cell oxygen saturation is an important indicator of oxygen supply to tissues in the body. Oxygen saturation can be measured by taking advantage of spectroscopic properties of hemoglobin. When this technique is applied to transmission microscopy, the calculation of saturation requires determination of incident light intensity at each pixel occupied by the red blood cell; this value is often approximated from a sequence of images as the maximum intensity over time. This method often fails when the red blood cells are moving too slowly, or if hematocrit is too large since there is not a large enough gap between the cells to accurately calculate the incident intensity value. A new method of approximating incident light intensity is proposed using digital inpainting. This novel approach estimates incident light intensity with an average percent error of approximately 3%, which exceeds the accuracy of the maximum intensity based method in most cases. The error in incident light intensity corresponds to a maximum error of approximately 2% saturation. Therefore, though this new method is computationally more demanding than the traditional technique, it can be used in cases where the maximum intensity-based method fails (e.g. stationary cells), or when higher accuracy is required. This article is protected by copyright. All rights reserved.
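The traditional baseline mentioned above approximates the incident light at each pixel as the maximum intensity that pixel reaches across the image sequence, relying on a gap between cells exposing the background at least once. A minimal sketch of that step only (the proposed inpainting estimator and the saturation calculation are omitted; names are invented):

```python
import math

def incident_intensity(frames):
    """Per-pixel incident light estimate: the maximum intensity over time
    (the traditional method described in the abstract)."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[max(frame[r][c] for frame in frames) for c in range(cols)]
            for r in range(rows)]

def optical_density(frame, i0):
    """Absorbance per pixel relative to the estimated incident light."""
    return [[math.log(i0[r][c] / max(frame[r][c], 1e-9))
             for c in range(len(frame[0]))] for r in range(len(frame))]
```

The failure mode the authors describe follows directly: if a pixel is always covered by a slow-moving cell, its temporal maximum underestimates the true background, biasing the optical density low.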
Active learning increases student performance in science, engineering, and mathematics.
Freeman, Scott; Eddy, Sarah L; McDonough, Miles; Smith, Michelle K; Okoroafor, Nnadozie; Jordt, Hannah; Wenderoth, Mary Pat
2014-06-10
To test the hypothesis that lecturing maximizes learning and course performance, we metaanalyzed 225 studies that reported data on examination scores or failure rates when comparing student performance in undergraduate science, technology, engineering, and mathematics (STEM) courses under traditional lecturing versus active learning. The effect sizes indicate that on average, student performance on examinations and concept inventories increased by 0.47 SDs under active learning (n = 158 studies), and that the odds ratio for failing was 1.95 under traditional lecturing (n = 67 studies). These results indicate that average examination scores improved by about 6% in active learning sections, and that students in classes with traditional lecturing were 1.5 times more likely to fail than were students in classes with active learning. Heterogeneity analyses indicated that both results hold across the STEM disciplines, that active learning increases scores on concept inventories more than on course examinations, and that active learning appears effective across all class sizes--although the greatest effects are in small (n ≤ 50) classes. Trim and fill analyses and fail-safe n calculations suggest that the results are not due to publication bias. The results also appear robust to variation in the methodological rigor of the included studies, based on the quality of controls over student quality and instructor identity. This is the largest and most comprehensive metaanalysis of undergraduate STEM education published to date. The results raise questions about the continued use of traditional lecturing as a control in research studies, and support active learning as the preferred, empirically validated teaching practice in regular classrooms.
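The abstract reports both an odds ratio for failing (1.95) and a relative risk ("1.5 times more likely to fail"); the two coexist because the odds ratio exceeds the risk ratio whenever the outcome is common. A quick sketch with made-up failure rates (not the study's pooled values):

```python
def odds_ratio(p1, p2):
    """Odds of failing under condition 1 relative to condition 2."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

def risk_ratio(p1, p2):
    """How many times more likely failure is under condition 1."""
    return p1 / p2

# Illustrative rates only: a 50% vs 25% failure rate gives a risk ratio
# of 2.0 but an odds ratio of 3.0, since failing is far from rare.
```

This is why a 1.95 odds ratio and a 1.5x relative risk can describe the same pooled failure rates.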
Active learning increases student performance in science, engineering, and mathematics
Freeman, Scott; Eddy, Sarah L.; McDonough, Miles; Smith, Michelle K.; Okoroafor, Nnadozie; Jordt, Hannah; Wenderoth, Mary Pat
2014-01-01
To test the hypothesis that lecturing maximizes learning and course performance, we metaanalyzed 225 studies that reported data on examination scores or failure rates when comparing student performance in undergraduate science, technology, engineering, and mathematics (STEM) courses under traditional lecturing versus active learning. The effect sizes indicate that on average, student performance on examinations and concept inventories increased by 0.47 SDs under active learning (n = 158 studies), and that the odds ratio for failing was 1.95 under traditional lecturing (n = 67 studies). These results indicate that average examination scores improved by about 6% in active learning sections, and that students in classes with traditional lecturing were 1.5 times more likely to fail than were students in classes with active learning. Heterogeneity analyses indicated that both results hold across the STEM disciplines, that active learning increases scores on concept inventories more than on course examinations, and that active learning appears effective across all class sizes—although the greatest effects are in small (n ≤ 50) classes. Trim and fill analyses and fail-safe n calculations suggest that the results are not due to publication bias. The results also appear robust to variation in the methodological rigor of the included studies, based on the quality of controls over student quality and instructor identity. This is the largest and most comprehensive metaanalysis of undergraduate STEM education published to date. The results raise questions about the continued use of traditional lecturing as a control in research studies, and support active learning as the preferred, empirically validated teaching practice in regular classrooms. PMID:24821756
Simple method for quantifying microbiologically assisted chloramine decay in drinking water.
Sathasivan, Arumugam; Fisher, Ian; Kastl, George
2005-07-15
In a chloraminated drinking water distribution system, monochloramine decays due to chemical and microbiological reactions. For modeling and operational control purposes, it is necessary to know the relative contribution of each type of reaction, but there was no method to quantify these contributions separately. A simple method was developed to do so. It compares monochloramine decay rates of processed (0.2 µm filtered, or microbiologically inhibited by adding 100 µg of silver/L as silver nitrate) and unprocessed samples under controlled temperature conditions. The term microbial decay factor (Fm) was defined and derived from this method, to characterize the relative contribution of microbiologically assisted monochloramine decay to the total monochloramine decay observed in bulk water. Fm is the ratio between microbiologically assisted monochloramine decay and chemical decay of a given water sample measured at 20 °C. One possible use of the method is illustrated, where a service reservoir's bulk and inlet waters were sampled twice and analyzed for both the traditional indicators and the microbial decay factor. The microbial decay factor values alone indicated that more microbiologically assisted monochloramine decay was occurring in one bulk water than the other. In contrast, traditional nitrification indicators failed to show any difference. Further analysis showed that the microbial decay factor is more sensitive and that it alone can provide an early warning.
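Assuming first-order decay kinetics in both the unprocessed and processed samples (an assumption; the abstract does not state the fitted model), the microbial decay factor can be sketched as the excess of the total decay coefficient over the chemical-only coefficient, relative to the latter:

```python
import math

def first_order_rate(c0, ct, t):
    """Decay coefficient k from concentrations c0 -> ct over time t,
    assuming ct = c0 * exp(-k * t)."""
    return math.log(c0 / ct) / t

def microbial_decay_factor(k_total, k_chem):
    """Fm: microbiologically assisted decay relative to chemical decay,
    with k_total from the unprocessed sample and k_chem from the
    filtered or silver-inhibited sample."""
    return (k_total - k_chem) / k_chem
```

An Fm near zero then indicates essentially chemical-only decay, while larger values flag microbiologically assisted decay before traditional nitrification indicators respond.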
Assessing Integrated Pest Management Adoption: Measurement Problems and Policy Implications
NASA Astrophysics Data System (ADS)
Puente, Molly; Darnall, Nicole; Forkner, Rebecca E.
2011-11-01
For more than a decade, the U.S. government has promoted integrated pest management (IPM) to advance sustainable agriculture. However, the usefulness of this practice has been questioned because of lagging implementation. There are at least two plausible rationales for the slow implementation: (1) growers are not adopting IPM—for whatever reason—and (2) current assessment methods are inadequate at assessing IPM implementation. Our research addresses the second plausibility. We suggest that the traditional approach to measuring IPM implementation on its own fails to assess the distinct, biologically hierarchical components of IPM, and instead aggregates growers' management practices into an overall adoption score. Knowledge of these distinct components and the extent to which they are implemented can inform government officials as to how they should develop targeted assistance programs to encourage broader IPM use. We address these concerns by assessing the components of IPM adoption and comparing our method to the traditional approach alone. Our results indicate that there are four distinct components of adoption—weed, insect, general, and ecosystem management—and that growers implement the first two components significantly more often than the latter two. These findings suggest that using a more nuanced measure to assess IPM adoption that expands on the traditional approach, allows for a better understanding of the degree of IPM implementation.
NASA Astrophysics Data System (ADS)
Dobbs, Vicki
Significant numbers of students fail high school chemistry, preventing them from graduating. Starting in the 2013-2014 school year, 100% of students must pass a science assessment for schools to meet Adequate Yearly Progress (AYP) in accordance with No Child Left Behind (NCLB). Failure to meet AYP results in sanctions, such as state management or closure of a school or replacement of school staff. The purpose of this study was to determine whether the teaching strategy Problem Based Learning (PBL) would improve student achievement in high school chemistry to a greater degree than traditional teaching methods. PBL is a student-centered, inquiry-based teaching method based on the constructivist learning theory. The research question asked whether there was a difference in student achievement between students in a high school chemistry classroom using PBL and students in a classroom using traditional teaching methods, as measured by scores on a 20-question quiz. The research study used a quasi-experimental pretest/posttest control group design. An independent samples t-test compared gain scores between the pretest and posttest. Analysis of quiz scores indicated that there was not a significant difference (t(171) = 1.001, p = .318) in student achievement between the teaching methods. Because there was not a significant difference, each teacher can decide which teaching method best suits the subject matter and the learning styles of the students. This study adds research-based data to help teachers and schools choose one teaching method over another so that students may gain knowledge, develop problem-solving skills, and acquire life-long learning skills that will bring about social change in the form of a higher quality of life for the students and the community as a whole.
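The independent-samples t test on gain scores described above can be sketched as follows. This uses Welch's unpooled form (the study may have used the pooled-variance form), and the sample lists stand in for hypothetical pretest-to-posttest gains:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom
    for comparing mean gain scores of two independent groups."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)     # sample variances
    se2 = va / na + vb / nb               # squared standard error of the difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    # Welch-Satterthwaite degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df
```

With equal group sizes and equal variances, df reduces to the familiar pooled value (2n - 2), which is a useful sanity check.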
Viruses - from pathogens to vaccine carriers.
Small, Juliana C; Ertl, Hildegund C J
2011-10-01
Vaccination is mankind's greatest public health success story. By now vaccines to many of the viruses that once caused fatal childhood diseases are routinely used throughout the world. Traditional methods of vaccine development through inactivation or attenuation of viruses have failed for some of the most deadly human pathogens, necessitating new approaches. Genetic modification of viruses not only allows for their attenuation but also for incorporation of sequences from other viruses, turning one pathogen into a vaccine carrier for another. Recombinant viruses have pros and cons as vaccine carriers, as discussed below using vectors based on adenovirus, herpesvirus, flavivirus, and rhabdovirus as examples.
Behavior Based Social Dimensions Extraction for Multi-Label Classification
Li, Le; Xu, Junyi; Xiao, Weidong; Ge, Bin
2016-01-01
Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on the community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve the classification performance in multi-label heterogeneous networks. In our method, nodes’ behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes’ connection behaviors with different communities can be extracted accurately, which are applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method can obtain satisfactory classification results in comparison to other state-of-the-art methods on smaller social dimensions. PMID:27049849
Tracking rural-to-urban migration in China: Lessons from the 2005 inter-census population survey.
Ebenstein, Avraham; Zhao, Yaohui
2015-01-01
We examined migration in China using the 2005 inter-census population survey, in which migrants were registered at both their place of original (hukou) residence and at their destination. We find evidence that the estimated number of internal migrants in China is extremely sensitive to the enumeration method. We estimate that the traditional destination-based survey method fails to account for more than a third of migrants found using comparable origin-based methods. The 'missing' migrants are disproportionately young, male, and holders of rural hukou. We find that origin-based methods are more effective at capturing migrants who travel short distances for short periods, whereas destination-based methods are more effective when entire households have migrated and no remaining family members are located at the hukou location. We conclude with a set of policy recommendations for the design of population surveys in countries with large migrant populations.
The Western Apache home: landscape management and failing ecosystems
Seth Pilsk; Jeanette C. Cassa
2005-01-01
The traditional Western Apache home lies largely within the Madrean Archipelago. The natural resources of the region make up the basis of the Apache home and culture. Profound landscape changes in the region have occurred over the past 150 years. A survey of traditional Western Apache place names documents many of these changes. An analysis of the history and Apache...
The Work of Play: Meaning-Making in Videogames. New Literacies and Digital Epistemologies. Volume 48
ERIC Educational Resources Information Center
Hung, Aaron Chia Yuan
2011-01-01
Some educational researchers claim that videogames can energize learning in both traditional and non-traditional contexts; cultivate skills more useful to a changing economy; and present information in ways more appealing to students. The notion of "serious games" dates back as early as the 1950s, but so far has failed to make a significant…
Hepel, Jaroslaw T; Heron, Dwight E; Mundt, Arno J; Yashar, Catheryn; Feigenberg, Steven; Koltis, Gordon; Regine, William F; Prasad, Dheerendra; Patel, Shilpen; Sharma, Navesh; Hebert, Mary; Wallis, Norman; Kuettel, Michael
2017-05-01
Accreditation based on peer review of professional standards of care is essential in ensuring quality and safety in administration of radiation therapy. Traditionally, medical chart reviews have been performed by a physical onsite visit. The American College of Radiation Oncology Accreditation Program has remodeled its process whereby electronic charts are reviewed remotely. Twenty-eight radiation oncology practices undergoing accreditation had three charts per practice undergo both onsite and online review. Onsite review was performed by a single reviewer for each practice. Online review consisted of one or more disease site-specific reviewers for each practice. Onsite and online reviews were blinded and scored on a 100-point scale on the basis of 20 categories. A score of less than 75 was failing, and a score of 75 to 79 was marginal. Any failed charts underwent rereview by a disease site team leader. Eighty-four charts underwent both onsite and online review. The mean scores were 86.0 and 86.9 points for charts reviewed onsite and online, respectively. Comparison of onsite and online reviews revealed no statistical difference in chart scores (P = .43). Of charts reviewed, 21% had a marginal (n = 8) or failing (n = 10) score. There was no difference in failing charts (P = .48) or combined marginal and failing charts (P = .13) comparing onsite and online reviews. The American College of Radiation Oncology accreditation process of online chart review results in comparable review scores and rate of failing scores compared with traditional onsite review. However, the modern online process holds less potential for bias by using multiple reviewers per practice and allows for greater oversight via disease site team leader rereview.
NASA Astrophysics Data System (ADS)
Gaci, Said; Hachay, Olga; Zaourar, Naima
2017-04-01
One of the key elements in hydrocarbon reservoirs characterization is the S-wave velocity (Vs). Since the traditional estimating methods often fail to accurately predict this physical parameter, a new approach that takes into account its non-stationary and non-linear properties is needed. In this view, a prediction model based on complete ensemble empirical mode decomposition (CEEMD) and a multiple layer perceptron artificial neural network (MLP ANN) is suggested to compute Vs from P-wave velocity (Vp). Using a fine-to-coarse reconstruction algorithm based on CEEMD, the Vp log data is decomposed into a high frequency (HF) component, a low frequency (LF) component and a trend component. Then, different combinations of these components are used as inputs of the MLP ANN algorithm for estimating Vs log. Applications on well logs taken from different geological settings illustrate that the predicted Vs values using MLP ANN with the combinations of HF, LF and trend in inputs are more accurate than those obtained with the traditional estimating methods. Keywords: S-wave velocity, CEEMD, multilayer perceptron neural networks.
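CEEMD itself is an adaptive, data-driven decomposition and well beyond a short sketch. As a crude stand-in, the fixed-window moving-average split below only illustrates the HF/LF/trend decomposition of a Vp log that the abstract feeds into the MLP; the window sizes and names are invented, and this is explicitly not CEEMD:

```python
def moving_average(x, w):
    """Centered moving average with shrinking windows at the edges."""
    half = w // 2
    return [sum(x[max(0, i - half):i + half + 1]) /
            len(x[max(0, i - half):i + half + 1]) for i in range(len(x))]

def three_scale_split(vp, w_trend=21, w_lf=5):
    """Split a Vp log into high-frequency, low-frequency and trend parts
    that sum back to the original log (stand-in for the CEEMD
    fine-to-coarse reconstruction)."""
    trend = moving_average(vp, w_trend)
    smooth = moving_average(vp, w_lf)
    lf = [s - t for s, t in zip(smooth, trend)]
    hf = [v - s for v, s in zip(vp, smooth)]
    return hf, lf, trend
```

By construction hf + lf + trend reconstructs the input exactly, mirroring the reconstruction property that lets the components serve as alternative MLP inputs.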
Two betweenness centrality measures based on Randomized Shortest Paths
Kivimäki, Ilkka; Lebichot, Bertrand; Saramäki, Jari; Saerens, Marco
2016-01-01
This paper introduces two new closely related betweenness centrality measures based on the Randomized Shortest Paths (RSP) framework, which fill a gap between traditional network centrality measures based on shortest paths and more recent methods considering random walks or current flows. The framework defines Boltzmann probability distributions over paths of the network which focus on the shortest paths, but also take into account longer paths depending on an inverse temperature parameter. RSPs have previously proven to be useful in defining distance measures on networks. In this work we study their utility in quantifying the importance of the nodes of a network. The proposed RSP betweenness centralities combine, in an optimal way, the ideas of using the shortest and purely random paths for analysing the roles of network nodes, avoiding issues involving these two paradigms. We present the derivations of these measures and how they can be computed in an efficient way. In addition, we show with real world examples the potential of the RSP betweenness centralities in identifying interesting nodes of a network that more traditional methods might fail to notice. PMID:26838176
Symmetry of extremely floppy molecules: Molecular states beyond rotation-vibration separation
NASA Astrophysics Data System (ADS)
Schmiedt, Hanno; Schlemmer, Stephan; Jensen, Per
2015-10-01
Traditionally, molecules are theoretically described as near-static structures rotating in space. Vibrational motion causing small structural deformations induces a perturbative treatment of the rotation-vibration interaction, which fails in highly fluxional molecules, where all vibrational motions have amplitudes comparable in size to the linear dimensions of the molecule. An example is protonated methane (CH5+) [P. Kumar and D. Marx, Phys. Chem. Chem. Phys. 8, 573 (2006); Z. Jin et al., J. Phys. Chem. A 110, 1569 (2006); and A. S. Petit et al., J. Phys. Chem. A 118, 7206 (2014)]. For these molecules, customary theory fails to simulate reliably even the low-energy spectrum [T. Oka, Science 347, 1313-1314 (2015) and O. Asvany et al., Science 347, 1346-1349 (2015)]. Within the traditional view of rotation and vibration being near-separable, rotational and vibrational wavefunctions can be symmetry classified separately in the molecular symmetry (MS) group [P. Bunker and P. Jensen, Molecular Symmetry and Spectroscopy, NRC Monograph Publishing Program (NRC Research Press, 2006)]. In this article, we discuss a fundamental group theoretical approach to the problem of determining the symmetries of molecular rotation-vibration states. We will show that all MS groups discussed so far are isomorphic to subgroups of the special orthogonal group in three dimensions SO(3). This leads to a group theoretical foundation of the technique of equivalent rotations [H. Longuet-Higgins, Mol. Phys. 6, 445 (1963)]. The group G240 (the MS group of protonated methane) represents, to the best of our knowledge, the first example of an MS group which is not isomorphic to a subgroup of SO(3) (nor of O(3) or of SU(2)). Because of this, a separate symmetry classification of vibrational and rotational wavefunctions becomes impossible in this MS group, consistent with the fact that a decoupling of vibrational and rotational motion is impossible. We discuss here the consequences of this.
In conclusion, we show that the prototypical, extremely floppy molecule CH5+ represents a new class of molecules, where customary group theoretical methods for determining selection rules and spectral assignments fail, so that new methods have to be developed.
ERIC Educational Resources Information Center
Carter, Stephanie Power
2006-01-01
The article discusses a multiple literacies and traditional approach to literacy by drawing on the experiences of 2 African American young women in a high school English classroom. The article suggests that teachers who use a more traditional approach to literacy are more apt to view students of color as powerless, failing, struggling, and/or…
Supplemental Instruction Online: As Effective as the Traditional Face-to-Face Model?
NASA Astrophysics Data System (ADS)
Hizer, Suzanne E.; Schultz, P. W.; Bray, Richard
2017-02-01
Supplemental Instruction (SI) is a well-recognized model of academic assistance with a history of empirical evidence demonstrating increases in student grades and decreases in failure rates across many higher education institutions. However, as college students become more accustomed to learning in online venues, what is not known is whether an SI program offered online could benefit students similarly to SI sessions that occur in face-to-face settings. The in-person (traditional) SI program at California State University San Marcos has demonstrated increases in grades and lower fail rates for courses being supported in science and math. Students enrolled in four biology courses who participated in online SI received increases in academic performance similar to the students in the courses who attended traditional SI. Both the online and traditional SI participating students had higher course grades and lower fail rates as compared to students who did not participate in either form of SI. Self-selection, as measured by past cumulative college grade point average, did not differ between students who attended either form of SI or who did not attend. Student perceptions of online SI were generally positive and appeared to offer an alternative path to receive this valuable academic assistance for some students. Overall, results are promising that the highly effective traditional model can be translated to an online environment.
Rank-Optimized Logistic Matrix Regression toward Improved Matrix Data Classification.
Zhang, Jianguang; Jiang, Jianmin
2018-02-01
While existing logistic regression suffers from overfitting and often fails in considering structural information, we propose a novel matrix-based logistic regression to overcome the weakness. In the proposed method, 2D matrices are directly used to learn two groups of parameter vectors along each dimension without vectorization, which allows the proposed method to fully exploit the underlying structural information embedded inside the 2D matrices. Further, we add a joint [Formula: see text]-norm on two parameter matrices, which are organized by aligning each group of parameter vectors in columns. This added co-regularization term has two roles: enhancing the effect of regularization and optimizing the rank during the learning process. With our proposed fast iterative solution, we carried out extensive experiments. The results show that in comparison to both the traditional tensor-based methods and the vector-based regression methods, our proposed solution achieves better performance for matrix data classifications.
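The model form described above, one parameter vector per matrix dimension, can be sketched as a bilinear logit. The joint norm co-regularizer (elided as "[Formula: see text]" in the abstract) and the fast iterative solver are omitted, and all names here are invented for illustration:

```python
import math

def bilinear_logit(X, u, v, b):
    """Score a 2D matrix X without vectorizing it: z = u^T X v + b,
    with one parameter vector per matrix dimension (a sketch of the
    model form only, not the paper's learning algorithm)."""
    z = sum(u[i] * sum(X[i][j] * v[j] for j in range(len(v)))
            for i in range(len(u))) + b
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid of the bilinear score
```

Compared with vectorized logistic regression, this form needs only len(u) + len(v) + 1 parameters for a len(u) x len(v) matrix, which is one way the structural constraint curbs overfitting.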
Early elective colostomy following spinal cord injury.
Boucher, Michelle
Elective colostomy is an accepted method of bowel management for patients who have had a spinal cord injury (SCI). Approximately 2.4% of patients with SCI have a colostomy, and traditionally it is performed as a last resort several years after injury, and only if bowel complications persist when all other methods have failed. This is despite evidence that patients find a colostomy easier to manage and frequently report wishing it had been performed earlier. It was noticed in the author's spinal unit that increasing numbers of patients were requesting colostomy formation during inpatient rehabilitation following SCI. No supporting literature was found for this; it appears to be an emerging and untested practice. This article explores colostomy formation as a method of bowel management in patients with SCI, considers the optimal time for colostomy formation after injury and examines issues for health professionals.
A Statistical Project Control Tool for Engineering Managers
NASA Technical Reports Server (NTRS)
Bauch, Garland T.
2001-01-01
This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that as resources become more limited and the number of projects grows, project failure is increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of three successful projects and three failed projects, are reviewed, with success and failure defined by the owner.
Traditions in Spider Monkeys Are Biased towards the Social Domain
Santorelli, Claire J.; Schaffner, Colleen M.; Campbell, Christina J.; Notman, Hugh; Pavelka, Mary S.; Weghorst, Jennifer A.; Aureli, Filippo
2011-01-01
Cross-site comparison studies of behavioral variation can provide evidence for traditions in wild species once ecological and genetic factors are excluded as causes of cross-site differences. These studies ensure that behavioral variants are considered within the context of a species' ecology and evolutionary adaptations. We examined wide-scale geographic variation in the behavior of spider monkeys (Ateles geoffroyi) across five long-term field sites in Central America using a well-established ethnographic cross-site survey method. Spider monkeys possess a relatively rare social system with a high degree of fission-fusion dynamics, also typical of chimpanzees (Pan troglodytes) and humans (Homo sapiens). Of the initial 62 behaviors surveyed, 65% failed to meet the necessary criteria for traditions. The remaining 22 behaviors showed cross-site variation in occurrence ranging from absent to customary, representing, to our knowledge, the first documented cases of traditions in this taxon and only the second case of multiple traditions in a New World monkey species. Of the 22 behavioral variants recorded across all sites, on average 57% occurred in the social domain, 19% in food-related domains, and 24% in other domains. This social bias contrasts with the food-related bias reported in great ape cross-site comparison studies and has implications for the evolution of human culture. No pattern of geographical radiation was found in relation to distance across sites. Our findings promote A. geoffroyi as a model species for investigating traditions with field- and captivity-based experiments and emphasize the importance of the social domain for the study of animal traditions. PMID:21373196
Patterson, Stephen; Balducci, Lodovico; Meyer, Russell
2002-01-01
To establish the role of ancient literature and religious tradition to the modern practice of oncology; foster awareness of practicing in a historical context resulting from different traditions; and propose a spiritual context for the practice of oncology and explore methods to highlight this perspective in cancer education. Contextual and content analysis of a religious text shared by the most common religious traditions of the West (Christianity, Judaism, and Islam). The origin of suffering eludes all logical explanations. All religious traditions affirm that the sufferer should be heard, cared for, and kept part of the human consortium, and under no circumstances blamed for the disease. In terms of oncology practice this means that the treatment should be negotiated with the patient according to his or her need; that physicians' obligations for care continues after the treatment fails, and that patients' lifestyles or poor compliance should not be blamed for poor outcomes. The Book of Job supports a spiritual perspective in oncology practice, indicating that patient care is a holistic endeavor. This perspective is the key to dealing with common interactive problems, such as adversarial relations between patient and provider in face of death and suffering, and more important, may promote care beyond treatment of the disease.
Artificial bee colony algorithm for constrained possibilistic portfolio optimization problem
NASA Astrophysics Data System (ADS)
Chen, Wei
2015-07-01
In this paper, we discuss the portfolio optimization problem with real-world constraints under the assumption that the returns of risky assets are fuzzy numbers. A new possibilistic mean-semiabsolute deviation model is proposed, in which transaction costs, cardinality and quantity constraints are considered. Due to such constraints the proposed model becomes a mixed integer nonlinear programming problem and traditional optimization methods fail to find the optimal solution efficiently. Thus, a modified artificial bee colony (MABC) algorithm is developed to solve the corresponding optimization problem. Finally, a numerical example is given to illustrate the effectiveness of the proposed model and the corresponding algorithm.
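The paper's modified ABC is not reproduced here, but the basic artificial bee colony loop it builds on, with employed, onlooker, and scout phases and greedy selection, can be sketched on a toy objective. Handling the portfolio's cardinality and quantity constraints would have to be added on top, e.g., via penalty terms or solution repair; this sketch assumes an unconstrained, non-negative objective.

```python
import random

def abc_minimize(f, dim, bounds, n_bees=20, limit=10, iters=200, seed=0):
    """Minimize f over [lo, hi]^dim with a basic artificial bee colony.

    Assumes f is non-negative (used in the onlooker roulette weights).
    """
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_bees)]
    fits = [f(x) for x in foods]
    trials = [0] * n_bees
    best_i = min(range(n_bees), key=lambda i: fits[i])
    best, best_fit = foods[best_i][:], fits[best_i]
    for _ in range(iters):
        for phase in ("employed", "onlooker"):
            for b in range(n_bees):
                if phase == "onlooker":
                    # onlookers revisit good sources with probability ~ fitness
                    weights = [1.0 / (1.0 + ft) for ft in fits]
                    i = rng.choices(range(n_bees), weights=weights)[0]
                else:
                    i = b
                k = rng.randrange(n_bees)
                j = rng.randrange(dim)
                cand = foods[i][:]
                cand[j] += rng.uniform(-1.0, 1.0) * (foods[i][j] - foods[k][j])
                cand[j] = min(max(cand[j], lo), hi)
                cf = f(cand)
                if cf < fits[i]:               # greedy selection
                    foods[i], fits[i], trials[i] = cand, cf, 0
                else:
                    trials[i] += 1
        for i in range(n_bees):                # scouts replace exhausted sources
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                fits[i] = f(foods[i])
                trials[i] = 0
        cur = min(range(n_bees), key=lambda i: fits[i])
        if fits[cur] < best_fit:
            best, best_fit = foods[cur][:], fits[cur]
    return best, best_fit
```

Because the search uses only function evaluations, the same loop applies unchanged when the objective is a mixed-integer, nondifferentiable penalized portfolio risk, which is exactly why population heuristics suit this problem class.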
An advancing front Delaunay triangulation algorithm designed for robustness
NASA Technical Reports Server (NTRS)
Mavriplis, D. J.
1992-01-01
A new algorithm is described for generating an unstructured mesh about an arbitrary two-dimensional configuration. Mesh points are generated automatically by the algorithm in a manner which ensures a smooth variation of elements, and the resulting triangulation constitutes the Delaunay triangulation of these points. The algorithm combines the mathematical elegance and efficiency of Delaunay triangulation algorithms with the desirable point placement features, boundary integrity, and robustness traditionally associated with advancing-front-type mesh generation strategies. The method offers increased robustness over previous algorithms in that it cannot fail regardless of the initial boundary point distribution and the prescribed cell size distribution throughout the flow-field.
Non-Traditional Commercial Defense Contractors
2013-11-01
…five to seven years, the acquisition environment at the DoD for non-traditional commercial contractors has eroded significantly. There is a…
Estimating Mixture of Gaussian Processes by Kernel Smoothing
Huang, Mian; Li, Runze; Wang, Hansheng; Yao, Weixin
2014-01-01
When the functional data are not homogeneous, e.g., there exist multiple classes of functional curves in the dataset, traditional estimation methods may fail. In this paper, we propose a new estimation procedure for the Mixture of Gaussian Processes, to incorporate both functional and inhomogeneous properties of the data. Our method can be viewed as a natural extension of high-dimensional normal mixtures. However, the key difference is that smoothed structures are imposed for both the mean and covariance functions. The model is shown to be identifiable, and can be estimated efficiently by a combination of the ideas from EM algorithm, kernel regression, and functional principal component analysis. Our methodology is empirically justified by Monte Carlo simulations and illustrated by an analysis of a supermarket dataset. PMID:24976675
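The "natural extension of high-dimensional normal mixtures" can be grounded with the base case it generalizes: EM for a two-component univariate normal mixture. This minimal pure-Python sketch is illustrative only; the paper's method additionally smooths the mean and covariance over the functional index via kernel regression.

```python
import math, random

def em_gauss_mix(x, iters=50):
    """EM for a two-component univariate normal mixture."""
    n = len(x)
    mean = sum(x) / n
    v0 = sum((xi - mean) ** 2 for xi in x) / n
    w, mu, var = [0.5, 0.5], [min(x), max(x)], [v0, v0]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for xi in x:
            p = [w[k] * math.exp(-(xi - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            s = sum(p) or 1e-300
            resp.append([pk / s for pk in p])
        # M-step: update weights, means, variances from responsibilities
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / n
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var[k] = sum(r[k] * (xi - mu[k]) ** 2 for r, xi in zip(resp, x)) / nk + 1e-9
    return w, mu, var
```

In the functional setting, each scalar mean mu[k] becomes a smooth curve estimated by kernel-weighted versions of the same M-step sums, which is the key structural change the paper introduces.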
Traditional schemes for treatment of psoriatic arthritis.
McHugh, Neil J
2009-08-01
Prior to the availability of biologic agents such as anti-tumor necrosis factor (TNF), traditional treatment schemes for psoriatic arthritis were not extensively evaluated. While it appears that the newer forms of treatment are more effective, conventional agents still need to be scrutinized with similar methodology and will still have a role in those patients with less progressive disease, in combination with biologic agents, and in patients where biologics are not tolerated or have failed.
Weaning from mechanical ventilation: why are we still looking for alternative methods?
Frutos-Vivar, F; Esteban, A
2013-12-01
Most patients who require mechanical ventilation for longer than 24 hours, and in whom the condition that led to ventilatory support improves, can be weaned after passing a first spontaneous breathing trial. The challenge is to improve the weaning of patients who fail that first trial. We have methods that can be considered traditional, such as the T-tube, pressure support, or synchronized intermittent mandatory ventilation (SIMV). In recent years, however, new applications of usual techniques, such as noninvasive ventilation, and new ventilation methods such as automatic tube compensation (ATC), mandatory minute ventilation (MMV), adaptive support ventilation, and automatic weaning systems based on pressure support have been described. Their possible role in weaning patients with difficult or prolonged weaning from mechanical ventilation remains to be established. Copyright © 2012 Elsevier España, S.L. and SEMICYUC. All rights reserved.
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is 2-fold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case where they were equal. The failed region patterns between models are similar; however, there are differences that arise due to stress reduced from element elimination that cause probabilistic failed regions to continue to rise after no deterministic failed region would be predicted. 
Both the probabilistic and deterministic methods indicate similar trends with regards to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
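The two failure criteria can be contrasted with a toy sketch. The 1.8% effective plastic strain threshold is taken from the abstract, but the age-scaled probabilistic rule below is a hypothetical stand-in for the literature-based strain and failure functions the study actually used; the numbers are illustrative only.

```python
import random

def deterministic_failed(strains, threshold=0.018):
    # an element "fails" (is removed) once its plastic strain exceeds 1.8%
    return [i for i, s in enumerate(strains) if s > threshold]

def probabilistic_failed(strains, age=50, seed=0):
    # hypothetical: failure probability grows with strain, scaled by age
    rng = random.Random(seed)
    thr = 0.03 * (1.0 - 0.005 * (age - 25))   # assumed age-weakening rule
    out = []
    for i, s in enumerate(strains):
        p = min(1.0, max(0.0, s / thr))
        if rng.random() < p:
            out.append(i)
    return out
```

The contrast captures the tradeoff the study reports: the hard threshold is spatially sharp and deterministic, while the probabilistic rule can keep accumulating predicted failed regions (and supports age as a post-processing parameter) because no elements are eliminated from the model.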
Evaluation of safety and mobility of two-lane roundabouts : final report.
DOT National Transportation Integrated Search
2017-07-01
When looking at measures of fatal and severe-injury crashes, roundabouts have demonstrated improved safety performance compared to traditional signalized intersections. Despite this, when it comes to less severe crashes, multilane roundabouts fail to...
A model-based spike sorting algorithm for removing correlation artifacts in multi-neuron recordings.
Pillow, Jonathan W; Shlens, Jonathon; Chichilnisky, E J; Simoncelli, Eero P
2013-01-01
We examine the problem of estimating the spike trains of multiple neurons from voltage traces recorded on one or more extracellular electrodes. Traditional spike-sorting methods rely on thresholding or clustering of recorded signals to identify spikes. While these methods can detect a large fraction of the spikes from a recording, they generally fail to identify synchronous or near-synchronous spikes: cases in which multiple spikes overlap. Here we investigate the geometry of failures in traditional sorting algorithms, and document the prevalence of such errors in multi-electrode recordings from primate retina. We then develop a method for multi-neuron spike sorting using a model that explicitly accounts for the superposition of spike waveforms. We model the recorded voltage traces as a linear combination of spike waveforms plus a stochastic background component of correlated Gaussian noise. Combining this measurement model with a Bernoulli prior over binary spike trains yields a posterior distribution for spikes given the recorded data. We introduce a greedy algorithm to maximize this posterior that we call "binary pursuit". The algorithm allows modest variability in spike waveforms and recovers spike times with higher precision than the voltage sampling rate. This method substantially corrects cross-correlation artifacts that arise with conventional methods, and substantially outperforms clustering methods on both real and simulated data. Finally, we develop diagnostic tools that can be used to assess errors in spike sorting in the absence of ground truth.
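A minimal sketch of the greedy "binary pursuit" idea: under the linear-superposition model, repeatedly insert the template-and-time pair that most reduces the penalized squared residual, stopping when no insertion helps. The fixed prior_penalty here is a simplified stand-in for the Bernoulli prior's log-odds cost, and this sketch omits the waveform variability and sub-sample spike timing the paper handles.

```python
import numpy as np

def binary_pursuit(v, templates, prior_penalty=1.0, max_spikes=100):
    """Greedy MAP spike inference under a linear-superposition model.

    v: 1D voltage trace; templates: list of 1D spike waveforms.
    """
    v = np.asarray(v, dtype=float).copy()
    T = len(v)
    spikes = []
    for _ in range(max_spikes):
        best = None
        for ti, w in enumerate(templates):
            L = len(w)
            for t in range(T - L + 1):
                # decrease in squared residual from subtracting w at time t,
                # minus the prior cost of adding one spike
                gain = v[t:t + L] @ w - 0.5 * (w @ w) - prior_penalty
                if best is None or gain > best[0]:
                    best = (gain, ti, t)
        if best is None or best[0] <= 0:
            break                      # no insertion improves the posterior
        _, ti, t = best
        v[t:t + len(templates[ti])] -= templates[ti]
        spikes.append((ti, t))
    return spikes, v
```

Because accepted spikes are subtracted from the residual, a second overlapping spike can still produce a positive gain afterward; this is precisely how the superposition model recovers the synchronous spikes that clustering-based sorters miss.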
Reliability and Validity of 3 Methods of Assessing Orthopedic Resident Skill in Shoulder Surgery.
Bernard, Johnathan A; Dattilo, Jonathan R; Srikumaran, Uma; Zikria, Bashir A; Jain, Amit; LaPorte, Dawn M
Traditional measures for evaluating resident surgical technical skills (e.g., case logs) assess operative volume but not level of surgical proficiency. Our goal was to compare the reliability and validity of 3 tools for measuring surgical skill among orthopedic residents when performing 3 open surgical approaches to the shoulder. A total of 23 residents at different stages of their surgical training were tested for technical skill pertaining to 3 shoulder surgical approaches using the following measures: Objective Structured Assessment of Technical Skills (OSATS) checklists, the Global Rating Scale (GRS), and a final pass/fail assessment determined by 3 upper extremity surgeons. Adverse events were recorded. The Cronbach α coefficient was used to assess reliability of the OSATS checklists and GRS scores. Interrater reliability was calculated with intraclass correlation coefficients. Correlations among OSATS checklist scores, GRS scores, and pass/fail assessment were calculated with Spearman ρ. Validity of OSATS checklists was determined using analysis of variance with postgraduate year (PGY) as a between-subjects factor. Significance was set at p < 0.05 for all tests. Criterion validity was shown between the OSATS checklists and GRS for the 3 open shoulder approaches. Checklist scores showed superior interrater reliability compared with GRS and subjective pass/fail measurements. GRS scores were positively correlated across training years. The incidence of adverse events was significantly higher among PGY-1 and PGY-2 residents compared with more experienced residents. OSATS checklists are a valid and reliable assessment of technical skills across 3 surgical shoulder approaches. However, checklist scores do not measure quality of technique. Documenting adverse events is necessary to assess quality of technique and ultimate pass/fail status. Multiple methods of assessing surgical skill should be considered when evaluating orthopedic resident surgical performance. 
Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
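Of the reliability statistics used above, Cronbach's alpha is the most mechanical to compute from a subjects-by-items score matrix; a minimal sketch follows (the data in the usage example are synthetic, not the study's).

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_subjects, k_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)
```

When items move together, the variance of the total exceeds the sum of item variances and alpha approaches 1; for checklist data like OSATS scores, each rater or checklist item is a column and each resident a row.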
Why Interventions to Influence Adolescent Behavior Often Fail but Could Succeed.
Yeager, David S; Dahl, Ronald E; Dweck, Carol S
2018-01-01
We provide a developmental perspective on two related issues: (a) why traditional preventative school-based interventions work reasonably well for children but less so for middle adolescents and (b) why some alternative approaches to interventions show promise for middle adolescents. We propose the hypothesis that traditional interventions fail when they do not align with adolescents' enhanced desire to feel respected and be accorded status; however, interventions that do align with this desire can motivate internalized, positive behavior change. We review examples of promising interventions that (a) directly harness the desire for status and respect, (b) provide adolescents with more respectful treatment from adults, or (c) lessen the negative influence of threats to status and respect. These examples are in the domains of unhealthy snacking, middle school discipline, and high school aggression. Discussion centers on implications for basic developmental science and for improvements to youth policy and practice.
Most Influenza A Virions Fail To Express at Least One Essential Viral Protein
Brooke, Christopher B.; Ince, William L.; Wrammert, Jens; Ahmed, Rafi; Wilson, Patrick C.; Bennink, Jack R.
2013-01-01
Segmentation of the influenza A virus (IAV) genome enables rapid gene reassortment at the cost of complicating the task of assembling the full viral genome. By simultaneously probing for the expression of multiple viral proteins in MDCK cells infected at a low multiplicity with IAV, we observe that the majority of infected cells lack detectable expression of one or more essential viral proteins. Consistent with this observation, up to 90% of IAV-infected cells fail to release infectious progeny, indicating that many IAV virions scored as noninfectious by traditional infectivity assays are capable of single-round infection. This fraction was not significantly affected by target or producer cell type but varied widely between different IAV strains. These data indicate that IAV exists primarily as a swarm of complementation-dependent semi-infectious virions, and thus traditional, propagation-dependent assays of infectivity may drastically misrepresent the true infectious potential of a virus population. PMID:23283949
Studies of learned helplessness in honey bees (Apis mellifera ligustica).
Dinges, Christopher W; Varnon, Christopher A; Cota, Lisa D; Slykerman, Stephen; Abramson, Charles I
2017-04-01
The current study reports 2 experiments investigating learned helplessness in the honey bee (Apis mellifera ligustica). In Experiment 1, we used a traditional escape method but found the bees' activity levels too high to observe changes due to treatment conditions. The bees were not able to learn in this traditional escape procedure; thus, such procedures may be inappropriate for studying learned helplessness in honey bees. In Experiment 2, we used an alternative punishment, or passive avoidance, method to investigate learned helplessness. Using a master-and-yoked design, in which bees were trained as either master or yoked and tested as either master or yoked, we found that prior training with unavoidable and inescapable shock in the yoked condition interfered with avoidance and escape behavior in the later master condition. Unlike control bees, learned-helplessness bees failed to restrict their movement to the safe compartment following inescapable shock. Unlike learned helplessness studies in other animals, no decrease in general activity was observed. Furthermore, we did not observe a "freezing" response to inescapable aversive stimuli, a phenomenon thus far consistently observed in learned helplessness tests with other species. The bees instead continued to move back and forth between compartments despite punishment in the incorrect compartment. These findings suggest that, although traditional escape methods may not be suitable, honey bees display learned helplessness in passive avoidance procedures. Thus, despite behavioral differences from other species, honey bees can serve as a unique invertebrate model organism for the study of learned helplessness. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Mayer, J. M.; Stead, D.
2017-04-01
With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
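A compact way to see what the variogram buys: draw a Gaussian realization whose covariance follows an exponential variogram model, so nearby points are correlated and distant points are not. This is an unconditional, Cholesky-based sketch rather than sequential Gaussian simulation proper (which visits nodes in sequence, conditioning on data and previously simulated values), and the GSI mapping shown in the comment is an assumed scale, not the Ok Tedi calibration.

```python
import numpy as np

def gaussian_field(xs, sill=1.0, vrange=10.0, seed=0):
    """One realization of a stationary Gaussian random field on points xs.

    The covariance C(h) = sill * exp(-3h / vrange) corresponds to an
    exponential variogram with the given sill and practical range.
    """
    rng = np.random.default_rng(seed)
    h = np.abs(xs[:, None] - xs[None, :])          # pairwise distances
    C = sill * np.exp(-3.0 * h / vrange)
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(xs)))
    return L @ rng.standard_normal(len(xs))

# e.g., map the standard-normal field to a GSI-like property (assumed scale):
# gsi = 50.0 + 10.0 * gaussian_field(np.arange(100.0))
```

Setting the range to zero recovers the conventional probabilistic practice the abstract critiques, independent draws with no spatial dependence, which is why purely domain-based Monte Carlo tends toward overly conservative strength estimates.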
A Scalable and Robust Multi-Agent Approach to Distributed Optimization
NASA Technical Reports Server (NTRS)
Tumer, Kagan
2005-01-01
Modularizing a large optimization problem so that the solutions to the subproblems provide a good overall solution is a challenging problem. In this paper we present a multi-agent approach based on aligning the agent objectives with the system objectives, obviating the need to impose external mechanisms to achieve collaboration among the agents. This approach naturally addresses scaling and robustness issues by ensuring that the agents do not rely on the reliable operation of other agents. We test this approach on the difficult distributed optimization problem of imperfect device subset selection [Challet and Johnson, 2002]. In this problem there are n devices, each of which has a "distortion", and the task is to find the subset of those n devices that minimizes the average distortion. Our results show that in large systems (1000 agents) the proposed approach provides improvements of over an order of magnitude compared with both traditional optimization methods and traditional multi-agent methods. Furthermore, the results show that even in extreme cases of agent failures (i.e., half the agents fail midway through the simulation) the system remains coordinated and still outperforms a failure-free, centralized optimization algorithm.
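In the Challet-Johnson device problem, distortions are signed, so a good subset cancels errors rather than simply picking the single best device. A brute-force baseline makes the combinatorial difficulty concrete; its exponential cost in n is what motivates distributed, agent-based approaches.

```python
from itertools import combinations

def best_subset(distortions):
    """Exhaustive search for the device subset whose average distortion
    is closest to zero.  Tractable only for small n, which is exactly the
    scaling limitation a multi-agent method is meant to sidestep."""
    n = len(distortions)
    best, best_val = None, float("inf")
    for r in range(1, n + 1):
        for idx in combinations(range(n), r):
            val = abs(sum(distortions[i] for i in idx) / r)
            if val < best_val:
                best, best_val = idx, val
    return best, best_val
```

For example, with distortions [0.5, -0.5, 2.0] the optimum is the pair {0, 1}, whose errors cancel exactly, not the singleton with the smallest individual distortion.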
NASA Astrophysics Data System (ADS)
Amrute, Junedh M.; Athanasiou, Lambros S.; Rikhtegar, Farhad; de la Torre Hernández, José M.; Camarero, Tamara García; Edelman, Elazer R.
2018-03-01
Polymeric endovascular implants are the next step in minimally invasive vascular interventions. As an alternative to traditional metallic drug-eluting stents, these often-erodible scaffolds present opportunities and challenges for patients and clinicians. Theoretically, as they resorb and are absorbed over time, they obviate the long-term complications of permanent implants, but in the short term visualization, and therefore positioning, is problematic. Polymeric scaffolds are relatively invisible on angiography and can only be fully imaged using optical coherence tomography (OCT), and segmentation of polymeric struts in OCT images is performed manually, a laborious and intractable procedure for large datasets. Traditional lumen detection methods that use implant struts as boundary limits fail in images with polymeric implants. It is therefore necessary to develop an automated method to detect polymeric struts and luminal borders in OCT images; we present such a fully automated algorithm. Accuracy was validated against expert annotations on 1140 OCT images, with a positive predictive value of 0.93 for strut detection and an R2 correlation coefficient of 0.94 between detected and expert-annotated lumen areas. The proposed algorithm allows rapid, accurate, and automated detection of polymeric struts and the luminal border in OCT images.
Back to the future: therapies for idiopathic nephrotic syndrome.
Gibson, Keisha L; Glenn, Dorey; Ferris, Maria E
2015-01-01
Roughly 20-40% of individuals with idiopathic nephrotic syndrome will fail to respond to standard therapies and have a high risk of progression to end-stage kidney disease (ESKD). In the last 50 years, no new therapies have been approved specifically for the treatment of these individuals with recalcitrant disease. Recent in vitro, translational, and clinical studies have identified novel targets and pathways that not only expand our understanding of the complex pathophysiology of proteinuric disease but also provide an opportunity to challenge the tradition of relying on histologic classification of nephrotic diseases to make treatment decisions. The traditional methods of directing care by histological classification, or of choosing second-line therapies on the basis of steroid responsiveness, may soon give way to therapies customized to our expanding understanding of molecular targets. Important non-immunologic mechanisms of widely used immunosuppressive therapies may be just as important in palliating proteinuric disease as their proposed immunologic functions. © 2015 S. Karger AG, Basel.
Gene expression profiling of human breast tissue samples using SAGE-Seq.
Wu, Zhenhua Jeremy; Meyer, Clifford A; Choudhury, Sibgat; Shipitsin, Michail; Maruyama, Reo; Bessarabova, Marina; Nikolskaya, Tatiana; Sukumar, Saraswati; Schwartzman, Armin; Liu, Jun S; Polyak, Kornelia; Liu, X Shirley
2010-12-01
We present a powerful application of ultra high-throughput sequencing, SAGE-Seq, for the accurate quantification of normal and neoplastic mammary epithelial cell transcriptomes. We develop data analysis pipelines that allow the mapping of sense and antisense strands of mitochondrial and RefSeq genes, the normalization between libraries, and the identification of differentially expressed genes. We find that the diversity of cancer transcriptomes is significantly higher than that of normal cells. Our analysis indicates that transcript discovery plateaus at 10 million reads/sample, and suggests a minimum desired sequencing depth around five million reads. Comparison of SAGE-Seq and traditional SAGE on normal and cancerous breast tissues reveals higher sensitivity of SAGE-Seq to detect less-abundant genes, including those encoding for known breast cancer-related transcription factors and G protein-coupled receptors (GPCRs). SAGE-Seq is able to identify genes and pathways abnormally activated in breast cancer that traditional SAGE failed to call. SAGE-Seq is a powerful method for the identification of biomarkers and therapeutic targets in human disease.
Castillo-Juárez, Israel; González, Violeta; Jaime-Aguilar, Héctor; Martínez, Gisela; Linares, Edelmira; Bye, Robert; Romero, Irma
2009-03-18
Helicobacter pylori is the major etiological agent of chronic active gastritis and peptic ulcer disease and is linked to gastric carcinoma. Treatment to eradicate the bacteria failed in many cases, mainly due to antibiotic resistance, hence the necessity of developing better therapeutic regimens. Mexico has an enormous unexplored potential of medicinal plants. This work evaluates the in vitro anti-H. pylori activity of 53 plants used in Mexican traditional medicine for gastrointestinal disorders. To test the in vitro antibacterial activity, agar dilution and broth dilution methods were used for aqueous and methanolic extracts, respectively. Aqueous extracts of Artemisia ludoviciana subsp. mexicana, Cuphea aequipetala, Ludwigia repens,and Mentha x piperita (MIC 125 to <250 microg/ml) as well as methanolic extracts of Persea americana, Annona cherimola, Guaiacum coulteri, and Moussonia deppeana (MIC <7.5 to 15.6 microg/ml) showed the highest inhibitory effect. The results contribute to understanding the mode of action of the studied medicinal plants and for detecting plants with high anti-Helicobacter pylori activity.
Mishra, Sharmistha; Sgaier, Sema K.; Thompson, Laura H.; Moses, Stephen; Ramesh, B. M.; Alary, Michel; Wilson, David; Blanchard, James F.
2012-01-01
Background To design HIV prevention programmes, it is critical to understand the temporal and geographic aspects of the local epidemic and to address the key behaviours that drive HIV transmission. Two methods have been developed to appraise HIV epidemics and guide prevention strategies. The numerical proxy method classifies epidemics based on current HIV prevalence thresholds. The Modes of Transmission (MOT) model estimates the distribution of incidence over one year among risk-groups. Both methods focus on the current state of an epidemic and provide short-term metrics which may not capture the epidemiologic drivers. Through a detailed analysis of country and sub-national data, we explore the limitations of the two traditional methods and propose an alternative approach. Methods and Findings We compared outputs of the traditional methods in five countries for which results were published, and applied the numeric and MOT model to India and six districts within India. We discovered three limitations of the current methods for epidemic appraisal: (1) their results failed to identify the key behaviours that drive the epidemic; (2) they were difficult to apply to local epidemics with heterogeneity across district-level administrative units; and (3) the MOT model was highly sensitive to input parameters, many of which required extraction from non-regional sources. We developed an alternative decision-tree framework for HIV epidemic appraisals, based on a qualitative understanding of epidemiologic drivers, and demonstrated its applicability in India. The alternative framework offered a logical algorithm to characterize epidemics; it required minimal but key data. Conclusions Traditional appraisals that utilize the distribution of prevalent and incident HIV infections in the short-term could misguide prevention priorities and potentially impede efforts to halt the trajectory of the HIV epidemic. 
An approach that characterizes local transmission dynamics provides a potentially more effective tool with which policy makers can design intervention programmes. PMID:22396756
ERIC Educational Resources Information Center
Chaplin, Miriam T.
1979-01-01
Remedial education is contrary to traditional educational principles, for it operates only after a child has failed. Instead, failure should be prevented by developmental, individualized instruction. Furthermore, remedial education isolates and emphasizes skill deficits and may lead to teaching for the test. (Author/SJL)
ERIC Educational Resources Information Center
Casey, B. J.; Getz, Sarah; Galvan, Adriana
2008-01-01
Adolescence is a developmental period characterized by suboptimal decisions and actions that give rise to an increased incidence of unintentional injuries and violence, alcohol and drug abuse, unintended pregnancy and sexually transmitted diseases. Traditional neurobiological and cognitive explanations for adolescent behavior have failed to…
Shulruf, Boaz; Turner, Rolf; Poole, Phillippa; Wilkinson, Tim
2013-05-01
The decision to pass or fail a medical student is a 'high stakes' one. The aim of this study is to introduce and demonstrate the feasibility and practicality of a new objective standard-setting method for determining the pass/fail cut-off score from borderline grades. Three methods for setting up pass/fail cut-off scores were compared: the Regression Method, the Borderline Group Method, and the new Objective Borderline Method (OBM). Using Year 5 students' OSCE results from one medical school, we established the pass/fail cut-off scores by the above-mentioned three methods. The comparison indicated that the pass/fail cut-off scores generated by the OBM were similar to those generated by the more established methods (0.840 ≤ r ≤ 0.998; p < .0001). Based on theoretical and empirical analysis, we suggest that the OBM has advantages over existing methods in that it combines objectivity, realism, a robust empirical basis and, no less importantly, is simple to use.
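The two established standard-setting methods named above can be illustrated with a short sketch. The data, function names, and the convention that global grades run 1 = fail, 2 = borderline, 3 = pass are hypothetical, and the abstract does not give the OBM formula, so only the Borderline Group Method (cut-off = mean score of examinees graded borderline) and the Regression Method (score predicted at the borderline grade by a least-squares fit) are shown:

```python
def borderline_group_cutoff(scores, grades, borderline_grade=2):
    """Borderline Group Method: mean checklist score of examinees whose
    global grade was 'borderline'."""
    borderline = [s for s, g in zip(scores, grades) if g == borderline_grade]
    return sum(borderline) / len(borderline)

def regression_cutoff(scores, grades, borderline_grade=2):
    """Regression Method: least-squares fit of score on global grade,
    evaluated at the borderline grade."""
    n = len(scores)
    mg, ms = sum(grades) / n, sum(scores) / n
    slope = (sum((g - mg) * (s - ms) for g, s in zip(grades, scores))
             / sum((g - mg) ** 2 for g in grades))
    return ms - slope * mg + slope * borderline_grade

# Hypothetical OSCE station: global grades 1 = fail, 2 = borderline, 3 = pass
grades = [1, 1, 2, 2, 2, 3, 3, 3, 3]
scores = [40, 45, 52, 55, 58, 66, 70, 74, 78]
print(round(borderline_group_cutoff(scores, grades), 1))  # 55.0
print(round(regression_cutoff(scores, grades), 1))        # 56.4
```

On this toy station the two methods land within a couple of points of each other, in the spirit of the high agreement (0.840 ≤ r ≤ 0.998) reported in the study.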
Failure of engineering artifacts: a life cycle approach.
Del Frate, Luca
2013-09-01
Failure is a central notion both in ethics of engineering and in engineering practice. Engineers devote considerable resources to assuring their products will not fail, and considerable progress has been made in the development of tools and methods for understanding and avoiding failure. Engineering ethics, on the other hand, is concerned with the moral and social aspects related to the causes and consequences of technological failures. But what is meant by failure, and what does it mean that a failure has occurred? The subject of this paper is how engineers use and define this notion. Although a traditional definition of failure can be identified that is shared by a large part of the engineering community, the literature shows that engineers are also willing to consider as failures events and circumstances that are at odds with this traditional definition. These cases violate one or more of three assumptions made by the traditional approach to failure. An alternative approach, inspired by the notion of product life cycle, is proposed which dispenses with these assumptions. Besides being able to address the traditional cases of failure, it can deal successfully with the problematic cases. The adoption of a life cycle perspective allows the introduction of a clearer notion of failure and a classification of failure phenomena that takes into account the roles of stakeholders involved in the various stages of a product life cycle.
NASA Astrophysics Data System (ADS)
Jiang, Jiaqi; Gu, Rongbao
2016-04-01
This paper generalizes the method of traditional singular value decomposition entropy by incorporating orders q of Rényi entropy. We analyze the predictive power of the entropy based on the trajectory matrix using Shanghai Composite Index (SCI) and Dow Jones Index (DJI) data in both a static test and a dynamic test. In the static test on the SCI, the results of global Granger causality tests all turn out to be significant regardless of the orders selected, but the entropy fails to show much predictability in the American stock market. In the dynamic test, we find that the predictive power can be significantly improved on the SCI by our generalized method but not on the DJI. This suggests that noises and errors affect the SCI more frequently than the DJI. In the end, results obtained using different lengths of sliding window also corroborate this finding.
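The construction described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's implementation: the trajectory matrix is taken to be the sliding-window (Hankel) embedding of the series, its singular values are normalized into a probability vector, and the Rényi entropy of order q is computed, with q → 1 recovering the traditional Shannon SVD entropy.

```python
import numpy as np

def svd_renyi_entropy(series, window=20, q=2.0):
    """Renyi-generalized SVD entropy of a time series.

    Build a trajectory (Hankel) matrix by sliding a window over the series,
    normalize its singular values into a probability vector p, and return
    the Renyi entropy of order q; q -> 1 recovers the Shannon (traditional) form.
    """
    x = np.asarray(series, dtype=float)
    traj = np.lib.stride_tricks.sliding_window_view(x, window)
    s = np.linalg.svd(traj, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    if np.isclose(q, 1.0):                        # Shannon limit
        return float(-(p * np.log(p)).sum())
    return float(np.log((p ** q).sum()) / (1.0 - q))

rng = np.random.default_rng(0)
noise = rng.standard_normal(500)                  # unstructured series
trend = np.sin(np.linspace(0, 20, 500))           # highly structured series
print(svd_renyi_entropy(noise, q=2) > svd_renyi_entropy(trend, q=2))  # True
```

On synthetic data a structured series (a sine) yields a far lower entropy than white noise, which is the kind of contrast the predictive tests exploit.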
Piette, Elizabeth R; Moore, Jason H
2018-01-01
Machine learning methods and conventions are increasingly employed for the analysis of large, complex biomedical data sets, including genome-wide association studies (GWAS). Reproducibility of machine learning analyses of GWAS can be hampered by biological and statistical factors, particularly so for the investigation of non-additive genetic interactions. Application of traditional cross validation to a GWAS data set may result in poor consistency between the training and testing data set splits due to an imbalance of the interaction genotypes relative to the data as a whole. We propose a new cross validation method, proportional instance cross validation (PICV), that preserves the original distribution of an independent variable when splitting the data set into training and testing partitions. We apply PICV to simulated GWAS data with epistatic interactions of varying minor allele frequencies and prevalences and compare performance to that of a traditional cross validation procedure in which individuals are randomly allocated to training and testing partitions. Sensitivity and positive predictive value are significantly improved across all tested scenarios for PICV compared to traditional cross validation. We also apply PICV to GWAS data from a study of primary open-angle glaucoma to investigate a previously-reported interaction, which fails to significantly replicate; PICV however improves the consistency of testing and training results. Application of traditional machine learning procedures to biomedical data may require modifications to better suit intrinsic characteristics of the data, such as the potential for highly imbalanced genotype distributions in the case of epistasis detection. The reproducibility of genetic interaction findings can be improved by considering this variable imbalance in cross validation implementation, such as with PICV. This approach may be extended to problems in other domains in which imbalanced variable distributions are a concern.
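The abstract does not spell out the PICV algorithm, but its core idea, preserving the distribution of a variable when partitioning the data, can be sketched as a group-wise round-robin split (function name and genotype data are hypothetical):

```python
import random
from collections import Counter

def proportional_folds(genotypes, k=5, seed=0):
    """Split sample indices into k folds that each preserve the genotype mix.

    A sketch of the idea behind proportional instance cross validation:
    shuffle within each genotype group, then deal each group's indices
    round-robin across the folds, so a rare genotype is spread evenly
    instead of landing in one partition by chance.
    """
    rng = random.Random(seed)
    folds = [[] for _ in range(k)]
    by_group = {}
    for i, g in enumerate(genotypes):
        by_group.setdefault(g, []).append(i)
    for group in by_group.values():
        rng.shuffle(group)
        for j, idx in enumerate(group):
            folds[j % k].append(idx)
    return folds

# Hypothetical SNP genotypes with a rare homozygote class
genotypes = ["AA"] * 70 + ["Aa"] * 25 + ["aa"] * 5
folds = proportional_folds(genotypes, k=5)
for fold in folds:
    print(Counter(genotypes[i] for i in fold))  # each fold: 14 AA, 5 Aa, 1 aa
```

Under random allocation the five "aa" carriers could easily all fall into one partition; the proportional split guarantees each fold sees one.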
Securebox: a multibiopsy sample container for specimen identification and transport.
Palmieri, Beniamino; Sblendorio, Valeriana; Saleh, Farid; Al-Sebeih, Khalid
2008-01-01
To describe an original multicompartment disposable container for tissue surgical specimens or serial biopsy samples (Securebox). The increasing number of pathology samples from a single patient required for an accurate diagnosis led us to design and manufacture a unique container with 4 boxes; in each box, 1 or more biopsy samples can be lodged. A magnification lens on a convex segment of the plastic framework allows inspection of macroscopic details of the recovered specimens. We investigated 400 randomly selected cases (compared with 400 controls) who underwent multiple biopsies from January 2006 to January 2007 to evaluate compliance with the new procedure and to detect errors resulting from missing some of the multiple specimens or from technical mistakes during the procedure or delivery that might have compromised the final diagnosis. Using our Securebox, the percentage of patients for whom a diagnosis failed or could not be reached was 0.5%, compared to 4% with the traditional method (p = 0.0012). Moreover, the percentage of medical and nursing staff who were satisfied with the Securebox compared to the traditional method was 85% vs. 15%, respectively (p < 0.0001). The average number of days spent to reach a proper diagnosis using the Securebox was 3.38 +/- 1.16 SD, compared to 6.76 +/- 0.52 SD with the traditional method (p < 0.0001). The compact Securebox makes it safer and easier to introduce the specimens and to ship them to the pathology laboratories, reducing the risk of error.
Pharmaceutical and analytical evaluation of triphalaguggulkalpa tablets
Savarikar, Shreeram S.; Barbhind, Maneesha M.; Halde, Umakant K.; Kulkarni, Alpana P.
2011-01-01
Aim of the Study: Development of standardized, synergistic, safe and effective traditional herbal formulations with robust scientific evidence can offer faster and more economical alternatives for the treatment of disease. The main objective was to develop a method of preparation of guggulkalpa tablets so that the tablets meet the criteria of efficacy, stability, and safety. Materials and Methods: Triphalaguggulkalpa tablet, described in sharangdharsanhita and containing guggul and triphala powder, was used as a model drug. Preliminary experiments on marketed triphalaguggulkalpa tablets exhibited delayed in vitro disintegration that indicated probable delayed in vivo disintegration. The study involved preparation of triphalaguggulkalpa tablets by Ayurvedic text methods and by wet granulation, dry granulation, and direct compression method. The tablets were evaluated for loss on drying, volatile oil content, % solubility, and steroidal content. The tablets were evaluated for performance tests like weight variation, disintegration, and hardness. Results: It was observed that triphalaguggulkalpa tablets, prepared by direct compression method, complied with the hardness and disintegration tests, whereas tablets prepared by Ayurvedic text methods failed. Conclusion: Direct compression is the best method of preparing triphalaguggulkalpa tablets. PMID:21731383
Traditional healers and perceptions of the causes and treatment of cancer.
Nwoga, I A
1994-12-01
The observation that some Nigerian patients use alternative health care services when they perceive that one medical system has failed them provided the impetus for this survey. The purpose of the survey was to understand why some Nigerian patients rely on traditional healers for cure of cancer by exploring the perceptions of Igbo traditional healers from Anambra State of Nigeria about the causes and treatment of cancer. Implications of the different meanings of illness and disease to patients and physicians provided the theoretical framework for understanding the cultural context of the study. Findings have ethical, moral, and cultural implications for professional nursing practice, education, and research.
ERIC Educational Resources Information Center
Shulruf, Boaz; Turner, Rolf; Poole, Phillippa; Wilkinson, Tim
2013-01-01
The decision to pass or fail a medical student is a "high stakes" one. The aim of this study is to introduce and demonstrate the feasibility and practicality of a new objective standard-setting method for determining the pass/fail cut-off score from borderline grades. Three methods for setting up pass/fail cut-off scores were compared: the…
Stop preaching to the converted
NASA Astrophysics Data System (ADS)
Landrum, Asheley R.; Lull, Robert B.
2017-08-01
Traditional moral arguments fail to persuade conservative climate sceptics. Pope Francis' gifting of his climate encyclical to President Trump prior to his leaving the Paris Agreement shows that even a religious leader's persuasive power is constrained by how his message resonates with conservative moral values.
The Foreman Problem in Japanese Industry.
ERIC Educational Resources Information Center
Thurley, Keith
Britain studied supervisory training in Japan in order to gain insight into its own training problems. Traditional supervision in Japanese industry had produced incapable foremen through seniority promotion, caused difficult relationships because of authoritarian attitudes, and failed to clarify authority roles. The government recommended more…
Homogenous photocatalytic decontamination of prion infected stainless steel and titanium surfaces.
Berberidou, Chrysanthi; Xanthopoulos, Konstantinos; Paspaltsis, Ioannis; Lourbopoulos, Athanasios; Polyzoidou, Eleni; Sklaviadis, Theodoros; Poulios, Ioannis
2013-01-01
Prions are notorious for their extraordinary resistance to traditional methods of decontamination, rendering their transmission a public health risk. Iatrogenic Creutzfeldt-Jakob disease (iCJD) via contaminated surgical instruments and medical devices has been verified both experimentally and clinically. Standard methods for prion inactivation by sodium hydroxide or sodium hypochlorite have failed, in some cases, to fully remove prion infectivity, while they are often impractical for routine applications. Prion accumulation in peripheral tissues and indications of human-to-human bloodborne prion transmission highlight the need for novel, efficient, yet user-friendly methods of prion inactivation. Here we show both in vitro and in vivo that homogenous photocatalytic oxidation, mediated by the photo-Fenton reagent, has the potential to inactivate the pathological prion isoform adsorbed on metal substrates. Photocatalytic oxidation with 224 μg mL(-1) Fe(3+), 500 μg mL(-1) h(-1) H2O2, and UV-A for 480 min led to 100% survival in golden Syrian hamsters after intracranial implantation of stainless steel wires infected with the 263K prion strain. Interestingly, photocatalytic treatment of 263K-infected titanium wires under the same experimental conditions prolonged the survival interval significantly but failed to eliminate infectivity, a result that we correlate with the increased adsorption of PrP(Sc) on titanium in comparison to stainless steel. Our findings strongly indicate that our user- and environmentally-friendly protocol can be safely applied to the decontamination of prion-infected stainless steel surfaces.
Exploring the quantum speed limit with computer games
NASA Astrophysics Data System (ADS)
Sørensen, Jens Jakob W. H.; Pedersen, Mads Kock; Munch, Michael; Haikka, Pinja; Jensen, Jesper Halkjær; Planke, Tilo; Andreasen, Morten Ginnerup; Gajdacz, Miroslav; Mølmer, Klaus; Lieberoth, Andreas; Sherson, Jacob F.
2016-04-01
Humans routinely solve problems of immense computational complexity by intuitively forming simple, low-dimensional heuristic strategies. Citizen science (or crowd sourcing) is a way of exploiting this ability by presenting scientific research problems to non-experts. ‘Gamification’—the application of game elements in a non-game context—is an effective tool with which to enable citizen scientists to provide solutions to research problems. The citizen science games Foldit, EteRNA and EyeWire have been used successfully to study protein and RNA folding and neuron mapping, but so far gamification has not been applied to problems in quantum physics. Here we report on Quantum Moves, an online platform gamifying optimization problems in quantum physics. We show that human players are able to find solutions to difficult problems associated with the task of quantum computing. Players succeed where purely numerical optimization fails, and analyses of their solutions provide insights into the problem of optimization of a more profound and general nature. Using player strategies, we have thus developed a few-parameter heuristic optimization method that efficiently outperforms the most prominent established numerical methods. The numerical complexity associated with time-optimal solutions increases for shorter process durations. To understand this better, we produced a low-dimensional rendering of the optimization landscape. This rendering reveals why traditional optimization methods fail near the quantum speed limit (that is, the shortest process duration with perfect fidelity). Combined analyses of optimization landscapes and heuristic solution strategies may benefit wider classes of optimization problems in quantum physics and beyond.
Blakeney, Bryan A; Tambralli, Ajay; Anderson, Joel M; Andukuri, Adinarayana; Lim, Dong-Jin; Dean, Derrick R; Jun, Ho-Wook
2011-02-01
A limiting factor of traditional electrospinning is that the electrospun scaffolds consist entirely of tightly packed nanofiber layers that only provide a superficial porous structure due to the sheet-like assembly process. This unavoidable characteristic hinders cell infiltration and growth throughout the nanofibrous scaffolds. Numerous strategies have been tried to overcome this challenge, including the incorporation of nanoparticles, using larger microfibers, or removing embedded salt or water-soluble fibers to increase porosity. However, these methods still produce sheet-like nanofibrous scaffolds, failing to create a porous three-dimensional scaffold with good structural integrity. Thus, we have developed a three-dimensional cotton ball-like electrospun scaffold that consists of an accumulation of nanofibers in a low density and uncompressed manner. Instead of a traditional flat-plate collector, a grounded spherical dish and an array of needle-like probes were used to create a Focused, Low density, Uncompressed nanoFiber (FLUF) mesh scaffold. Scanning electron microscopy showed that the cotton ball-like scaffold consisted of electrospun nanofibers with a similar diameter but larger pores and less-dense structure compared to the traditional electrospun scaffolds. In addition, laser confocal microscopy demonstrated an open porosity and loosely packed structure throughout the depth of the cotton ball-like scaffold, contrasting the superficially porous and tightly packed structure of the traditional electrospun scaffold. Cells seeded on the cotton ball-like scaffold infiltrated into the scaffold after 7 days of growth, compared to no penetrating growth for the traditional electrospun scaffold. Quantitative analysis showed approximately a 40% higher growth rate for cells on the cotton ball-like scaffold over a 7 day period, possibly due to the increased space for in-growth within the three-dimensional scaffolds. 
Overall, this method assembles a nanofibrous scaffold that is more advantageous for highly porous interconnectivity and demonstrates great potential for tackling current challenges of electrospun scaffolds. 2010 Elsevier Ltd. All rights reserved.
Failure probability of three designs of zirconia crowns
Ramos, G. Freitas; Monteiro, E. Barbosa Carmona; Bottino, M.A.; Zhang, Y.; de Melo, R. Marques
2015-01-01
Objectives This study utilized a 2-parameter Weibull analysis to evaluate the lifetime of fully or partially porcelain-/glaze-veneered zirconia crowns after fatigue testing. Methods Sixty first molars were selected and prepared for full-coverage crowns with three different designs (n = 20): Traditional – crowns with a zirconia framework covered with feldspathic porcelain; Modified – crowns partially covered with veneering porcelain; and Monolithic – full-contour zirconia crowns. All specimens were treated with a glaze layer. Specimens were subjected to mechanical cycling (100 N, 3 Hz) with a piston with a hemispherical tip (Ø = 6 mm) until the specimens failed or up to 2 × 10⁶ cycles. At 500,000-cycle intervals, the fatigue tests were interrupted, and stereomicroscopy (10×) was used to inspect the specimens for damage. We performed Weibull analysis of interval data to calculate the number of failures in each interval. Results The types and numbers of failures according to the groups were: cracking (Traditional-13, Modified-6) and chipping (Traditional-4) of the feldspathic porcelain, followed by delamination (Traditional-1) at the veneer/core interface and debonding (Monolithic-2) at the cementation interface. Weibull parameters (beta, shape; and eta, characteristic life), with a two-sided confidence interval of 95%, were: Traditional – 1.25 and 0.9 × 10⁶ cycles; Modified – 0.58 and 11.7 × 10⁶ cycles; and Monolithic – 1.05 and 16.5 × 10⁶ cycles. Traditional crowns showed greater susceptibility to fatigue, the Modified group presented a higher propensity to early failures, and the Monolithic group showed no susceptibility to fatigue. The Modified and Monolithic groups presented the highest number of crowns with no failures after the fatigue test. Conclusions The three crown designs presented significantly different behaviors under fatigue. The Modified and Monolithic groups presented a lower probability of failure after 2 × 10⁶ cycles. PMID:26509988
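Taking beta as the Weibull shape parameter and eta as the characteristic life (the parameter reported in cycles), the reported estimates imply failure probabilities at the 2 × 10⁶-cycle test limit through the 2-parameter Weibull CDF, F(t) = 1 - exp(-(t/eta)^beta). A sketch (the function name is ours):

```python
import math

def weibull_failure_prob(t, beta, eta):
    """P(failure by t cycles) under a 2-parameter Weibull:
    F(t) = 1 - exp(-(t / eta) ** beta)."""
    return 1.0 - math.exp(-((t / eta) ** beta))

# beta (shape) and eta (characteristic life, cycles) as reported per design
designs = {"Traditional": (1.25, 0.9e6),
           "Modified":    (0.58, 11.7e6),
           "Monolithic":  (1.05, 16.5e6)}
for name, (beta, eta) in designs.items():
    print(name, round(weibull_failure_prob(2e6, beta, eta), 2))
```

With these values the Traditional design has roughly a 93% chance of failing within the test window, versus about 30% for Modified and 10% for Monolithic, consistent with the fatigue behavior described in the abstract.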
Code of Federal Regulations, 2012 CFR
2012-01-01
... liability account balances at a failed insured depository institution. 360.8 Section 360.8 Banks and Banking... RECEIVERSHIP RULES § 360.8 Method for determining deposit and other liability account balances at a failed... receivership purposes at a failed insured depository institution. (b) Definitions—(1) The FDIC Cutoff Point...
Code of Federal Regulations, 2014 CFR
2014-01-01
... liability account balances at a failed insured depository institution. 360.8 Section 360.8 Banks and Banking... RECEIVERSHIP RULES § 360.8 Method for determining deposit and other liability account balances at a failed... receivership purposes at a failed insured depository institution. (b) Definitions—(1) The FDIC Cutoff Point...
Fail-safe reactivity compensation method for a nuclear reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nygaard, Erik T.; Angelo, Peter L.; Aase, Scott B.
The present invention relates generally to the field of compensation methods for nuclear reactors and, in particular, to a method for fail-safe reactivity compensation in solution-type nuclear reactors. In one embodiment, the fail-safe reactivity compensation method of the present invention augments other control methods for a nuclear reactor. In still another embodiment, the fail-safe reactivity compensation method of the present invention permits one to control a nuclear reaction in a nuclear reactor through a method that does not rely on moving components into or out of a reactor core, nor does the method of the present invention rely on the constant repositioning of control rods within a nuclear reactor in order to maintain a critical state.
How to construct and implement script concordance tests: insights from a systematic review.
Dory, Valérie; Gagnon, Robert; Vanpee, Dominique; Charlin, Bernard
2012-06-01
Programmes of assessment should measure the various components of clinical competence. Clinical reasoning has been traditionally assessed using written tests and performance-based tests. The script concordance test (SCT) was developed to assess clinical data interpretation skills. A recent review of the literature examined the validity argument concerning the SCT. Our aim was to provide potential users with evidence-based recommendations on how to construct and implement an SCT. A systematic review of relevant databases (MEDLINE, ERIC [Education Resources Information Centre], PsycINFO, the Research and Development Resource Base [RDRB, University of Toronto]) and Google Scholar, medical education journals and conference proceedings was conducted for references in English or French. It was supplemented by ancestry searching and by additional references provided by experts. The search yielded 848 references, of which 80 were analysed. Studies suggest that tests with around 100 items (25-30 cases), of which 25% are discarded after item analysis, should provide reliable scores. Panels with 10-20 members are needed to reach adequate precision in terms of estimated reliability. Panellists' responses can be analysed by checking for moderate variability among responses. Studies of alternative scoring methods are inconclusive, but the traditional scoring method is satisfactory. There is little evidence on how best to determine a pass/fail threshold for high-stakes examinations. Our literature search was broad and included references from medical education journals not indexed in the usual databases, conference abstracts and dissertations. There is good evidence on how to construct and implement an SCT for formative purposes or medium-stakes course evaluations. 
Further avenues for research include examining the impact of various aspects of SCT construction and implementation on issues such as educational impact, correlations with other assessments, and validity of pass/fail decisions, particularly for high-stakes examinations. © Blackwell Publishing Ltd 2012.
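The "traditional scoring method" the review finds satisfactory is commonly the aggregate method (this identification is our assumption; the abstract does not name it), in which each response option earns partial credit proportional to the number of panellists who chose it, scaled so the modal answer scores 1. A minimal sketch with a hypothetical 15-member panel:

```python
from collections import Counter

def sct_item_scores(panel_answers, options=range(-2, 3)):
    """Aggregate SCT scoring: each Likert option earns credit equal to the
    number of panellists who chose it divided by the modal count, so the
    most popular answer scores 1 and unchosen answers score 0."""
    counts = Counter(panel_answers)
    modal = max(counts.values())
    return {opt: counts.get(opt, 0) / modal for opt in options}

# Hypothetical item: 15 panellists rate, on a -2..+2 Likert scale, how a
# new clinical finding affects a diagnostic hypothesis
panel = [1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 0, 0, 0, 0, -1]
scores = sct_item_scores(panel)
print(scores[1])   # 1.0: the modal answer earns full credit
print(scores[-2])  # 0.0: an answer no panellist chose earns none
```

An examinee's test score is then the sum of the credits for the options they selected across all items.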
High Resolution Manometry Correlates of Ineffective Esophageal Motility
Xiao, Yinglian; Kahrilas, Peter J.; Kwasny, Mary J.; Roman, Sabine; Lin, Zhiyue; Nicodème, Frédéric; Lu, Chang; Pandolfino, John E.
2013-01-01
Background There are currently no criteria for ineffective esophageal motility (IEM) and ineffective swallow (IES) in High Resolution Manometry (HRM) and Esophageal Pressure Topography (EPT). Our aims were to utilize HRM metrics to define IEM within the Chicago Classification and to determine the distal contractile integral (DCI) threshold for IES. Methods The EPT of 150 patients with either dysphagia or reflux symptoms were reviewed for breaks >2 cm in the proximal, middle and distal esophagus in the 20 mmHg isobaric contour (IBC). Peristaltic function in EPT was defined by the Chicago Classification; the corresponding conventional line tracings (CLT) were reviewed separately for IEM and IES. Generalized linear mixed models were used to find thresholds for DCI corresponding to traditionally determined IES and failed swallows. An external validation sample was used to confirm these thresholds. Results In terms of swallow subtypes, IES in CLT were a mixture of normal, weak and failed peristalsis in EPT. A DCI of 450 mmHg-s-cm was determined to be optimal in predicting IES. In the validation sample, the threshold of 450 mmHg-s-cm showed strong agreement with CLT determination of IES (positive percent agreement 83%, negative percent agreement 90%). Thirty-three among 42 IEM patients in CLT had large peristaltic breaks, small peristaltic breaks or 'frequent failed peristalsis' in EPT; 87.2% (34/39) of patients classified as normal in CLT had proximal IBC breaks in EPT. The patient-level diagnostic agreement between CLT and EPT was good (78.6% positive percent agreement, 63.9% negative percent agreement), with negative agreement increasing to 92.0% if proximal breaks were excluded. Conclusions The manometric correlate of IEM in EPT is a mixture of failed swallows and IBC breaks in the middle/distal troughs. A DCI value <450 mmHg-s-cm can be utilized to predict IES previously defined in CLT.
IEM can be defined by >5 swallows with weak /failed peristalsis or with a DCI <450 mmHg-s-cm. PMID:22929758
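The proposed definition lends itself to a direct rule: flag a swallow as ineffective when its DCI falls below 450 mmHg-s-cm, and flag IEM when more than 5 of the test swallows are ineffective. A sketch (function names and DCI values are hypothetical):

```python
def classify_swallow(dci):
    """Label a single swallow using the 450 mmHg-s-cm DCI threshold above."""
    return "ineffective" if dci < 450 else "effective"

def has_iem(dci_values, threshold=450, min_count=6):
    """IEM if more than 5 (i.e. at least 6) test swallows are ineffective."""
    return sum(1 for d in dci_values if d < threshold) >= min_count

# Hypothetical DCI values (mmHg-s-cm) for a standard 10-swallow study
swallows = [120, 300, 80, 410, 200, 90, 500, 1200, 150, 60]
print(classify_swallow(swallows[0]))  # ineffective
print(has_iem(swallows))              # True: 8 of 10 swallows fall below 450
```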
Weaving a Formal Methods Education with Problem-Based Learning
NASA Astrophysics Data System (ADS)
Gibson, J. Paul
The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation: how can students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogic technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a university curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation of formal methods even if they do not go on to specialise in them.
The use of the M-Vac® wet-vacuum system as a method for DNA recovery.
Vickar, Toby; Bache, Katherine; Daniel, Barbara; Frascione, Nunzianda
2018-07-01
Collecting sufficient template DNA from a crime scene sample is often challenging, especially with low quantity samples such as touch DNA (tDNA). Traditional DNA collection methods such as double swabbing have limitations, in particular when used on certain substrates which can be found at crime scenes, thus a better collection method is advantageous. Here, the effectiveness of the M-Vac® Wet-Vacuum System is evaluated as a method for DNA recovery on tiles and bricks. It was found that the M-Vac® recovered 75% more DNA than double swabbing on bricks. However, double swabbing collected significantly more DNA than the M-Vac® on tiles. Additionally, it was found that cell-free DNA is lost in the filtration step of M-Vac® collection. In terms of peak height and number of true alleles detected, no significant difference was found between the DNA profiles obtained through M-Vac® collection versus double swabbing of tDNA depositions from 12 volunteers on bricks. The results demonstrate that the M-Vac® has potential for DNA collection from porous surfaces such as bricks, but that alterations to the filter apparatus would be beneficial to increase the amount of genetic material collected for subsequent DNA profiling. These results are anticipated to be a starting point to validate the M-Vac® as a DNA collection device, providing an alternative method when DNA is present on a difficult substrate, or if traditional DNA collection methods have failed. Copyright © 2018 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.
Portfolio: a comprehensive method of assessment for postgraduates in oral and maxillofacial surgery.
Kadagad, Poornima; Kotrashetti, S M
2013-03-01
Postgraduate learning and assessment is an important responsibility of an academic oral and maxillofacial surgeon. The current methods of assessment for postgraduate training include formative evaluation in the form of seminars, case presentations, log books, and infrequently conducted end-of-year theory exams. The end-of-course theory and practical examination is a summative evaluation that awards the degree to the student based on the grades obtained. Oral and maxillofacial surgery is mainly a skill-based specialty, and deliberate practice enhances skill. But the traditional system of assessment of postgraduates emphasizes their performance on the summative exam, which fails to evaluate an integral picture of the student throughout the course. Emphasis on competency and the holistic growth of the postgraduate student during training has in recent years led to research on and evaluation of assessment methods that quantify students' progress during training. The portfolio method of assessment has been proposed as a potentially functional method for postgraduate evaluation. It is defined as a collection of papers and other forms of evidence that learning has taken place. It allows the collation and integration of evidence on competence and performance from different sources to gain a comprehensive picture of everyday practice. The benefits of portfolio assessment in health professions education are twofold: its potential to assess performance and its potential to assess outcomes, such as attitudes and professionalism, that are difficult to assess using traditional instruments. This paper is an endeavor toward the development of a portfolio method of assessment for postgraduate students in oral and maxillofacial surgery.
The Degathering of Society: Implications for Technology and Educators.
ERIC Educational Resources Information Center
Martorella, Peter H.
1996-01-01
Aided by developing technologies, society is moving from its traditional "gathering" pattern (collecting individuals for work, recreation, voting, medical care, and shopping) to a "degathering," decentralized pattern. Alternative schooling formats, such as home schooling, are a likely consequence. Educators who fail to restructure their…
ERIC Educational Resources Information Center
Young, Caprice
2012-01-01
Charter public schools serve a variety of roles in education reform: innovation labs, havens from failing traditional schools, and competitors for public resources. Education leaders have the opportunity to use high quality charter schooling to innovate not only in developing transformative schools but, more importantly, in creating great public…
How to Reconcile the Multiculturalist and Universalist Approaches to Science Education
ERIC Educational Resources Information Center
Hansson, Sven Ove
2018-01-01
The "multiculturalist" and "universalist" approaches to science education both fail to recognize the strong continuities between modern science and its forerunners in traditional societies. Various fact-finding practices in indigenous cultures exhibit the hallmarks of scientific investigations, such as collectively achieved…
Essig, Todd
2012-11-01
Many today suffer from an imbalance between life and life on the screen. When the imbalance is extreme, as with excessive gaming, clinicians retreat to familiar explanations, such as "Internet addiction." But the addiction concept is of limited value, constraining both research and treatment options. This article discusses an alternative. Pathological overuse is seen as a failed solution in which people become entrapped by technology's promise of delivering that which only life can offer, such as the grand adventure simulated in World of Warcraft. A two-part treatment approach to such "simulation entrapment" is described in which both the original problem and the entrapment are treated, the former by traditional psychodynamic psychotherapy and the latter by highlighting differences between technologically mediated experience and the traditional experience of being bodies together. The case of a college student suffering from pathological shame, with excessive gaming as the failed solution, is offered as an illustration. © 2012 Wiley Periodicals, Inc.
Rethinking voluntary euthanasia.
Stoyles, Byron J; Costreie, Sorin
2013-12-01
Our goal in this article is to explicate the way, and the extent to which, euthanasia can be voluntary from both the perspective of the patient and the perspective of the health care providers involved in the patient's care. More significantly, we aim to challenge the way in which those engaged in ongoing philosophical debates regarding the morality of euthanasia draw distinctions between voluntary, involuntary, and nonvoluntary euthanasia on the grounds that drawing the distinctions in the traditional manner (1) fails to reflect what is important from the patient's perspective and (2) fails to reflect the significance of health care providers' interests, including their autonomy and integrity.
A Second Opinion: A Case Narrative on Clinical Ethics Mediation.
Weinstein, Michael S
2015-01-01
Contrasting traditional and common forms of ethics consultation with bioethics mediation, I describe the case of a "second opinion" consultation in the care of a patient with advanced cancer for whom treatment was futile. While the initial ethics consultation, performed by a colleague, led to a recommendation that some may deem ethical, the process failed to involve key stakeholders and failed to explore the underlying values and reasons for the opinions voiced by various stakeholders. The process of mediation ultimately led to creative solutions on which all stakeholders could reach consensus as a plan of care.
Embedding dynamical networks into distributed models
NASA Astrophysics Data System (ADS)
Innocenti, Giacomo; Paoletti, Paolo
2015-07-01
Large networks of interacting dynamical systems are well known for the complex behaviours they are able to display, even when each node features quite simple dynamics. Despite examples of such networks being widespread both in nature and in technological applications, the interplay between the local and the macroscopic behaviour, through the interconnection topology, is still not completely understood. Moreover, traditional analytical methods for dynamical response analysis fail because of the intrinsically large dimension of the phase space of the network, which makes the general problem intractable. Therefore, in this paper we develop an approach aiming to condense all the information into a compact description based on partial differential equations. By focusing on propagative phenomena, rigorous conditions under which the original network dynamical properties can be successfully analysed within the proposed framework are derived as well. A network of FitzHugh-Nagumo systems is finally used to illustrate the effectiveness of the proposed method.
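The kind of system the paper condenses into a PDE description can be illustrated directly. A minimal sketch, not the authors' method, of a ring of diffusively coupled FitzHugh-Nagumo nodes integrated with forward Euler (all parameter values are illustrative):

```python
# Minimal sketch: a ring of diffusively coupled FitzHugh-Nagumo nodes,
# integrated with forward Euler. Parameters are illustrative, not taken
# from the paper.
import numpy as np

def simulate_fhn_ring(n=20, steps=2000, dt=0.05, a=0.7, b=0.8, eps=0.08, k=0.5):
    v = np.random.default_rng(0).uniform(-1, 1, n)  # fast (voltage-like) variable
    w = np.zeros(n)                                  # slow recovery variable
    for _ in range(steps):
        # diffusive coupling to the two ring neighbours (discrete Laplacian)
        lap = np.roll(v, 1) + np.roll(v, -1) - 2 * v
        dv = v - v**3 / 3 - w + k * lap
        dw = eps * (v + a - b * w)
        v, w = v + dt * dv, w + dt * dw
    return v, w

v, w = simulate_fhn_ring()
print("final v range:", v.min(), v.max())
```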
NASA Astrophysics Data System (ADS)
Gomez, John A.; Henderson, Thomas M.; Scuseria, Gustavo E.
2017-11-01
In electronic structure theory, restricted single-reference coupled cluster (CC) captures weak correlation but fails catastrophically under strong correlation. Spin-projected unrestricted Hartree-Fock (SUHF), on the other hand, misses weak correlation but captures a large portion of strong correlation. The theoretical description of many important processes, e.g. molecular dissociation, requires a method capable of accurately capturing both weak and strong correlation simultaneously, and would likely benefit from a combined CC-SUHF approach. Based on what we have recently learned about SUHF written as particle-hole excitations out of a symmetry-adapted reference determinant, we here propose a heuristic CC doubles model to attenuate the dominant spin collective channel of the quadratic terms in the CC equations. Proof of principle results presented here are encouraging and point to several paths forward for improving the method further.
NASA Astrophysics Data System (ADS)
Wang, G. H.; Wang, H. B.; Fan, W. F.; Liu, Y.; Chen, C.
2018-04-01
Because traditional change-detection algorithms depend mainly on the spectral information of image patches and fail to effectively mine and fuse multiple image features, this article borrows ideas from object-oriented analysis to propose a multi-feature-fusion change-detection algorithm for remote sensing images. First, objects are obtained by multi-scale image segmentation; then, a color histogram and a line-gradient histogram are computed for each object. The Earth Mover's Distance (EMD) is used to measure the color distance and the edge line-feature distance between corresponding objects from different dates, and an adaptive weighting scheme combines the color-feature distance and the edge line-feature distance into an object heterogeneity measure. Finally, curvature-histogram analysis of the image patches yields the change-detection result. The experimental results show that the method fully fuses color and edge line features, thus improving the accuracy of the change detection.
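The per-object heterogeneity measure described above can be sketched in miniature. Here SciPy's 1-D Wasserstein distance stands in for the EMD operator, the histograms and the weight `w_color` are illustrative assumptions, and the edge feature is reduced to a scalar:

```python
# Sketch of the per-object change measure described above: EMD between
# colour histograms of the "same" object on two image dates, plus an
# edge-feature distance, combined with a weight. The histograms, edge
# values, and w_color are illustrative; scipy's 1-D Wasserstein
# distance stands in for the EMD operator.
import numpy as np
from scipy.stats import wasserstein_distance

def object_heterogeneity(hist_t1, hist_t2, edge_t1, edge_t2, w_color=0.6):
    bins = np.arange(len(hist_t1))
    d_color = wasserstein_distance(bins, bins, hist_t1, hist_t2)
    d_edge = abs(edge_t1 - edge_t2)  # scalar edge-feature distance
    return w_color * d_color + (1 - w_color) * d_edge

h1 = np.array([0.7, 0.2, 0.1, 0.0])   # colour histogram, date 1
h2 = np.array([0.1, 0.1, 0.3, 0.5])   # colour histogram, date 2
print(object_heterogeneity(h1, h2, edge_t1=0.4, edge_t2=0.9))
```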
Charles, Eric P; Rivera, Susan M
2009-11-01
Piaget proposed that understanding permanency, understanding occlusion events, and forming mental representations were synonymous; however, accumulating evidence indicates that those concepts are not unified in development. Infants reach for endarkened objects at younger ages than for occluded objects, and infants' looking patterns suggest that they expect occluded objects to reappear at younger ages than they reach for them. We reaffirm the latter finding in 5- to 6-month-olds and find similar responses to faded objects, but we fail to find that pattern in response to endarkened objects. This suggests that looking behavior and reaching behavior are both sensitive to method of disappearance, but with opposite effects. Current cognition-oriented (i.e. representation-oriented) explanations of looking behavior cannot easily accommodate these results; neither can perceptual-preference explanations, nor the traditional ecological reinterpretations of object permanence. A revised ecological hypothesis, invoking affordance learning, suggests how these differences could arise developmentally.
Recent advances in computational structural reliability analysis methods
NASA Astrophysics Data System (ADS)
Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.
1993-10-01
The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
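The contrast drawn above between safety factors and quantified reliability can be illustrated with the simplest probabilistic tool: a Monte Carlo estimate of the failure probability for a single limit state g = strength - load. The normal distributions below are illustrative, not from the paper:

```python
# Minimal Monte Carlo estimate of a failure probability for one limit
# state, g = strength - load < 0. The distributions are illustrative.
import random

def estimate_pf(n=100_000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        strength = rng.gauss(30.0, 3.0)   # capacity, N(30, 3)
        load = rng.gauss(20.0, 3.0)       # demand, N(20, 3)
        if strength - load < 0:           # limit state violated
            failures += 1
    return failures / n

pf = estimate_pf()
print("estimated failure probability:", pf)
```

For these illustrative distributions the margin g is N(10, sqrt(18)), so the estimate should land near the analytic value of about 0.009, something a fixed safety factor alone never reveals.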
Due Diligence Processes for Public Acquisition of Mining-Impacted Landscapes
NASA Astrophysics Data System (ADS)
Martin, E.; Monohan, C.; Keeble-Toll, A. K.
2016-12-01
The acquisition of public land is critical for achieving conservation and habitat goals in rural regions projected to experience continuously high rates of population growth. To ensure that public funds are used responsibly in the purchase of conservation easements, due diligence processes must be established that limit landowner liability post-acquisition. Traditional methods of characterizing contamination in regions where legacy mining activities were prevalent may not draw on current scientific knowledge of contaminant fate, transport, and bioavailability, and are therefore prone to Type II error. Agency-prescribed assessment methods used under CERCLA in many cases fail to detect contamination that presents liability issues, because they do not require the water quality sampling that would reveal the offsite transport potential of contaminants posing human health risks, including mercury. Historical analysis can be used to inform judgmental sampling that identifies hotspots and contaminants of concern. Land acquisition projects at two historic mine sites in Nevada County, California, the Champion Mine Complex and the Black Swan Preserve, have established the necessity of rethinking due diligence processes for mining-impacted landscapes. These pilot projects demonstrate that pre-acquisition assessment in the Gold Country must include judgmental sampling and evaluation of contaminant transport. Best practices based on current scientific knowledge must be codified by agencies, consultants, and NGOs in order to ensure responsible use of public funds and to safeguard public health.
Rapid evolution in lekking grouse: Implications for taxonomic definitions
Oyler-McCance, Sara J.; St. John, Judy; Quinn, Thomas W.
2010-01-01
Species and subspecies delineations were traditionally defined by morphological and behavioral traits, as well as by plumage characteristics. Molecular genetic data have more recently been used to assess these classifications and, in many cases, to redefine them. The recent practice of utilizing molecular genetic data to examine taxonomic questions has led some to suggest that molecular genetic methods are more appropriate than traditional methods for addressing taxonomic uncertainty and management units. We compared the North American Tetraoninae—which have been defined using plumage, morphology, and behavior—and considered the effects of redefinition using only neutral molecular genetic data (mitochondrial control region and cytochrome oxidase subunit 1). Using the criterion of reciprocal monophyly, we failed to recognize the five species whose mating system is highly polygynous, with males displaying on leks. In lek-breeding species, sexual selection can act to influence morphological and behavioral traits at a rate much faster than can be tracked genetically. Thus, we suggest that at least for lek-breeding species, it is important to recognize the possibility that morphological and behavioral changes may occur at an accelerated rate compared with the processes that led to reciprocal monophyly of putatively neutral genetic markers. Therefore, it is particularly important to consider the possible disconnect between such lines of evidence when making taxonomic revisions and definitions of management units.
Generalization of the Poincare sphere to process 2D displacement signals
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Lamberti, Luciano
2017-06-01
Traditionally the multiple phase method has been considered an essential tool for phase information recovery. The in-quadrature phase method, theoretically an alternative pathway to the same goal, failed in actual applications. In a previous paper dealing with 1D signals, the authors showed that, properly implemented, the in-quadrature method yields phase values with the same accuracy as the multiple phase method. The present paper extends the methodology developed in 1D to 2D. This extension is not a straightforward process and requires the introduction of a number of additional concepts and developments. The concept of the monogenic function provides the tools required for the extension. The monogenic function has a graphic representation through the Poincare sphere, familiar in the field of photoelasticity and, through the developments introduced in this paper, connected to the analysis of displacement fringe patterns. The paper is illustrated with examples of application showing that the multiple phase method and the in-quadrature method are two aspects of the same basic theoretical model.
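In 1D, the in-quadrature idea the abstract builds on amounts to recovering the phase as atan2(Q, I) from two signals 90 degrees apart. A minimal synthetic sketch (not the authors' 2D monogenic-function method):

```python
# Minimal 1-D sketch of in-quadrature phase recovery: given a fringe
# signal I ~ cos(phi) and its quadrature Q ~ sin(phi), the wrapped
# phase is atan2(Q, I). The test signal is synthetic.
import numpy as np

x = np.linspace(0, 1, 500)
phi_true = 6 * np.pi * x**2           # nonlinear phase to recover
i_sig = np.cos(phi_true)              # in-phase fringe signal
q_sig = np.sin(phi_true)              # quadrature signal
phi_wrapped = np.arctan2(q_sig, i_sig)
phi_rec = np.unwrap(phi_wrapped)      # remove the 2*pi jumps
print("max recovery error:", np.max(np.abs(phi_rec - phi_true)))
```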
NASA Astrophysics Data System (ADS)
Scheffold, Frank
2014-08-01
To characterize the structural and dynamic properties of soft materials and small particles, information on the relevant mesoscopic length scales is required. Such information is often obtained from traditional static and dynamic light scattering (SLS/DLS) experiments in the single scattering regime. In many dense systems, however, these powerful techniques frequently fail due to strong multiple scattering of light. Here I will discuss some experimental innovations that have emerged over the last decade. New methods such as 3D static and dynamic light scattering (3D LS) as well as diffusing wave spectroscopy (DWS) can cover a much extended range of experimental parameters ranging from dilute polymer solutions, colloidal suspensions to extremely opaque viscoelastic emulsions.
Teaching Conversation in the Second Language Classroom: Problems and Prospects.
ERIC Educational Resources Information Center
Sze, Paul
1995-01-01
Suggests principles and activities for the development of conversational competence in second-language learners. Shows that materials and activities traditionally used in language teaching fail to address the interactional dimension of conversation. Draws on conversational analysis, classroom discourse, and communicative competence to create a…
Reap & Sow: A Mind Consist of Opinions Waiting to Emerge from Rejection
ERIC Educational Resources Information Center
Brown, Angela
2014-01-01
The American education system is a complicated form of traditional biases that restrain personal interest to pursue an equal education. Our children cannot endure the public scrutiny that is failing in educational institutions, but should be replaced with a provision of hope.
Project Ranger Curriculum Guide.
ERIC Educational Resources Information Center
Fox, Carla; And Others
The objective of Project Ranger is to improve school behavior and academic performance of selected, primarily "disruptive," students who are failing in the traditional school program. The Ranger curriculum uses the outdoor environment as the medium for improving student self-concept and relations with peers and adults and for providing…
Establishing a Community of Discourse through Social Norms
ERIC Educational Resources Information Center
Mullins, Sara Brooke
2018-01-01
While researchers, educators, state and national organizations, and policy makers are taking strides to help transform traditional mathematics classrooms into inquiry-based classrooms, they fail to address how to bridge the gap between creating discussions to developing mathematical discourse. One key component for producing inquiry-based…
Modernizing Career and Technical Education Programs
ERIC Educational Resources Information Center
Drage, Karen
2009-01-01
High-quality career and technical education (CTE) programs can launch America's future competitiveness through increased student engagement, the innovative integration of traditional academic courses, and by meeting the needs of both employers and the economy as a whole. American students failing to keep pace with their international counterparts…
Rational Methods for the Selection of Diverse Screening Compounds
Huggins, David J.; Venkitaraman, Ashok R.; Spring, David R.
2016-01-01
Traditionally a pursuit of large pharmaceutical companies, high-throughput screening assays are becoming increasingly common within academic and government laboratories. This shift has been instrumental in enabling projects that have not been commercially viable, such as chemical probe discovery and screening against high risk targets. Once an assay has been prepared and validated, it must be fed with screening compounds. Crafting a successful collection of small molecules for screening poses a significant challenge. An optimized collection will minimize false positives whilst maximizing hit rates of compounds that are amenable to lead generation and optimization. Without due consideration of the relevant protein targets and the downstream screening assays, compound filtering and selection can fail to explore the full extent of chemical diversity and can eschew valuable novelty. Herein, we discuss the different factors to be considered and the methods that may be employed when assembling a structurally diverse compound screening collection. Rational methods for selecting diverse chemical libraries are essential for their effective use in high-throughput screens. PMID:21261294
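One widely used rational method for assembling a diverse screening collection, not necessarily the one discussed here, is MaxMin picking: greedily add the compound farthest from everything already chosen. A sketch with 2-D points standing in for fingerprint descriptor space:

```python
# Sketch of MaxMin diversity picking: at each step, add the compound
# whose minimum distance to the already-chosen set is largest. The 2-D
# "descriptor" points are illustrative stand-ins for fingerprint space.
import math

def maxmin_pick(points, k):
    chosen = [0]                       # seed with the first compound
    while len(chosen) < k:
        best, best_d = None, -1.0
        for i, p in enumerate(points):
            if i in chosen:
                continue
            d = min(math.dist(p, points[j]) for j in chosen)
            if d > best_d:             # farthest from everything chosen
                best, best_d = i, d
        chosen.append(best)
    return chosen

descriptors = [(0, 0), (0.1, 0), (5, 5), (5.1, 5), (0, 5), (5, 0)]
print(maxmin_pick(descriptors, 4))
```

Note how the two near-duplicate pairs each contribute only one pick: the greedy rule spreads the selection across descriptor space instead of clustering it.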
Replacement of seam welded hot reheat pipe using narrow groove GTA machine welding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richardson, R.R.; Yanes, J.; Bryant, R.
1995-12-31
Southern California Edison, recognizing a potential safety concern, scrutinized its existing seam welded hot reheat pipe, manufactured by the same supplier as that which failed. Alternatives for dealing with the installed seam welded pipe were narrowed to two: replacement with seamless pipe or increasing the frequency of the inspection program. The overriding consideration, however, was safety. Although increased inspection was much less costly, pipe replacement was chosen due to potential safety concerns with seam welded pipe even under more frequent inspection. The utility company then proceeded to determine the most effective method to complete this work. Analysis showed machine-made (automatic) gas tungsten arc welds (GTAW) to be the method of choice due to cleanliness and superior mechanical properties. In conjunction with this method, the narrow groove (3° bevel) weld joint, as opposed to the traditional groove (37.5° bevel), was shown to provide significant technical advantages.
Bhavnani, Suresh K.; Chen, Tianlong; Ayyaswamy, Archana; Visweswaran, Shyam; Bellala, Gowtham; Divekar, Rohit; Bassler, Kevin E.
2017-01-01
A primary goal of precision medicine is to identify patient subgroups based on their characteristics (e.g., comorbidities or genes) with the goal of designing more targeted interventions. While network visualization methods such as Fruchterman-Reingold have been used to successfully identify such patient subgroups in small to medium sized data sets, they often fail to reveal comprehensible visual patterns in large and dense networks despite having significant clustering. We therefore developed an algorithm called ExplodeLayout, which exploits the existence of significant clusters in bipartite networks to automatically “explode” a traditional network layout with the goal of separating overlapping clusters, while at the same time preserving key network topological properties that are critical for the comprehension of patient subgroups. We demonstrate the utility of ExplodeLayout by visualizing a large dataset extracted from Medicare consisting of readmitted hip-fracture patients and their comorbidities, demonstrate its statistically significant improvement over a traditional layout algorithm, and discuss how the resulting network visualization enabled clinicians to infer mechanisms precipitating hospital readmission in specific patient subgroups. PMID:28815099
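The core "explode" idea, separating overlapping clusters while preserving within-cluster geometry, can be sketched as a radial shift of each cluster away from the global centroid. This is an illustration of the idea only, not the authors' ExplodeLayout algorithm:

```python
# Sketch of the "explode" idea: translate each cluster away from the
# global centroid so clusters stop overlapping, while positions
# *within* each cluster are preserved. Illustrative, not the authors'
# ExplodeLayout algorithm.

def explode(positions, labels, factor=2.0):
    """positions: {node: (x, y)}; labels: {node: cluster_id}."""
    cx = sum(x for x, _ in positions.values()) / len(positions)
    cy = sum(y for _, y in positions.values()) / len(positions)
    # centroid of each cluster
    groups = {}
    for n, c in labels.items():
        groups.setdefault(c, []).append(positions[n])
    centroids = {c: (sum(p[0] for p in ps) / len(ps),
                     sum(p[1] for p in ps) / len(ps))
                 for c, ps in groups.items()}
    out = {}
    for n, (x, y) in positions.items():
        mx, my = centroids[labels[n]]
        # shift the whole cluster outward along its centroid direction
        out[n] = (x + (factor - 1) * (mx - cx), y + (factor - 1) * (my - cy))
    return out

pos = {"a": (0, 0), "b": (1, 0), "c": (4, 0), "d": (5, 0)}
new = explode(pos, {"a": 0, "b": 0, "c": 1, "d": 1})
print(new)
```

Within-cluster distances (a-b and c-d) are unchanged, while the gap between the two clusters widens, which is the property the paper argues keeps subgroups comprehensible.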
Accurately modeling Gaussian beam propagation in the context of Monte Carlo techniques
NASA Astrophysics Data System (ADS)
Hokr, Brett H.; Winblad, Aidan; Bixler, Joel N.; Elpers, Gabriel; Zollars, Byron; Scully, Marlan O.; Yakovlev, Vladislav V.; Thomas, Robert J.
2016-03-01
Monte Carlo simulations are widely considered to be the gold standard for studying the propagation of light in turbid media. However, traditional Monte Carlo methods fail to account for diffraction because they treat light as a particle. This results in converging beams focusing to a point instead of a diffraction-limited spot, greatly affecting the accuracy of Monte Carlo simulations near the focal plane. Here, we present a technique capable of simulating a focusing beam in accordance with the rules of Gaussian optics, resulting in a diffraction-limited focal spot. This technique can be easily implemented in any traditional Monte Carlo simulation, allowing existing models to be converted to include accurate focusing geometries with minimal effort. We will present results for a focusing beam in a layered tissue model, demonstrating that for different scenarios the region of highest intensity, and thus the greatest heating, can change from the surface to the focus. The ability to simulate accurate focusing geometries will greatly enhance the usefulness of Monte Carlo for countless applications, including studying laser-tissue interactions in medical applications and light propagation through turbid media.
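One common way to implement a diffraction-aware launch, offered here as a hedged sketch rather than the authors' exact technique, is to sample each photon's entry point from the beam profile and aim it at a point drawn from the diffraction-limited waist at the focal plane, instead of at a single geometric focus:

```python
# Sketch: launch Monte Carlo photons as a focused Gaussian beam by
# sampling an entry point from the beam profile at the surface and
# aiming at a point drawn from a finite waist at the focal plane
# (rather than at one geometric focus). Parameters are illustrative.
import random, math

def launch_photon(rng, beam_radius=1.0, waist_radius=0.01, focal_depth=5.0):
    # entry position at z = 0, Gaussian beam cross-section
    x0 = rng.gauss(0.0, beam_radius / 2)
    y0 = rng.gauss(0.0, beam_radius / 2)
    # target position at the focal plane, Gaussian waist
    xf = rng.gauss(0.0, waist_radius / 2)
    yf = rng.gauss(0.0, waist_radius / 2)
    dx, dy, dz = xf - x0, yf - y0, focal_depth
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (x0, y0, 0.0), (dx / norm, dy / norm, dz / norm)

rng = random.Random(7)
photons = [launch_photon(rng) for _ in range(10_000)]
# x coordinate where each (unscattered) photon crosses the focal plane
land = [o[0] + d[0] * (5.0 / d[2]) for o, d in photons]
mean_abs = sum(abs(v) for v in land) / len(land)
print("mean |x| at focus:", mean_abs)
```

In free space the photons converge to a finite spot set by `waist_radius` rather than a point, which is the behavior the abstract describes near the focal plane.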
NASA Astrophysics Data System (ADS)
Barak, Miri; Harward, Judson; Kocur, George; Lerman, Steven
2007-08-01
Within the framework of MIT's course 1.00: Introduction to Computers and Engineering Problem Solving, this paper describes an innovative project entitled Studio 1.00 that integrates lectures with in-class demonstrations, active learning sessions, and on-task feedback, through the use of wireless laptop computers. This paper also describes a related evaluation study that investigated the effectiveness of different instructional strategies, comparing traditional teaching with two models of the studio format. Students' learning outcomes, specifically their final grades and conceptual understanding of computational methods and programming, were examined. Findings indicated that Studio 1.00, in both its extensive and partial active-learning modes, enhanced students' learning outcomes in Java programming. Compared to the traditional courses, more students in the studio courses received "A" as their final grade and fewer failed. Moreover, students who regularly attended the active learning sessions were able to conceptualize programming principles better than their peers. We also found two weaknesses in the teaching format of Studio 1.00 that can guide future versions of the course.
Analysis of longitudinal seismic response of bridge with magneto-rheological elastomeric bearings
NASA Astrophysics Data System (ADS)
Li, Rui; Li, Xi; Wu, Yueyuan; Chen, Shiwei; Wang, Xiaojie
2016-04-01
As the weakest part of the bridge system, the traditional bridge bearing is incapable of isolating impact loads such as earthquakes. A magneto-rheological elastomeric bearing (MRB) with adjustable stiffness and damping parameters is designed, tested, and modeled. A developed Bouc-Wen model is adopted to represent the constitutive relation and force-displacement behavior of an MRB. Then, the lead rubber bearing (LRB), the passive MRB, and the controllable MRB are modeled by the finite element method (FEM). Furthermore, two typical seismic waves are adopted as inputs for the bridge seismic-response isolation system. Experiments are carried out to investigate the responses along the bridge with on-off controlled MRBs. The results show that the isolating performance of the MRB is similar to that of the traditional LRB, which ensures the fail-safe capability of a bridge with MRBs under seismic excitation. In addition, the controllable bridge with MRBs demonstrates an advantage in isolating capacity and energy dissipation: it reduces the peak acceleration of the bridge beam by 33.3% and the bearing displacement by 34.1%. The shear force at the pier top is also alleviated.
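The Bouc-Wen element mentioned above can be sketched with a forward-Euler integration; all parameter values here are illustrative, not those identified for the MRB:

```python
# Sketch of a Bouc-Wen hysteresis element of the kind used to model
# the MRB force-displacement loop. Parameters are illustrative.
import math

def bouc_wen_force(displacement, dt=1e-3, alpha=0.5, k=1.0, A=1.0,
                   beta=0.5, gamma=0.5, n=1):
    """Return the hysteretic restoring force for a displacement series."""
    z, forces = 0.0, []
    prev = displacement[0]
    for x in displacement:
        v = (x - prev) / dt                     # velocity, finite difference
        # evolution of the hysteretic internal variable z
        dz = A * v - beta * abs(v) * abs(z) ** (n - 1) * z - gamma * v * abs(z) ** n
        z += dt * dz
        # force = elastic part + hysteretic part
        forces.append(alpha * k * x + (1 - alpha) * k * z)
        prev = x
    return forces

t = [i * 1e-3 for i in range(4000)]
x = [0.5 * math.sin(2 * math.pi * tt) for tt in t]   # sinusoidal displacement
f = bouc_wen_force(x)
print("force range:", min(f), max(f))
```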
Recovery of failed solid-state anaerobic digesters.
Yang, Liangcheng; Ge, Xumeng; Li, Yebo
2016-08-01
This study examined the performance of three methods for recovering failed solid-state anaerobic digesters. The 9-L digesters, which were fed with corn stover, failed at a feedstock/inoculum (F/I) ratio of 10 with negligible methane yields. To recover the systems, inoculum was added to bring the F/I ratio to 4. Inoculum was either added to the top of a failed digester, injected into it, or well-mixed with the existing feedstock. Digesters using the top-addition and injection methods quickly resumed and achieved peak yields in 10 days, while digesters using the well-mixed method recovered slowly but showed 50% higher peak yields. Overall, these methods recovered 30-40% methane from failed digesters. The well-mixed method showed the highest methane yield, followed by the injection and top-addition methods. Recovered digesters outperformed digesters that had maintained a constant F/I ratio of 4. Slow mass transfer and slow growth of microbes were believed to be the major limiting factors for recovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
2010-04-01
aluminum titanate has evolved from a coefficient of thermal expansion (CTE) lowering additive in traditional nickel/YSZ cermets to an anchoring...volumetric concentrations well below percolation for traditional cermets. The coarsening of nickel after high temperature thermal treatment poses
NASA Astrophysics Data System (ADS)
Ji, S.; Yuan, X.
2016-06-01
A generic probabilistic model, based on fundamental Bayes' rule and the Markov assumption, is introduced to integrate the process of mobile platform localization with optical sensors. Based on it, three relatively independent solutions (bundle adjustment, Kalman filtering and particle filtering) are deduced under different additional restrictions. We aim to show, first, that Kalman filtering may be a better supplier of initial values for bundle adjustment than traditional relative orientation in irregular strips and networks, or when tie-point extraction fails. Second, in highly noisy conditions, particle filtering can act as a bridge for gap binding when a large number of gross errors cause a Kalman filter or a bundle adjustment to fail. Third, both filtering methods, which help reduce error propagation and eliminate gross errors, support a global and static bundle adjustment, which requires the strictest initial values and control conditions. The main innovation is the integrated processing of stochastic errors and gross errors in sensor observations, and the integration of the three most used solutions, bundle adjustment, Kalman filtering and particle filtering, into a generic probabilistic localization model. Tests in noisy and restricted situations are designed and examined to verify these claims.
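A minimal sketch of the scalar Kalman filtering step discussed above, assuming a constant-state model with invented noise parameters rather than the paper's full localization model:

```python
# Minimal 1-D Kalman filter: estimate a slowly varying scalar state from
# noisy measurements. q, r, x0, p0 are illustrative tuning values.
def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                  # predict: state assumed constant, add process noise
        k = p / (p + r)         # Kalman gain balances prior vs. measurement
        x += k * (z - x)        # update state with the measurement residual
        p *= (1 - k)            # shrink posterior covariance
        estimates.append(x)
    return estimates

# Noisy readings around a true value of 1.0 are smoothed toward it.
est = kalman_1d([1.2, 0.8, 1.1, 0.9, 1.05])
```

The recursive update is what makes filtering attractive as an initial-value supplier: each estimate is available online, before any global adjustment is run.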
Towards a Framework for Evaluating and Comparing Diagnosis Algorithms
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Kuhn, Lukas; de Kleer, Johan; van Gemund, Arjan; Feldman, Alexander
2009-01-01
Diagnostic inference involves the detection of anomalous system behavior and the identification of its cause, possibly down to a failed unit or to a parameter of a failed unit. Traditional approaches to solving this problem include expert/rule-based, model-based, and data-driven methods. Each approach (and the various techniques within each approach) uses different representations of the knowledge required to perform the diagnosis. The sensor data is expected to be combined with these internal representations to produce the diagnosis result. In spite of the availability of various diagnosis technologies, there have been only minimal efforts to develop a standardized software framework to run, evaluate, and compare different diagnosis technologies on the same system. This paper presents a framework that defines a standardized representation of the system knowledge, the sensor data, and the form of the diagnosis results, and provides a run-time architecture that can execute diagnosis algorithms, send sensor data to the algorithms at appropriate time steps from a variety of sources (including the actual physical system), and collect the resulting diagnoses. We also define a set of metrics that can be used to evaluate and compare the performance of the algorithms, and provide software to calculate the metrics.
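The kind of per-run metrics such a framework might aggregate can be illustrated as follows; the record format, metric names, and fault labels here are invented for illustration and are not the framework's actual schema.

```python
# Hypothetical diagnosis-benchmark scoring: each run records when a fault was
# injected, when (if ever) the algorithm detected it, and what it diagnosed.
def score_runs(runs):
    """runs: list of (fault_injected_at, detected_at or None, diagnosed, truth)."""
    latencies = [d - f for f, d, _, _ in runs if d is not None]
    detected = sum(1 for _, d, _, _ in runs if d is not None)
    correct = sum(1 for _, d, g, t in runs if d is not None and g == t)
    return {
        "detection_rate": detected / len(runs),
        "mean_latency": sum(latencies) / len(latencies) if latencies else None,
        "diagnosis_accuracy": correct / detected if detected else None,
    }

runs = [
    (10.0, 12.5, "valve_stuck", "valve_stuck"),   # detected, correct diagnosis
    (20.0, 21.0, "sensor_bias", "valve_stuck"),   # detected, wrong diagnosis
    (5.0, None, None, "pump_fail"),               # missed detection
]
metrics = score_runs(runs)
```

Separating detection metrics (rate, latency) from isolation metrics (accuracy of the named cause) lets algorithms that are fast but imprecise be distinguished from slow but accurate ones.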
Becoming Chemists through Game-Based Inquiry Learning: The Case of "Legends of Alkhimia"
ERIC Educational Resources Information Center
Chee, Yam San; Tan, Kim Chwee Daniel
2012-01-01
Traditional modes of chemistry education in schools focus on imparting chemistry knowledge to students via instruction. Consequently, students often acquire the mistaken understanding that scientific knowledge comprises a fixed body of "proven" facts. They fail to comprehend that the construction of scientific understanding is a human…
Service-Learning: Critical Traditions and Geographic Pedagogy
ERIC Educational Resources Information Center
Grabbatin, Brian; Fickey, Amanda
2012-01-01
The rise of service-learning in higher education has been critiqued as little more than community service that encourages students to "do good," but fails to generate original scholarship or social change. In this article, we argue that service-learning gives geographers the opportunity to challenge these critiques, by demonstrating the practical…
Educating for Service: Black Studies for Premeds.
ERIC Educational Resources Information Center
Henderson, Algo; Gumas, Natalie
Traditional undergraduate liberal arts courses, required of most premedical and predental students, have failed dismally to motivate doctors and dentists to become concerned with the health problems of the poor, be they black or white. Examination of black studies programs leads the authors to believe that these programs, if planned with the…
A public utility model for managing public land recreation enterprises.
Tom Quinn
2002-01-01
Through review of relevant economic principles and judicial precedent, a case is made that public-land recreation enterprises are analogous to traditionally recognized public utilities. Given the historical concern over the societal value of recreation and associated pricing issues, public-land management policies failing to acknowledge these utility-like...
Pass-Fail Grading: Laying the Foundation for Self-Regulated Learning
ERIC Educational Resources Information Center
White, Casey B.; Fantone, Joseph C.
2010-01-01
Traditionally, medical schools have tended to make assumptions that students will "automatically" engage in self-education effectively after graduation and subsequent training in residency and fellowships. In reality, the majority of medical graduates out in practice feel unprepared for learning on their own. Many medical schools are now adopting…
Abstract Numeric Relations and the Visual Structure of Algebra
ERIC Educational Resources Information Center
Landy, David; Brookes, David; Smout, Ryan
2014-01-01
Formal algebras are among the most powerful and general mechanisms for expressing quantitative relational statements; yet, even university engineering students, who are relatively proficient with algebraic manipulation, struggle with and often fail to correctly deploy basic aspects of algebraic notation (Clement, 1982). In the cognitive tradition,…
Community Organizing as an Education Reform Strategy
ERIC Educational Resources Information Center
Renee, Michelle; McAlister, Sara
2011-01-01
Community organizing for school reform offers an urgently needed alternative to traditional approaches to school change. Many current reforms fail to thrive due to lack of trust, understanding, or cultural relevance to the community being targeted. The high turnover of reformers (superintendents, principals, or outside organizations) in high-need…
ERIC Educational Resources Information Center
Mills, Allan
2010-01-01
The extreme pressures that are generated when water freezes were traditionally demonstrated by sealing a small volume in a massive cast iron "bomb" and then surrounding it with a freezing mixture of ice and salt. This vessel would dramatically fail by brittle fracture, but no quantitative measurement of bursting pressure was available. Calculation…
Literature Review of Residents as Teachers from an Adult Learning Perspective
ERIC Educational Resources Information Center
Blanchard, Rebecca D.; Hinchey, Kevin T.; Bennett, Elisabeth E.
2011-01-01
Academic medical centers represent the intersection of higher education and workforce development. However residents often utilize traditional pedagogical approaches learned from higher education settings that fail to translate with adult learners. The purpose of this study is to synthesize literature on resident teachers from the perspective of…
The Working Poor and the Community College.
ERIC Educational Resources Information Center
Rosow, La Vergne
1994-01-01
Profiles Mike, a middle-aged Comanche who was preliterate when beginning the author's class in English as a Second Language. Although traditional schooling had failed him, Mike learned enough English to become a prolific writer and translator of North American poetry. By raising tuition and standards, the California community college system will…
Importing Leaders for School Turnarounds: Lessons and Opportunities
ERIC Educational Resources Information Center
Kowal, Julie; Hassel, Emily Ayscue
2011-01-01
One of the biggest challenges in education today is identifying talented candidates to successfully lead turnarounds of persistently low-achieving schools. Evidence suggests that the traditional principal pool is already stretched to capacity and cannot supply enough leaders to fix failing schools. But potentially thousands of leaders capable of…
Rhetoric, Alchemy and Heuristic Procedures: Some Epistemological Questions.
ERIC Educational Resources Information Center
Edwards, Bruce L., Jr.
Although composition theorists have rejected the "alchemy" of the traditional, impressionistic, unscientific view of composition as product in favor of a process oriented approach, they have as yet failed to reach a consensus about the epistemological bases that undergird this shift. An examination of parallel developments in literary…
ERIC Educational Resources Information Center
Williamson, Ronald; Blackburn, Barbara R.
2009-01-01
Educators know that something needs to change; they analyze data, build a plan, and provide professional development, yet little changes. Often that is because they fail to take into account the culture of their schools. Culture reflects the complex set of values, traditions, assumptions, and patterns of behavior that are present in a school.…
College in the Media: The Relationship between Repeated Exposure and College Expectations
ERIC Educational Resources Information Center
Nuñez, Roland
2018-01-01
Media consumption can influence viewer perceptions and attitudes. Recent research on media's effect on college students has failed to address gender differences. Using Mere Repeated Exposure Theory (traditionally used in marketing research), this study aims to answer three research questions regarding college media consumption and college…
Overload and Boredom: When the Humanities Turn to Noise.
ERIC Educational Resources Information Center
Walter, James A.
This paper argues that the current debate over humanities curricula has failed to articulate a vision for humanities education because it has turned on a liberal/conservative axis. It also contends that educators must stop debating elite versus democratic values or traditional versus contemporary problems in humanities education, and start…
Climate optimized planting windows for cotton in the lower Mississippi Delta region
USDA-ARS?s Scientific Manuscript database
Unique, variable summer climate of the lower Mississippi Delta region poses a critical challenge to cotton producers in deciding when to plant for optimized production. Traditional 2- to 4-year agronomic field trials conducted in this area fail to capture the effects of long-term climate variabiliti...
Implementing a Tactical Approach through Action Research
ERIC Educational Resources Information Center
Gubacs-Collins, Klara
2007-01-01
Background: Influenced by the original observations of Bunker and Thorpe, physical education theorists began to question the effectiveness of a traditional model for teaching games and have increasingly begun to believe that concentrating only on specific motor responses (techniques) fails to take into account the contextual nature of games. Games…
Counter Synthesis: A Critical Approach to Social Movements.
ERIC Educational Resources Information Center
Harral, Harriet Briscoe
Modern society is an ever-changing pattern of pressure and counter pressure. A social movement in conflict with the dominant society is labeled with easily identifiable stereotypes--bearded long hairs, bra burners, terrorists, bleeding hearts, uneducated rednecks. Traditional criticism has failed to say anything meaningful about social movements…
Pieniak, Zuzanna; Verbeke, Wim; Vanhonacker, Filiep; Guerrero, Luis; Hersleth, Margrethe
2009-08-01
This study investigates the association between traditional food consumption and motives for food choice in six European countries. Cross-sectional data were collected through the TRUEFOOD pan-European consumer survey (n = 4828) with samples representative of age, gender and region in Belgium, France, Italy, Norway, Poland and Spain. Importance attached to familiarity with a product is found to be strongly and positively associated with general attitude toward traditional food as well as traditional food consumption. The importance attached to convenience was negatively related to both general attitude toward traditional food and traditional food consumption, while the importance of weight control negatively influenced the general attitude. Natural content of food was positively associated with the attitude toward traditional food and traditional food consumption. The importance of price when purchasing food failed to be significantly related to general attitude and traditional food consumption, both for the pooled sample and within each country, except in Spain. The proposed model contributes to a better understanding of factors shaping the image and influencing the consumption of traditional foods in Europe. General attitude toward traditional foods, familiarity, and importance of food naturalness emerged as drivers of traditional food consumption. Importance attached to convenience and health acted as direct barriers to traditional food consumption, whereas importance of weight control emerged as an indirect barrier through lowering general attitude toward traditional foods.
Pass-fail grading: laying the foundation for self-regulated learning.
White, Casey B; Fantone, Joseph C
2010-10-01
Traditionally, medical schools have tended to make assumptions that students will "automatically" engage in self-education effectively after graduation and subsequent training in residency and fellowships. In reality, the majority of medical graduates out in practice feel unprepared for learning on their own. Many medical schools are now adopting strategies and pedagogies to help students become self-regulating learners. Along with these changes in practices and pedagogy, many schools are eliminating a cornerstone of extrinsic motivation: discriminating grades. To study the effects of the switch from discriminating to pass-fail grading in the second year of medical school, we compared internal and external assessments and evaluations for a second-year class with a discriminating grading scale (Honors, High Pass, Pass, Fail) and for a second-year class with a pass-fail grading scale. Of the measures we compared (MCATs, GPAs, means on second-year examinations, USMLE Step 1 scores, and residency placement), none showed statistically significant changes; the only statistically significant decreases (lower performance with pass-fail) were found in two of the second-year courses, and performance in one other course improved significantly. Pass-fail grading can meet several important intended outcomes, including "leveling the playing field" for incoming students with different academic backgrounds, reducing competition and fostering collaboration among members of a class, and allowing more time for extracurricular interests and personal activities. Pass-fail grading also fosters intrinsic motivation, which is key to self-regulated, lifelong learning.
Forecasting VaR and ES of stock index portfolio: A Vine copula method
NASA Astrophysics Data System (ADS)
Zhang, Bangzheng; Wei, Yu; Yu, Jiang; Lai, Xiaodong; Peng, Zhenfeng
2014-12-01
Risk measurement has both theoretical and practical significance in risk management. Using a daily sample of 10 international stock indices, this paper first models the internal structures among different stock markets with C-Vine, D-Vine and R-Vine copula models. Second, the Value-at-Risk (VaR) and Expected Shortfall (ES) of the international stock markets portfolio are forecasted using a Monte Carlo method based on the estimated dependence of the different Vine copulas. Finally, the accuracy of the VaR and ES measurements obtained from the different statistical models is evaluated by UC, IND, CC and posterior analysis. The empirical results show that the VaR forecasts at the quantile levels of 0.9, 0.95, 0.975 and 0.99 with the three kinds of Vine copula models are sufficiently accurate. Several traditional methods, such as historical simulation, mean-variance and DCC-GARCH models, fail to pass the CC backtesting. The Vine copula methods can accurately forecast the ES of the portfolio on the basis of the VaR measurement, and the D-Vine copula model is superior to the other Vine copulas.
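The quantile definitions of VaR and ES used above can be sketched with plain Monte Carlo sampling. This uses an independent Gaussian toy portfolio, not the paper's Vine-copula dependence models.

```python
# VaR/ES sketch: VaR is the loss quantile at the given confidence level;
# ES is the mean loss beyond VaR. Returns here are simulated i.i.d. Gaussians.
import random

def var_es(returns, level=0.95):
    losses = sorted(-r for r in returns)   # losses expressed as positive numbers
    idx = int(level * len(losses))
    var = losses[idx]                      # empirical quantile
    tail = losses[idx:]                    # losses at or beyond VaR
    es = sum(tail) / len(tail)             # average tail loss
    return var, es

random.seed(0)
sims = [random.gauss(0.0, 0.01) for _ in range(100_000)]
var95, es95 = var_es(sims, 0.95)
```

By construction ES is at least VaR, which is why ES is preferred as a coherent measure of tail risk; the copula models in the paper change only how the joint return scenarios are generated, not these quantile formulas.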
Ethnonursing and the ethnographic approach in nursing.
Molloy, Luke; Walker, Kim; Lakeman, Richard; Skinner, Isabelle
2015-11-01
To present a critical methodological review of the ethnonursing research method. Ethnonursing was developed to underpin the study and practice of transcultural nursing and to promote 'culturally congruent' care. Ethnonursing claims to produce accurate knowledge about cultural groups to guide nursing care. The idea that the nurse researcher can objectively and transparently represent culture still permeates the ethnonursing method and shapes attempts to advance nursing knowledge and improve patient care through transcultural nursing. Relevant literature published between the 19th and 21st centuries. Literature review. Ethnography saw a 'golden age' in the first half of the 20th century, but the foundations of traditional ethnographic knowledge are being increasingly questioned today. The authors argue that ethnonursing has failed to respond to contemporary issues relevant to ethnographic knowledge and that there is a need to refresh the method. This will allow nurse researchers to move beyond hitherto unproblematic notions of objectivity to recognise the intrinsic relationship between the nurse researcher and the researched. A revised ethnonursing research method would enable nurse researchers to create reflexive interpretations of culture that identify and embody their cultural assumptions and prejudices.
A Second Opinion: A Case Narrative on Clinical Ethics Mediation.
Weinstein, Michael S
2015-01-01
Contrasting traditional and common forms of ethics consultation with bioethics mediation, I describe the case of a "second opinion" consultation in the care of a patient with advanced cancer for whom treatment was futile. While the initial ethics consultation, performed by a colleague, led to a recommendation that some may deem ethical, the process failed to involve key stakeholders and failed to explore the underlying values and reasons for the opinions voiced by various stakeholders. The process of mediation ultimately led to creative solutions in which all stakeholders could reach consensus on a plan of care. Copyright 2015 The Journal of Clinical Ethics. All rights reserved.
Metal nanoplates: Smaller is weaker due to failure by elastic instability
NASA Astrophysics Data System (ADS)
Ho, Duc Tam; Kwon, Soon-Yong; Park, Harold S.; Kim, Sung Youb
2017-11-01
Under mechanical loading, crystalline solids deform elastically, and subsequently yield and fail via plastic deformation. Thus crystalline materials experience two mechanical regimes: elasticity and plasticity. Here, we provide numerical and theoretical evidence to show that metal nanoplates exhibit an intermediate mechanical regime that occurs between elasticity and plasticity, which we call the elastic instability regime. The elastic instability regime begins with a decrease in stress, during which the nanoplates fail via global, and not local, deformation mechanisms that are distinctly different from traditional dislocation-mediated plasticity. Because the nanoplates fail via elastic instability, the governing strength criterion is the ideal strength, rather than the yield strength, and as a result, we observe a unique "smaller is weaker" trend. We develop a simple surface-stress-based analytic model to predict the ideal strength of the metal nanoplates, which accurately reproduces the smaller is weaker behavior observed in the atomistic simulations.
A clinically guided approach for improving performance measurement for hypertension.
Steinman, Michael A; Lee, Sei J; Peterson, Carolyn A; Fung, Kathy Z; Goldstein, Mary K
2012-05-01
Performance measures often fail to account for legitimate reasons why patients do not achieve recommended treatment targets. We tested a novel performance measurement system for blood pressure (BP) control that was designed to mimic clinical reasoning. This clinically guided approach focuses on (1) exempting patients for whom tight BP control may not be appropriate or feasible and (2) assessing BP over time. Trained abstractors conducted structured chart reviews of 201 adults with hypertension in 2 VA health care systems. Results were compared with traditional methods of performance measurement. Among 201 veterans, 183 (91%) were male, and the mean age was 71±11 years. Using the clinically guided approach, 61 patients (30%) were exempted from performance measurement. The most common reasons for exemption were inadequate opportunity to manage BP (35 patients, 17%) and the use of 4 or more antihypertensive medications (19 patients, 9%). Among patients eligible for performance measurement, there was little agreement on the presence of controlled versus uncontrolled BP when comparing the most recent BP (the traditional approach) with an integrated assessment of BP control (κ = 0.14). After accounting for clinically guided exemptions and methods of BP assessment, only 15 of 72 patients (21%) whose last BP was ≥140/90 mm Hg were classified as problematic by the clinically guided approach. Many patients have legitimate reasons for not achieving tight BP control, and the methods used for BP assessment have marked effects on whether a patient is classified as having adequate or inadequate BP control.
McConnel, M B; Galligan, D T
2004-10-01
Optimization programs are currently used to aid in the selection of bulls to be used in herd breeding programs. While these programs offer a systematic approach to the problem of semen selection, they ignore the impact of volume discounts. Volume discounts are discounts that vary depending on the number of straws purchased. The dynamic nature of volume discounts means that, in order to be adequately accounted for, they must be considered in the optimization routine. Failing to do this creates a missed economic opportunity because the potential benefits of optimally selecting and combining breeding company discount opportunities are not captured. To address these issues, an integer program was created which used binary decision variables to incorporate the effects of quantity discounts into the optimization program. A consistent set of trait criteria was used to select a group of bulls from 3 sample breeding companies. Three different selection programs were used to select the bulls, 2 traditional methods and the integer method. After the discounts were applied using each method, the integer program resulted in the lowest cost portfolio of bulls. A sensitivity analysis showed that the integer program also resulted in a low cost portfolio when the genetic trait goals were changed to be more or less stringent. In the sample application, a net benefit of the new approach over the traditional approaches was a 12.3 to 20.0% savings in semen cost.
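The effect of volume discounts on the optimal mix can be illustrated with a small brute-force search. The bull names, prices, and discount tiers below are invented for illustration; the paper's actual formulation uses binary decision variables in an integer program rather than enumeration.

```python
# Toy semen-purchase optimization with volume discounts: the per-straw price
# drops once the order for a bull crosses a threshold, so the cheapest mix
# cannot be found by ranking bulls on base price alone.
from itertools import product

# bull -> (base price per straw, [(min straws for discount, discounted price)])
bulls = {
    "A": (20.0, [(50, 16.0)]),
    "B": (18.0, [(40, 15.0)]),
    "C": (25.0, [(30, 19.0)]),
}

def unit_price(base, tiers, n):
    price = base
    for threshold, discounted in tiers:
        if n >= threshold:
            price = discounted
    return price

def cheapest_mix(total_needed, step=10):
    """Enumerate straw allocations (multiples of `step`) hitting the target."""
    best = None
    counts = range(0, total_needed + 1, step)
    for combo in product(counts, repeat=len(bulls)):
        if sum(combo) != total_needed:
            continue
        cost = sum(n * unit_price(base, tiers, n)
                   for n, (base, tiers) in zip(combo, bulls.values()))
        if best is None or cost < best[0]:
            best = (cost, dict(zip(bulls, combo)))
    return best

cost, mix = cheapest_mix(100)
```

Brute force is fine at this toy scale; the integer-programming formulation the abstract describes is what makes the same discount logic tractable for realistic catalogs with many bulls and tiers.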
Addison, Paul S; Antunes, André; Montgomery, Dean; Borg, Ulf R
2017-08-01
Cerebral blood flow (CBF) is regulated over a range of systemic blood pressures by the cerebral autoregulation (CA) control mechanism. This range lies within the lower and upper limits of autoregulation (LLA, ULA), beyond which blood pressure drives CBF, and CA function is considered impaired. A standard method to determine autoregulation limits noninvasively using NIRS technology is via the COx measure: a moving correlation index between mean arterial pressure and regional oxygen saturation. In the intact region, there should be no correlation between these variables whereas in the impaired region, the correlation index should approximate unity. In practice, however, the data may be noisy and/or the intact region may often exhibit a slightly positive relationship. This positive relationship may render traditional autoregulation limit calculations difficult to perform, resulting in the need for manual interpretation of the data using arbitrary thresholds. Further, the underlying mathematics of the technique are asymmetric in terms of the results produced for impaired and intact regions and are, in fact, not computable for the ideal case within the intact region. In this work, we propose a novel gradient adjustment method (GACOx) to enhance the differences in COx values observed in the intact and impaired regions. Results from a porcine model (N = 8) are used to demonstrate that GACOx is successful in determining LLA values where traditional methods fail. It is shown that the derived GACOx indices exhibit a mean difference between the intact/impaired regions of 1.54 ± 0.26 (mean ± SD), compared to 0.14 ± 0.10 for the traditional COx method. The GACOx effectively polarizes the COx data in order to better differentiate the intact and impaired zones and, in doing so, makes the determination of the LLA and ULA points a simpler and more consistent task. The method lends itself to the automation of the robust determination of autoregulation zone limits.
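A COx-style index, the moving-window correlation between mean arterial pressure (MAP) and regional oxygen saturation (rSO2) described above, can be sketched as follows. The window length and signals are illustrative, not the study's recording protocol or the GACOx adjustment itself.

```python
# COx sketch: Pearson correlation of MAP vs. rSO2 in a sliding window.
# Values near +1 suggest pressure-passive (impaired) flow; near 0, intact
# autoregulation.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5 if vx and vy else 0.0

def cox_index(map_vals, rso2_vals, window=10):
    return [pearson(map_vals[i:i + window], rso2_vals[i:i + window])
            for i in range(len(map_vals) - window + 1)]

# Below an autoregulation limit, rSO2 tracks MAP almost linearly, so the
# index saturates near 1 in every window.
map_vals = list(range(40, 70))
rso2_vals = [0.5 * m + 20 for m in map_vals]
impaired = cox_index(map_vals, rso2_vals)
```

The practical difficulty the abstract raises is visible in this formulation: real intact-region data often shows a slight positive slope plus noise, so the index does not cleanly separate from the impaired region, which is the gap the gradient adjustment is meant to close.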
ERIC Educational Resources Information Center
Kloser, Matthew
2013-01-01
Texts play an integral role in science research and science classrooms yet biology textbooks have traditionally failed to reflect the epistemic elements of the discipline such as justification of claims and visual representations of empirical data. This study investigates high school biology students' reading experiences when engaging more…
Focus on Teacher Salaries: What Teacher Salary Averages Don't Show.
ERIC Educational Resources Information Center
Gaines, Gale
Traditional comparisons of teacher salary averages fail to consider factors beyond pay raises that affect those averages. Salary averages do not show: regional and national variations among states' average salaries; the variation of salaries within an individual state; variations in the cost of living; the highest degree earned by teachers and the…
ERIC Educational Resources Information Center
Tien, Flora F.; Blackburn, Robert T.
1996-01-01
A study explored the relationship between the traditional system of college faculty rank and faculty research productivity from the perspectives of behavioral reinforcement theory and selection function. Six hypotheses were generated and tested, using data from a 1989 national faculty survey. Results failed to support completely either the…
Authority as a Process: Issues of Gender in the Classroom.
ERIC Educational Resources Information Center
Kasik, Dot Radius
Traditionally, it has been off-limits to talk about failures attributable to the ideology of democratic, student-centered classrooms. However, these classrooms fail for a particular kind of student--the male student who defines himself with a quasi-religious, extremely religious or political stance and who demonstrates strident rejection of…
Social Relations in Childhood and Adolescence: The Convoy Model Perspective
ERIC Educational Resources Information Center
Levitt, Mary J.
2005-01-01
Research on the development of social relations has been largely fragmented along role-specific lines and dominated conceptually by attachment theory. The Convoy Model is presented as an alternative to traditional approaches that fail to capture the complexity of social relationships across time and context. Research based on the model converges…
ERIC Educational Resources Information Center
Schure, Alexander
University administrators must not fail to consider the increasingly sophisticated library technology when making administrative and budgetary decisions about college libraries. The declining traditional student enrollment combined with an expansion of continuing education means that the role of the central campus library must be reconsidered. The…
ERIC Educational Resources Information Center
Williams, Stewart
2008-01-01
Recent disasters have been of such scale and complexity that both the common assumptions made about learning from them, and the traditional approaches distinguishing natural from technological disasters (and now terrorism) are thus challenged. Beck's risk thesis likewise signals the need for a paradigmatic change. Despite sociological inflections…
A Grand Bargain on Quality Assurance?
ERIC Educational Resources Information Center
Taylor, Teresa
2017-01-01
The higher education community faces a wave of challenges. Concerns about affordability and costs have led to questions about the value of higher education to students and society. State and federal funding has been cut or has failed to grow as need has ballooned. Pressures to innovate clash with calls for preserving tradition. Values of academic…
The Silent Revolution in Higher Education
ERIC Educational Resources Information Center
London, Herbert; Draper, Mark
2008-01-01
Higher education today fails to exploit the power of emergent educational technology. If it did, the authors contend that "everyone on the planet would already have access to a top-quality college education for pennies a day acquired in less than half the traditional four years." The authors envision a college education that replaces the lecture…
In Search of Meaning: Values in Modern Clinical Behavior Analysis
ERIC Educational Resources Information Center
Plumb, Jennifer C.; Stewart, Ian; Dahl, JoAnne; Lundgren, Tobias
2009-01-01
Skinner described behavior analysis as the field of values and purpose. However, he defined these concepts in terms of a history of reinforcement and failed to specify whether and how human and nonhuman values might differ. Human values have been seen as theoretically central within a number of nonbehavioral traditions in psychology, including…
Scholarly Publishing in the Electronic Age: A Graduate Student's Perspective.
ERIC Educational Resources Information Center
Thompsen, Philip A.
The issue of whether the nature of scholarship is being changed by electronic publishing was made clear to a graduate student when his telecommunication link to the world (his modem) failed. While the newer forms of academic communication offer impressive advantages over traditional publishing, scholars still feel compelled to retain somehow the…
Teachers' Perspectives of Participation in an International Immersion Experience
ERIC Educational Resources Information Center
Dalton, Kelly Mcgrath
2017-01-01
The urgent call to internationalize teacher education in response to the impact globalization presents in our nation's classrooms, also calls for a fundamental shift in how the field of teacher education provides opportunities of professional learning for teachers. Traditional models of teacher education often fail to develop teachers with the…
Washington Has Failed the Workhorses of American Higher Education
ERIC Educational Resources Information Center
Jones, Diane Auer
2009-01-01
Americans depend largely on their community colleges to advance a form of democratic meritocracy in which all people--from dual-enrolled high-school and home-schooled students to traditional 18-year-old students to forty-something career changers, to retirees and octogenarians--have the opportunity to learn, grow, and excel. Yet despite the vital…
Music and Music Education: Theory and Praxis for "Making a Difference"
ERIC Educational Resources Information Center
Regelski, Thomas A.
2005-01-01
The "music appreciation as contemplation" paradigm of traditional aesthetics and music education assumes that music exists to be contemplated for itself. The resulting distantiation of music and music education from life creates a legitimation crisis for music education. Failing to make a noteworthy musical difference for society, a politics of…
Improved Second Derivative Test for Relative Extrema
ERIC Educational Resources Information Center
Wu, Yan
2007-01-01
In this note, a modified Second Derivative Test is introduced for the relative extrema of a single variable function. This improved test overcomes the difficulty of the second derivative vanishing at the critical point, while in contrast the traditional test fails for this case. A proof for this improved Second Derivative Test is presented,…
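The standard remedy when the second derivative vanishes is the higher-order derivative test: walk up the derivatives at the critical point until one is nonzero; an even order indicates an extremum (minimum if positive, maximum if negative), an odd order indicates none. A minimal sketch for polynomials, assuming this classical formulation (the note's improved test may differ in detail):

```python
def deriv(coeffs):
    """Differentiate a polynomial given as coefficients (coeffs[i] multiplies x**i)."""
    return [i * c for i, c in enumerate(coeffs)][1:]

def eval_poly(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))

def classify_critical_point(coeffs, c):
    """Higher-order derivative test at a critical point c: find the first
    nonvanishing derivative there; even order means an extremum, odd order
    means none (the traditional test is the n == 2 case)."""
    p, n = deriv(deriv(coeffs)), 2
    while eval_poly(p, c) == 0:
        p, n = deriv(p), n + 1
        if not p:
            return "degenerate"  # all higher derivatives vanish at c
    if n % 2 == 1:
        return "no extremum (inflection)"
    return "minimum" if eval_poly(p, c) > 0 else "maximum"

# f(x) = x**4: f''(0) = 0, so the traditional test is inconclusive,
# but the first nonvanishing derivative is f''''(0) = 24 > 0
print(classify_critical_point([0, 0, 0, 0, 1], 0))  # → minimum
# f(x) = x**3: first nonvanishing derivative at 0 has odd order 3
print(classify_critical_point([0, 0, 0, 1], 0))     # → no extremum (inflection)
```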
Orally inoculated Salmonella typhimurium is detected in the lymph nodes and synovial fluid of swine
USDA-ARS?s Scientific Manuscript database
Salmonella is a foodborne pathogen that has been associated with illnesses from the consumption of meat products. Traditional carcass sampling techniques fail to account for contamination via atypical carcass reservoirs such as lymph nodes and synovial fluid that may harbor Salmonella. In this two-p...
Politics, Markets, and America's Schools.
ERIC Educational Resources Information Center
Chubb, John E.; Moe, Terry M.
The effect of institutions on school effectiveness is explored in this book, which argues that school reforms in the United States are destined to fail because of the failure to address the root of the problem, which is found in the institutions of direct democratic control by which schools have traditionally been governed. Methodology involved…
The Effects of Cooperative Learning on Student Achievement in Algebra I
ERIC Educational Resources Information Center
Brandy, Travis D.
2013-01-01
It is a well-documented finding that high school students in schools across the nation, including California, fail to achieve at the proficient level in mathematics, based on standardized test scores. The purpose of this research study was to compare the findings of students taught using traditional instructional methodologies versus cooperative…
Building a Village through Data: A Research-Practice Partnership to Improve Youth Outcomes
ERIC Educational Resources Information Center
Biag, Manuelito
2017-01-01
There is growing recognition that the traditional research paradigm fails to address the needs of school practitioners. As such, more collaborative and participatory approaches are being encouraged. Yet few articles examine the structures, processes, and dynamics of research-practice partnerships. To address this gap, this essay analyzes a…
Vocational Preparation for Women: A Critical Analysis.
ERIC Educational Resources Information Center
Steiger, JoAnn
In this analysis of vocational preparation for women material is presented to substantiate the claim that women are joining the labor force in increasing numbers and their career opportunities are expanding, but that the educational system has failed to respond. Statistical data is cited showing that women have traditionally been employed in just…
End-of-life decisions and the reinvented Rule of Double Effect: a critical analysis.
Lindblad, Anna; Lynöe, Niels; Juth, Niklas
2014-09-01
The Rule of Double Effect (RDE) holds that it may be permissible to harm an individual while acting for the sake of a proportionate good, given that the harm is not an intended means to the good but merely a foreseen side-effect. Although frequently used in medical ethical reasoning, the rule has been repeatedly questioned in the past few decades. However, Daniel Sulmasy, a proponent who has done a lot of work lately defending the RDE, has recently presented a reformulated and more detailed version of the rule. Thanks to its greater precision, this reinvented RDE avoids several problems thought to plague the traditional RDE. Although an improvement compared with the traditional version, we argue that Sulmasy's reinvented RDE will not stand closer scrutiny. Not only has the range of proper applicability narrowed significantly, but, more importantly, Sulmasy fails to establish that there is a morally relevant distinction between intended and foreseen effects. In particular, he fails to establish that there is any distinction that can account for the alleged moral difference between sedation therapy and euthanasia. © 2012 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salvo, J.J.; Ho, S.V.; Shoemaker, S.H.
Remediating soils and groundwater that have been contaminated with chlorinated solvents is a significant challenge for current environmental technology. Soils with a high proportion of fine silts and clays have been especially recalcitrant due to their low permeability. Recently, electrokinetics has shown great promise in gaining access to these contaminated zones that fail to yield with traditional pumping methods. An integrated approach using electrokinetics combined with in situ capture and destruction zones (LASAGNA™) is being developed and field tested by Monsanto, DuPont and GE under the auspices of the EPA's Remediation Technology Development Forum and with financial support from the Department of Energy. To speed implementation and encourage partnering, royalty-free cross-licensing of the developed technology is available to consortium members for use on their sites.
Code of Federal Regulations, 2011 CFR
2011-01-01
§ 360.8 Method for determining deposit and other liability account balances at a failed insured depository institution (Banks and Banking; RECEIVERSHIP RULES). Sets out the method the FDIC will use to determine deposit and other liability account balances for insurance coverage at a failed insured depository institution.
Code of Federal Regulations, 2010 CFR
2010-01-01
§ 360.8 Method for determining deposit and other liability account balances at a failed insured depository institution (Banks and Banking; RECEIVERSHIP RULES). Sets out the method the FDIC will use to determine deposit and other liability account balances for insurance coverage at a failed insured depository institution.
Darwin: German mystic or French rationalist?
Ghiselin, Michael T
2015-01-01
The notion that Charles Darwin embraced the German Romantic tradition seems plausible, given the early influence of Alexander von Humboldt. But this view fails to do justice to other scientific traditions. Darwin was a protégé of the Englishman John Stevens Henslow and was a follower of the Scot Charles Lyell. He had important debts to French scientists, notably Henri Milne-Edwards, Etienne and Isidore Geoffroy Saint-Hilaire, and Alphonse de Candolle. Many Germans were quite supportive of Darwin, but not all of these were encumbered by idealistic metaphysical baggage. Both Darwin and Anton Dohrn treated science as very much a cosmopolitan enterprise.
Evaluation of wound healing property of Caesalpinia mimosoides Lam.
Bhat, Pradeep Bhaskar; Hegde, Shruti; Upadhya, Vinayak; Hegde, Ganesh R; Habbu, Prasanna V; Mulgund, Gangadhar S
2016-12-04
Caesalpinia mimosoides Lam. is one of the important traditional folk medicinal plants in the treatment of skin diseases and wounds used by healers of Uttara Kannada district of Karnataka state (India). However, scientific validation of documented traditional knowledge related to medicinal plants is an important step toward meeting the increasing demand for herbal medicine. The study was carried out to evaluate the claimed uses of Caesalpinia mimosoides using antimicrobial, wound healing and antioxidant activities followed by detection of possible active bio-constituents. Extracts prepared by hot percolation method were subjected to preliminary phytochemical analysis followed by antimicrobial activity using MIC assay. In vivo wound healing activity was evaluated by circular excision and linear incision wound models. The extract with significant antimicrobial and wound healing activity was investigated for antioxidant capacity using DPPH, nitric oxide, antilipid peroxidation and total antioxidant activity methods. Total phenolic and flavonoid contents were also determined by Folin-Ciocalteu, Swain and Hillis methods. Possible bio-active constituents were identified by GC-MS technique. RP-UFLC-DAD analysis was carried out to quantify ethyl gallate and gallic acid in the plant extract. Preliminary phytochemical analysis showed positive results for ethanol and aqueous extracts for all the chemical constituents. The ethanol extract showed potent antimicrobial activity against both bacterial and fungal skin pathogens compared to other extracts. The efficacy of topical application of the potent ethanol extract and traditionally used aqueous extracts was evidenced by the complete re-epithelization of the epidermal layer with increased percentage of wound contraction in a shorter period. However, the aqueous extract failed to produce a consistent effect in the histopathological assessment.
Ethanol extract showed effective scavenging activity against DPPH and nitric oxide free radicals with an expressive amount of phenolic and moderate concentration of flavonoid contents. Ethyl gallate and gallic acid were found to be the probable bio-active compounds evidenced by GCMS and RP-UFLC-DAD analysis. The study revealed the significant antimicrobial, wound healing and antioxidant activities of tender parts of C. mimosoides and proved the traditional folklore knowledge. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Zhang, Yifan; Gao, Xunzhang; Peng, Xuan; Ye, Jiaqi; Li, Xiang
2018-05-16
High Resolution Range Profile (HRRP) recognition has attracted considerable attention in the field of Radar Automatic Target Recognition (RATR). However, traditional HRRP recognition methods fail to model high-dimensional sequential data efficiently and have poor anti-noise ability. To deal with these problems, a novel stochastic neural network model named Attention-based Recurrent Temporal Restricted Boltzmann Machine (ARTRBM) is proposed in this paper. RTRBM is utilized to extract discriminative features and the attention mechanism is adopted to select major features. RTRBM is efficient at modeling high-dimensional HRRP sequences because it can extract the temporal and spatial correlation between adjacent HRRPs. The attention mechanism, used in sequential data recognition tasks including machine translation and relation classification, makes the model pay more attention to the major features for recognition. Therefore, the combination of RTRBM and the attention mechanism makes our model effective at extracting more internally related features and choosing the important parts of the extracted features. Additionally, the model performs well with noise-corrupted HRRP data. Experimental results on the Moving and Stationary Target Acquisition and Recognition (MSTAR) dataset show that our proposed model outperforms other traditional methods, which indicates that ARTRBM extracts, selects, and utilizes the correlation information between adjacent HRRPs effectively and is suitable for high-dimensional or noise-corrupted data.
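The "select major features" idea can be illustrated with a toy dot-product attention pool over a sequence of feature vectors. This is a generic attention sketch under assumed inputs, not the ARTRBM architecture; the function name and data are hypothetical:

```python
import math

def attention_pool(features, query):
    """Minimal dot-product attention over a sequence of feature vectors:
    score each timestep against a query, softmax-normalize the scores,
    and return the weights plus the weighted sum of the features."""
    scores = [sum(f * q for f, q in zip(feat, query)) for feat in features]
    m = max(scores)                                  # stabilize the softmax
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    pooled = [sum(w * feat[i] for w, feat in zip(weights, features))
              for i in range(len(features[0]))]
    return weights, pooled

# Three timestep feature vectors; the query aligns with the second one,
# so attention concentrates its weight there
features = [[1.0, 0.0], [0.0, 4.0], [0.0, 1.0]]
weights, pooled = attention_pool(features, query=[0.0, 1.0])
print([round(w, 3) for w in weights])
```

The softmax makes the pooled vector dominated by whichever timesteps score highest against the query, which is the mechanism the abstract credits for emphasizing major features.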
Teschke, Rolf; Larrey, Dominique; Melchart, Dieter; Danan, Gaby
2016-07-19
Background: Traditional Chinese Medicine (TCM) with its focus on herbal use is popular and appreciated worldwide with an increasing tendency, although its therapeutic efficacy is poorly established for most herbal TCM products. Treatment was perceived as fairly safe, but discussions emerged more recently as to whether herb induced liver injury (HILI) from herbal TCM is a major issue. Methods: To analyze clinical and case characteristics of HILI caused by herbal TCM, we undertook a selective literature search in the PubMed database with the search items Traditional Chinese Medicine, TCM, alone and combined with the terms herbal hepatotoxicity or herb induced liver injury. Results: HILI caused by herbal TCM is rare and, similarly to drugs, can be caused by an unpredictable idiosyncratic or a predictable intrinsic reaction. Clinical features of liver injury from herbal TCM products are variable, and specific diagnostic biomarkers such as microsomal epoxide hydrolase, pyrrole-protein adducts, metabolomics, and microRNAs are available for only a few TCM herbs. The diagnosis is ascertained if alternative causes are validly excluded and causality levels of probable or highly probable are achieved applying the liver specific RUCAM (Roussel Uclaf Causality Assessment Method) as the most commonly used diagnostic tool worldwide. Case evaluation may be confounded by inappropriate or lacking causality assessment, poor herbal product quality, insufficiently documented cases, and failure to exclude alternative causes such as infections by hepatotropic viruses including hepatitis E virus. Conclusion: Suspected cases of liver injury from herbal TCM represent major challenges that deserve special clinical and regulatory attention to improve the quality of case evaluations and ascertain patients' safety and benefit.
Interactive Supercomputing’s Star-P Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edelman, Alan; Husbands, Parry; Leibman, Steve
2006-09-19
The thesis of this extended abstract is simple. High productivity comes from high level infrastructures. To measure this, we introduce a methodology that goes beyond the tradition of timing software in serial and tuned parallel modes. We perform a classroom productivity study involving 29 students who have written a homework exercise in a low level language (MPI message passing) and a high level language (Star-P with MATLAB client). Our conclusions indicate what perhaps should be of little surprise: (1) the high level language is always far easier on the students than the low level language. (2) The early versions of the high level language perform inadequately compared to the tuned low level language, but later versions substantially catch up. Asymptotically, the analogy must hold that message passing is to high level language parallel programming as assembler is to high level environments such as MATLAB, Mathematica, Maple, or even Python. We follow the Kepner method that correctly realizes that traditional speedup numbers without some discussion of the human cost of reaching these numbers can fail to reflect the true human productivity cost of high performance computing. Traditional data compares low level message passing with serial computation. With the benefit of a high level language system in place, in our case Star-P running with MATLAB client, and with the benefit of a large data pool: 29 students, each running the same code ten times on three evolutions of the same platform, we can methodically demonstrate the productivity gains. To date we are not aware of any high level system as extensive and interoperable as Star-P, nor are we aware of an experiment of this kind performed with this volume of data.
Hwang, Jessica P.; Roundtree, Aimee K.; Suarez-Almazor, Maria E.
2017-01-01
Objectives: We explored attitudes about prevention, screening and treatment of hepatitis B virus (HBV) infection in Chinese, Korean and Vietnamese communities. Methods: We used qualitative methods in 12 focus groups (n=113) of adults who self-reported their ethnicity to be Chinese, Korean, or Vietnamese. We used grounded theory (i.e., consensus-building between co-coders about recurring, emerging themes) for analysis. Results: Diet, nutrition, fatigue and stress were misidentified as HBV causes. Improving hygiene, diet, exercise, and holistic methods were misidentified as viable HBV prevention methods. Common screening problems included not affording the test and not understanding test results. Participants shared reasons for using complementary and alternative medicine: when Western medicine fails or becomes unaffordable. Participants sought information from medical providers and fellow community members, but also from the internet. Conclusions: Many of the attitudes and opinions that emerged may deter participation in HBV screening, prevention and treatment, insofar as community members may factor them into healthcare decision-making, choose alternative but ineffective methods of prevention and treatment, and undervalue the benefits of screening. More patient education in both traditional and new media is necessary for clarifying transmission, screening and treatment misunderstandings. PMID:22302653
Deveci, Canan Dura; Demir, Berfu; Sengul, Ozlem; Dilbaz, Berna; Goktolga, Umit
2015-01-01
To evaluate the efficacy of the stair-step protocol using clomiphene citrate (CC) and to assess the uterine and systemic side effects in patients with polycystic ovary syndrome (PCOS). A total of 60 PCOS patients who failed to respond to 50 mg/day for 5 days of CC treatment within the cycle were randomly allocated to the control (traditional protocol) and study (stair-step protocol) groups. In the stair-step protocol, patients were treated with CC 50 mg/day for 5 days and then, in nonresponsive patients, the dosage was increased to 100 mg/day for 5 days in the same cycle. Patients who failed the 50 mg/day CC treatment in the previous cycle were stimulated with 100 mg/day CC and were accepted as the control group. Ovulation and pregnancy rates, duration of treatment, and uterine and systemic side effects were evaluated. Ovulation rates were similar between the stair-step and control groups (43.3 vs. 33.3%, respectively), as were pregnancy rates (16.7 vs. 10%, respectively). The duration of treatment was significantly shorter with the stair-step than with the traditional protocol (20.5 ± 2.0 vs. 48.6 ± 2.4 days, respectively). There were no significant differences in systemic side effects between the groups. Uterine side effects were evaluated with endometrial thickness and uterine artery Doppler ultrasound; no significant differences were observed in the stair-step compared to the traditional protocol. The stair-step protocol was determined to have a significantly shorter treatment period without any detrimental effect on ovulation and pregnancy rates.
Why Community Oriented Policing Has Failed and the Rise of Policing through Practical Partnerships
ERIC Educational Resources Information Center
Plummer, Eric S.
2008-01-01
According to the U.S. Department of Justice (www.usdoj.gov), Community Policing is defined as: "The focus on crime and social disorder through the delivery of police services that includes aspects of traditional law enforcement, as well as prevention, problem-solving, community engagement, and partnerships. The community policing model balances…
ERIC Educational Resources Information Center
Gerjets, Peter; Scheiter, Katharina; Schuh, Julia
2008-01-01
Global comparisons of learning from hypertext/hypermedia and traditional presentation formats like text have so far failed to show major advantages concerning the effectiveness of hypermedia learning. Thus, it is proposed in the current paper to evaluate hypermedia environments more specifically with regard to their potential to implement and support…
Tying Academics to Co-Curriculars Can Teach At-Riskers Responsibility
ERIC Educational Resources Information Center
Kozik, Peter L.; Cowles, Richard C.; Sweet, Dale J.
2005-01-01
In this paper, the authors discuss the new academic eligibility policy for at-risk students of Cato-Meridian, a small rural school district in New York State. The traditional eligibility policy that was in place at Cato-Meridian and surrounding school districts required student athletes to pass all but two of their courses. If they failed two…
Family Literacy in Cultural Context: Lessons from Two Case Studies.
ERIC Educational Resources Information Center
Puchner, Laurel D.
A study examined the literature to determine the veracity of the criticism of some educators who say that family literacy programs in the United States fail to take into account important cultural issues when dealing with certain target groups. Issues invoked included the need to take traditional cultural values and practices into account in…
Lifelong Learning and the Pursuit of a Vision for Sustainable Development in Botswana
ERIC Educational Resources Information Center
Maruatona, Tonic
2011-01-01
This paper analyses Botswana's commitment to lifelong learning policy and discusses how it can help the state achieve its vision for sustainable development. First, it argues that while Botswana is renowned for its economic success, it still fails to address positively such traditional challenges as poverty, unemployment and income inequality,…
Implementing a Social Enterprise Intervention with Homeless, Street-Living Youths in Los Angeles
ERIC Educational Resources Information Center
Ferguson, Kristin M.
2007-01-01
Homeless, street-dwelling youths are an at-risk population who often use survival behaviors to meet their basic needs. The traditional outreach approach brings services into the streets, yet does not adequately replace the youths' high-risk behaviors. Similarly, job training programs often fail to address the mental health issues that constitute…
The Assessment of Young Children through the Lens of Universal Design for Learning (UDL)
ERIC Educational Resources Information Center
Dalton, Elizabeth M.; Brand, Susan Trostle
2012-01-01
Early Childhood Education (ECE) describes the education of young children from birth through age 8. ECE reports have concluded that traditional approaches to curriculum, such as those emphasizing drill and practice of isolated, academic skills, are not in line with current knowledge of human learning and neuropsychology. These approaches fail to…
The Unexpected Minority: Handicapped Children in America.
ERIC Educational Resources Information Center
Gliedman, John; Roth, William
The book takes a civil rights perspective to the problems of handicapped children and adults and points out that no other minority group has its social and political oppression so thoroughly masked as the disabled in America. Part I looks at why American society has traditionally failed to view the handicapped as an oppressed social group. Three…
School Counselors' Role in Dropout Prevention and Credit Recovery
ERIC Educational Resources Information Center
Tromski-Klingshirn, Donna; Miura, Yoko
2017-01-01
This article introduces credit recovery (CR) programs to school counseling. Traditionally, the school counselors' role in CR has been limited to referring students who are failing, or who have failed, courses. Based on our own findings from a study of a large Midwest high school (N = 2,000) CR program, we make specific recommendations for school counselors…
ERIC Educational Resources Information Center
Afterschool Alliance, 2009
2009-01-01
Afterschool provides older youth with critical academic supports including credit attainment and recovery opportunities. Many educators are turning to afterschool programs to reach students who fail one or more courses, become disengaged, or want alternatives to the traditional path to graduation. Credit recovery refers to recovering credits that…
ERIC Educational Resources Information Center
Ross, Michael J.
2013-01-01
Science education in the U.S. has failed for over a century to bring the experience of scientific induction to classrooms, from elementary science to undergraduate courses. The achievement of American students on international comparisons of science proficiency is unacceptable, and the disparities between groups underrepresented in STEM and others…
ERIC Educational Resources Information Center
McNae, Rachel
2015-01-01
Traditional approaches to leadership development frequently draw on Eurocentric, patriarchal discourses located in frameworks aligned to adult learning that may not be culturally or contextually relevant, or fail to pay attention to the needs of young women leading within and beyond their school communities. This research engaged an alternative…
Maritime Tactile Education for Urban Secondary Education Students
ERIC Educational Resources Information Center
Sulzer, Arthur Henry, IV
2012-01-01
Urban high-school students' low average level of academic achievement is a national problem. A lack of academic progress is a factor that contributes to students failing to graduate. In response to these urban high school student problems, a growing number of urban charter high schools have opened as an alternative to the traditional public high…
If Life Happened but a Degree Didn't: Examining Factors That Impact Adult Student Persistence
ERIC Educational Resources Information Center
Bergman, Mathew; Gross, Jacob P. K.; Berry, Matt; Shuck, Brad
2014-01-01
Roughly half of all undergraduate students in the United States fail to persist to degree completion (American College Testing [ACT], 2010; Tinto, 1993; U.S. Department of Education, National Center for Education Statistics, 2013). Adult students often have higher levels of attrition than traditional-age students (Justice & Dornan, 2001;…
ERIC Educational Resources Information Center
Schroeder, Connie
2010-01-01
Recognizing that a necessary and significant role change is underway in faculty development, this book calls for centers to merge their traditional responsibilities and services with a leadership role as organizational developers. Failing to define and outline the dimensions and expertise of this new role puts centers at risk of not only…
ERIC Educational Resources Information Center
Cunningham, Robert F.; Rappa, Anthony
2016-01-01
Surveys were used to examine mathematics teachers (15) on their ability to solve similarity problems and on their likely implementation of lesson objectives for teaching similarity. All correctly solved a similarity problem requiring a traditional static perspective, but 7 out of 15 failed to correctly solve a problem that required a more…
A Comparison of Two Flashcard Interventions for Teaching Sight Words to Early Readers
ERIC Educational Resources Information Center
January, Stacy-Ann A.; Lovelace, Mary E.; Foster, Tori E.; Ardoin, Scott P.
2017-01-01
Strategic Incremental Rehearsal (SIR) is a recently developed flashcard intervention that blends Traditional Drill with Incremental Rehearsal (IR) for teaching sight words. The initial study evaluating SIR found it was more effective than IR for teaching sight words to first-grade students. However, that study failed to assess efficiency, which is…
USDA-ARS?s Scientific Manuscript database
Traditional, hypothesis-oriented research approaches have thus far failed to generate sufficient evidence to achieve consensus about the management of children with many endocrine disorders, partly because of the rarity of these disorders and because of regulatory burdens unique to research in child...
How Online Schools Serve and Fail to Serve At-Risk Students
ERIC Educational Resources Information Center
Figueiredo-Brown, Regina
2013-01-01
Purpose: Online schools were initially designed to provide access to diverse courses for advanced and homeschooled students; however, many online schools now market their programs specifically to students whose needs place them at risk in traditional schools. The capacity of technology to address any of the needs of under-served students is largely…
Search Strategy Instruction: Shifting from Baby Bird Syndrome to Curious Cat Critical Thinking
ERIC Educational Resources Information Center
Cheby, Lisa
2016-01-01
The traditional way of teaching research often lacks actual information-literacy instruction and, thus, fails to teach students how to be independent researchers. Teachers may help students regain curiosity by guiding them to shift their idea of research from a fact-finding and presentation exercise to a process of inquiry that includes gathering…
Trapped in a Local History: Why Did Extramural Fail to Engage in the Era of Engagement?
ERIC Educational Resources Information Center
Duke, Chris
2008-01-01
Extramural liberal adult education (LAE), as conceived in the particular UK tradition, was doomed by its high-minded origins and its privileged status, and contributed little to the new concepts of "éducation permanente," lifelong learning, the knowledge society, the learning society and region, or to the new understandings of university…
Stewart, Derek; Smith, Susan M.; Gallagher, Paul; Cousins, Gráinne
2017-01-01
Antihypertensive medication nonadherence is highly prevalent, leading to uncontrolled blood pressure. Methods that facilitate the targeting and tailoring of adherence interventions in clinical settings are required. Group‐Based Trajectory Modeling (GBTM) is a newer method to evaluate adherence using pharmacy dispensing (refill) data that has advantages over traditional refill adherence metrics (e.g. Proportion of Days Covered) by identifying groups of patients who may benefit from adherence interventions, and identifying patterns of adherence behavior over time that may facilitate tailoring of an adherence intervention. We evaluated adherence to antihypertensive medication in 905 patients over a 12‐month period in a community pharmacy setting using GBTM, identifying three subgroups of adherence patterns: 52.8%, 40.7%, and 6.5% had very high, high, and low adherence, respectively. However, GBTM failed to demonstrate predictive validity with blood pressure at 12 months. Further research on the validity of adherence measures that facilitate interventions in clinical settings is required. PMID:28875569
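The traditional refill metric the abstract contrasts with GBTM, Proportion of Days Covered, can be computed directly from dispensing records. A minimal sketch; the record layout and dates below are illustrative assumptions, not data from the study:

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, start, end):
    """Traditional refill-adherence metric: the fraction of days in the
    observation window covered by dispensed supply.  `fills` is a list of
    (dispense_date, days_supply) tuples; overlapping fills are capped at
    one covered day per calendar day."""
    window = (end - start).days + 1
    covered = set()
    for dispensed, days_supply in fills:
        for k in range(days_supply):
            day = dispensed + timedelta(days=k)
            if start <= day <= end:
                covered.add(day)
    return len(covered) / window

# Three 30-day fills over a 90-day window, with a gap after the second fill
fills = [(date(2023, 1, 1), 30), (date(2023, 1, 31), 30), (date(2023, 3, 12), 30)]
pdc = proportion_of_days_covered(fills, date(2023, 1, 1), date(2023, 3, 31))
print(round(pdc, 2))  # → 0.89
```

Unlike GBTM, this collapses the whole period into a single number, which is exactly why it cannot distinguish patterns of adherence behavior over time.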
Bayesian Revision of Residual Detection Power
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2013-01-01
This paper addresses some issues with quality assessment and quality assurance in response surface modeling experiments executed in wind tunnels. The role of data volume in quality assurance for response surface models is reviewed. Specific wind tunnel response surface modeling experiments are considered for which apparent discrepancies exist between fit quality expectations based on implemented quality assurance tactics and the actual fit quality achieved in those experiments. These discrepancies are resolved by using Bayesian inference to account for certain imperfections in the assessment methodology. Estimates of the fraction of out-of-tolerance model predictions based on traditional frequentist methods are revised to account for uncertainty in the residual assessment process. The number of sites in the design space for which residuals are out of tolerance is seen to exceed the number of sites where the model actually fails to fit the data. A method is presented to estimate how much of the design space is inadequately modeled by low-order polynomial approximations to the true but unknown underlying response function.
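The Bayesian revision described above can be illustrated with a simple calculation. If the residual assessment flags out-of-tolerance sites with imperfect sensitivity and a nonzero false-positive rate, the observed flagged fraction overstates the true fraction of inadequately modeled sites. The sketch below inverts that relationship; the sensitivity and false-positive values are invented for illustration, not taken from the paper.

```python
def true_oot_fraction(f_obs, sensitivity, false_pos_rate):
    """Invert f_obs = sens*f + fpr*(1 - f) for the true out-of-tolerance
    fraction f, given an imperfect residual assessment."""
    f = (f_obs - false_pos_rate) / (sensitivity - false_pos_rate)
    return min(max(f, 0.0), 1.0)  # clamp to the valid range [0, 1]

# Example: 12% of sites flagged, assuming 95% sensitivity and a 5%
# false-positive rate (illustrative numbers only).
f = true_oot_fraction(0.12, sensitivity=0.95, false_pos_rate=0.05)
```

With these assumed rates the inferred true out-of-tolerance fraction is below the raw 12% flagged fraction, consistent with the abstract's observation that flagged sites outnumber genuine model failures.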
Neural-net-based image matching
NASA Astrophysics Data System (ADS)
Jerebko, Anna K.; Barabanov, Nikita E.; Luciv, Vadim R.; Allinson, Nigel M.
2000-04-01
The paper describes a neural-based method for matching spatially distorted image sets. The matching of partially overlapping images is important in many applications: integrating information from images formed from different spectral ranges, detecting changes in a scene, and identifying objects of differing orientations and sizes. Our approach consists of extracting contour features from both images, describing the contour curves as sets of line segments, comparing these sets, determining the corresponding curves and their common reference points, and calculating the image-to-image co-ordinate transformation parameters on the basis of the most successful variant of the derived curve relationships. The main steps are performed by custom neural networks. The algorithms described in this paper have been successfully tested on a large set of images of the same terrain taken in different spectral ranges, at different seasons and rotated by various angles. In general, this experimental verification indicates that the proposed method for image fusion allows the robust detection of similar objects in noisy, distorted scenes where traditional approaches often fail.
Acoustic emission source localization based on distance domain signal representation
NASA Astrophysics Data System (ADS)
Gawronski, M.; Grabowski, K.; Russek, P.; Staszewski, W. J.; Uhl, T.; Packo, P.
2016-04-01
Acoustic emission is a vital non-destructive testing technique and is widely used in industry for damage detection, localisation and characterisation. The latter two aspects are particularly challenging, as AE data are typically noisy. What is more, elastic waves generated by an AE event propagate through a structural path and are significantly distorted. This effect is particularly prominent for thin elastic plates, in which the dispersion phenomenon results in severe localisation and characterisation issues. Traditional Time Difference of Arrival localisation methods typically fail when signals are highly dispersive; hence, algorithms capable of dispersion compensation are sought. This paper presents a method based on the Time-Distance Domain Transform for accurate AE event localisation. The source localisation is found through a minimization problem. The proposed technique focuses on transforming the time signal to the distance domain response that would be recorded at the source. Only basic elastic material properties and plate thickness are used in the approach, avoiding arbitrary parameter tuning.
Modeling Hydrological Extremes in the Anthropocene
NASA Astrophysics Data System (ADS)
Di Baldassarre, Giuliano; Martinez, Fabian; Kalantari, Zahra; Viglione, Alberto
2017-04-01
Hydrological studies have investigated human impacts on hydrological extremes, i.e. droughts and floods, while social studies have explored human responses and adaptation to them. Yet, there is still little understanding of the dynamics resulting from two-way feedbacks, i.e. both impacts and responses. Traditional risk assessment methods therefore fail to assess future dynamics, and thus risk reduction strategies built on these methods can lead to unintended consequences in the medium to long term. Here we review the dynamics resulting from the reciprocal links between society and hydrological extremes, and describe initial efforts to model floods and droughts in the Anthropocene. In particular, we first discuss the need for a novel approach to explicitly account for human interactions with both hydrological extremes, and then present a stylized model simulating the reciprocal effects between droughts, floods and reservoir operation rules. Unprecedented opportunities offered by the growing availability of global data and worldwide archives to uncover the mutual shaping of hydrological extremes and society across places and scales are also discussed.
NASA Astrophysics Data System (ADS)
Kourkoumelis, Christine
2014-04-01
It has been noted in various reports that during recent years there has been an alarming decline in young people's interest in science studies and mathematics. Since traditional teaching methods are believed often to fail to foster positive attitudes towards learning science, the European Commission has made intensive efforts to promote science education in schools through new methods based on the inquiry methodology of learning: questions, search and answers. This should be coupled with laboratories and hands-on experience, structured and scaffolded in a pedagogically meaningful way. "PATHWAY", "Discover the COSMOS" and "ISE" have been providing lesson plans and best practices for teachers and students, and "Go-lab" is working towards an integrated set of on-line labs for large-scale use in science education. In the next sections some concrete examples which aim to bring High Energy Physics (HEP) frontier research to schools will be given.
Surface models for coupled modelling of runoff and sewer flow in urban areas.
Ettrich, N; Steiner, K; Thomas, M; Rothe, R
2005-01-01
Traditional methods fail to simulate the complete flow process in urban areas caused by heavy rainfall, as required by the European Standard EN-752, because the bi-directional coupling between sewer and surface is not properly handled. The new methodology, developed in the EUREKA project RisUrSim, solves this problem by computing the surface runoff on the basis of shallow water equations solved on high-resolution surface grids. Exchange nodes between the sewer and the surface, such as inlets and manholes, are located in the computational grid, and water leaving the sewer in case of surcharge is further distributed on the surface. Dense topographical information is needed to build a model suitable for hydrodynamic runoff calculations; in urban areas, in addition, many line-shaped elements such as houses and curbs guide the runoff of water and require polygonal input. Airborne data collection methods offer a great chance to economically gather densely sampled input data.
Yuan, Yuan; Lin, Jianzhe; Wang, Qi
2016-12-01
Hyperspectral image (HSI) classification is a crucial issue in remote sensing. Accurate classification benefits a large number of applications such as land use analysis and marine resource utilization. But high data correlation makes reliable classification difficult, especially for HSI with abundant spectral information. Furthermore, traditional methods often fail to consider the spatial coherency of HSI, which also limits classification performance. To address these inherent obstacles, a novel spectral-spatial classification scheme is proposed in this paper. The proposed method mainly focuses on multitask joint sparse representation (MJSR) and a stepwise Markov random field framework, which are claimed to be the two main contributions of this procedure. First, the MJSR not only reduces the spectral redundancy, but also retains necessary correlation in the spectral field during classification. Second, the stepwise optimization further explores the spatial correlation that significantly enhances the classification accuracy and robustness. As far as several universal quality evaluation indexes are concerned, the experimental results on the Indian Pines and Pavia University datasets demonstrate the superiority of our method compared with state-of-the-art competitors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oesterling, Patrick; Scheuermann, Gerik; Teresniak, Sven
During the last decades, electronic textual information has become the world's largest and most important information source available. People have added a variety of daily newspapers, books, scientific and governmental publications, blogs and private messages to this wellspring of endless information and knowledge. Since neither the existing nor the new information can be read in its entirety, computers are used to extract and visualize meaningful or interesting topics and documents from this huge information clutter. In this paper, we extend, improve and combine existing individual approaches into an overall framework that supports topological analysis of high dimensional document point clouds given by the well-known tf-idf document-term weighting method. We show that traditional distance-based approaches fail in very high dimensional spaces, and we describe an improved two-stage method for topology-based projections from the original high dimensional information space to both two dimensional (2-D) and three dimensional (3-D) visualizations. To show the accuracy and usability of this framework, we compare it to methods introduced recently and apply it to complex document and patent collections.
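The claim that distance-based approaches fail in very high dimensional spaces reflects the well-known concentration of distances: as dimensionality grows, the nearest and farthest neighbours of a point become almost equidistant. A minimal, self-contained illustration (a generic demonstration with random points, not the authors' tf-idf pipeline):

```python
import math
import random

def pairwise_spread(dim, n_points=50, seed=0):
    """Relative contrast (max - min) / min of pairwise Euclidean distances
    between random points in the unit hypercube of the given dimension."""
    rng = random.Random(seed)
    pts = [[rng.random() for _ in range(dim)] for _ in range(n_points)]
    dists = [math.dist(p, q)
             for i, p in enumerate(pts) for q in pts[i + 1:]]
    return (max(dists) - min(dists)) / min(dists)

# Contrast is large in 2-D but collapses in 1000-D, so ranking documents by
# raw distance carries little information in very high dimensional spaces.
low_d = pairwise_spread(2)
high_d = pairwise_spread(1000)
```

This collapse of contrast is the motivation for replacing raw distances with topology-based projections in the framework above.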
Buckley, Patrick Henry; Belec, John; Levy, Jason
2015-01-01
Great enthusiasm is attached to the emergence of cross-border regions (CBRs) as a new institutional arrangement for dealing with local cross-border environmental resource management and other issues that remain too distant from national capitals and/or too expensive to be addressed in the traditional topocratic manner requiring instead local adhocratic methods. This study briefly discusses the perceived value of CBRs and necessary and sufficient conditions for the successful and sustainable development of such places. Then, assuming that necessary conditions can be met, the study investigates an intriguing hypothesis concerning the catalyzing of sustainable consensus for cross-border resource management based on a game theoretical approach that employs the use of dilemma of common aversion rather than the more traditional dilemma of competing common interests. Using this lens to investigate a series of events on the Pacific northwestern Canadian-American border in a part of the Fraser Lowland, we look for evidence of the emergence of an active and sustainable CBR to address local trans-border resource management issues. Although our micro-level scale fails to conclusively demonstrate such evidence, it does demonstrate the value of using this approach and suggests a number of avenues for further research. PMID:26154660
Wang, Jingang; Gao, Can; Yang, Jie
2014-07-17
Currently available traditional electromagnetic voltage sensors fail to meet the measurement requirements of the smart grid, because of low accuracy in the static and dynamic ranges and the occurrence of ferromagnetic resonance attributed to overvoltage and output short circuit. This work develops a new non-contact high-bandwidth voltage measurement system for power equipment, aimed at miniaturization and non-contact measurement for the smart grid. After analysis of the traditional D-dot voltage probe, an improved method is proposed. For the sensor to work in a self-integrating pattern, the differential input pattern is adopted for circuit design, and grounding is removed. To validate the structural design, circuit component parameters, and insulation characteristics, Ansoft Maxwell software is used for the simulation. Moreover, the new probe was tested on a 10 kV high-voltage test platform for steady-state error and transient behavior. Experimental results ascertain that the root mean square values of measured voltage are precise and that the phase error is small. The D-dot voltage sensor not only meets the requirement of high accuracy but also exhibits satisfactory transient response. This sensor can meet the intelligence, miniaturization, and convenience requirements of the smart grid.
Sublattice parallel replica dynamics.
Martínez, Enrique; Uberuaga, Blas P; Voter, Arthur F
2014-06-01
Exascale computing presents a challenge for the scientific community as new algorithms must be developed to take full advantage of the new computing paradigm. Atomistic simulation methods that offer full fidelity to the underlying potential, i.e., molecular dynamics (MD) and parallel replica dynamics, fail to exploit the full speedup of the machine, leaving a region in time and sample size space that is unattainable with current algorithms. In this paper, we present an extension of the parallel replica dynamics algorithm [A. F. Voter, Phys. Rev. B 57, R13985 (1998)] by combining it with the synchronous sublattice approach of Shim and Amar [Phys. Rev. B 71, 125432 (2005)], thereby exploiting event locality to improve the algorithm scalability. This algorithm is based on a domain decomposition in which events happen independently in different regions in the sample. We develop an analytical expression for the speedup given by this sublattice parallel replica dynamics algorithm and compare it with parallel MD and traditional parallel replica dynamics. We demonstrate how this algorithm, which introduces a slight additional approximation of event locality, enables the study of physical systems unreachable with traditional methodologies and promises to better utilize the resources of current high performance and future exascale computers.
Deep Coupled Integration of CSAC and GNSS for Robust PNT.
Ma, Lin; You, Zheng; Li, Bin; Zhou, Bin; Han, Runqi
2015-09-11
Global navigation satellite systems (GNSS) are the most widely used positioning, navigation, and timing (PNT) technology. However, a GNSS cannot provide effective PNT services in physical blocks, such as in a natural canyon, canyon city, underground, underwater, and indoors. With the development of micro-electromechanical system (MEMS) technology, the chip scale atomic clock (CSAC) gradually matures, and performance is constantly improved. A deep coupled integration of CSAC and GNSS is explored in this thesis to enhance PNT robustness. "Clock coasting" of CSAC provides time synchronized with GNSS and optimizes navigation equations. However, errors of clock coasting increase over time and can be corrected by GNSS time, which is stable but noisy. In this paper, weighted linear optimal estimation algorithm is used for CSAC-aided GNSS, while Kalman filter is used for GNSS-corrected CSAC. Simulations of the model are conducted, and field tests are carried out. Dilution of precision can be improved by integration. Integration is more accurate than traditional GNSS. When only three satellites are visible, the integration still works, whereas the traditional method fails. The deep coupled integration of CSAC and GNSS can improve the accuracy, reliability, and availability of PNT.
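The GNSS-corrected CSAC step described above can be sketched as a one-state Kalman filter tracking the clock bias: prediction models the growth of "clock coasting" error, and the update corrects it with the stable but noisy GNSS time. The noise variances below are illustrative assumptions; the paper's actual filter design and parameters are not given in the abstract.

```python
def kalman_clock(gnss_times, q=1e-2, r=1.0):
    """One-state Kalman filter for a CSAC clock-bias estimate.

    q: process noise variance per step (coasting error growth, assumed).
    r: GNSS time measurement noise variance (assumed).
    Returns the sequence of filtered bias estimates.
    """
    x, p = 0.0, 1.0                 # initial bias estimate and its variance
    estimates = []
    for z in gnss_times:
        p = p + q                   # predict: coasting error accumulates
        k = p / (p + r)             # Kalman gain
        x = x + k * (z - x)         # correct with the GNSS time observation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

Fed a steady GNSS time offset, the filtered bias converges to it while averaging down the measurement noise; this is the sense in which the coupled integration outperforms either time source alone.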
Code of Federal Regulations, 2011 CFR
2011-04-01
... participated in the assessment. (b) Method B—Uniform Averaging Procedure. A school may use uniform averaging... 25 Indians 1 2011-04-01 2011-04-01 false If a school fails to achieve its annual measurable... Adequate Yearly Progress § 30.116 If a school fails to achieve its annual measurable objectives, what other...
Code of Federal Regulations, 2012 CFR
2012-04-01
... participated in the assessment. (b) Method B—Uniform Averaging Procedure. A school may use uniform averaging... 25 Indians 1 2012-04-01 2011-04-01 true If a school fails to achieve its annual measurable... Adequate Yearly Progress § 30.116 If a school fails to achieve its annual measurable objectives, what other...
Code of Federal Regulations, 2013 CFR
2013-04-01
... participated in the assessment. (b) Method B—Uniform Averaging Procedure. A school may use uniform averaging... 25 Indians 1 2013-04-01 2013-04-01 false If a school fails to achieve its annual measurable... Adequate Yearly Progress § 30.116 If a school fails to achieve its annual measurable objectives, what other...
Code of Federal Regulations, 2010 CFR
2010-04-01
... participated in the assessment. (b) Method B—Uniform Averaging Procedure. A school may use uniform averaging... 25 Indians 1 2010-04-01 2010-04-01 false If a school fails to achieve its annual measurable... Adequate Yearly Progress § 30.116 If a school fails to achieve its annual measurable objectives, what other...
Code of Federal Regulations, 2014 CFR
2014-04-01
... participated in the assessment. (b) Method B—Uniform Averaging Procedure. A school may use uniform averaging... 25 Indians 1 2014-04-01 2014-04-01 false If a school fails to achieve its annual measurable... Adequate Yearly Progress § 30.116 If a school fails to achieve its annual measurable objectives, what other...
Relationships between level of spiritual transformation and medical outcome.
Mainguy, Barbara; Valenti Pickren, Michael; Mehl-Madrona, Lewis
2013-01-01
Culturally defined healers operate in most of the world and, to various degrees, blend traditional healing practices with those of the dominant religion in the region. They practice more or less openly and more or less in conjunction with science-based health professionals. Nonindigenous peoples are seeking out these healers more often, especially for conditions that carry dire prognoses, such as cancer, and usually after science-based medicine has failed. Little is known about the medical outcomes of people who seek Native North American healing, which is thought by its practitioners to work largely through spiritual means. This study explored the narratives produced through interviews and writings of people working with traditional Aboriginal healers in Canada to assess the degree of spiritual transformation and to determine whether a relationship might exist between that transformation and subsequent changes in medical outcome. Before and after participation in traditional healing practices, participants were interviewed within a narrative inquiry framework and also wrote stories about their lives, their experiences of working with traditional healers, and the changes that the interactions produced. The study involved a variety of traditional healers who lived in Alberta, Saskatchewan, and Manitoba; the setting comprised urban and rural reserves of the Canadian Prairie Provinces. One hundred fifty non-Native individuals requested help from Dr Mehl-Madrona in finding traditional Aboriginal healing and spiritual practitioners and agreed to participate in this study of the effects of their work with the healers. The healers used methods derived from their specific cultural traditions, though all commonly used storytelling. These methods included traditional Aboriginal ceremonies and sweat lodge ceremonies, as well as other diagnosing ceremonies, such as the shaking tent among the Ojibway or the yuwipi ceremony of the Dakota, Nakota, and Lakota, and sacred-pipe-related practices.
The research team used a combination of grounded theory, modified from a critical constructivist point of view, and narrative analysis to rate the degree of spiritual transformation experienced. Medical outcome was measured by a 5-point Likert scale and was confirmed with medical practitioners and other family members. A 5-year follow-up revealed that 44 of the reports were assessed as showing profound levels of persistent spiritual transformation, defined as a sudden and powerful improvement in the spiritual dimension of their lives. The level of spiritual transformation achieved through interaction with healers was associated in a dose-response relationship with subsequent improvement in medical illness in 134 of 155 people (P < .0001). The degree and intensity of spiritual transformation appeared related to the degree of physical and psychological change among people interacting with traditional North American Indigenous healers. Further research is warranted.
A methodology for probabilistic remaining creep life assessment of gas turbine components
NASA Astrophysics Data System (ADS)
Liu, Zhimin
Certain gas turbine components operate in harsh environments, and various mechanisms may lead to component failure. It is common practice to use remaining life assessments to help operators schedule maintenance and component replacements. Creep is a major failure mechanism that affects the remaining life assessment, and the resulting life consumption of a component is highly sensitive to variations in the material stresses and temperatures, which fluctuate significantly due to changes in real operating conditions. In addition, variations in material properties and geometry will result in changes in creep life consumption rate. The traditional method used for remaining life assessment assumes a set of fixed operating conditions at all times, and it fails to capture the variations in operating conditions. This translates into a significant loss of accuracy and unnecessarily high maintenance and replacement costs. A new method that captures the variations described above and improves the prediction accuracy of remaining life is developed. First, a metamodel is built to approximate the relationship between variables (operating conditions, material properties, geometry, etc.) and a creep response. The metamodel is developed using Response Surface Method/Design of Experiments methodology. Design of Experiments is an efficient sampling method, and for each sampling point a set of finite element analyses are used to compute the corresponding response value. Next, a low order polynomial Response Surface Equation (RSE) is used to fit these values. Four techniques are suggested to dramatically reduce computational effort and to increase the accuracy of the RSE: smart meshing technique, automatic geometry parameterization, screening test and regional RSE refinement. The RSEs, along with a probabilistic method and a life fraction model, are used to compute current damage accumulation and remaining life.
By capturing the variations mentioned above, the new method results in much better accuracy than that available using the traditional method. After further development and proper verification the method should bring significant savings by reducing the number of inspections and deferring part replacement.
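A toy version of the two core steps, fitting a low-order polynomial response surface equation (RSE) to sampled responses and then accumulating creep damage with a life-fraction rule, might look like the following. The response function, its coefficients, and the operating history are invented for illustration; in the actual method the sampled responses come from finite element analyses over a Design of Experiments.

```python
import numpy as np

# Invented stand-in for the FE-computed creep response: log10 of hours to
# failure as a polynomial in normalized stress s and temperature t.
def fe_log_life(s, t):
    return 10.0 - 3.0 * s - 2.0 * t - 1.5 * s * t

# 1) Sample the design space and fit a low-order polynomial RSE by
#    least squares (basis: 1, s, t, s*t).
rng = np.random.default_rng(0)
S, T = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)
X = np.column_stack([np.ones(50), S, T, S * T])
coef, *_ = np.linalg.lstsq(X, fe_log_life(S, T), rcond=None)

def rse_life(s, t):
    """Metamodel prediction: hours to creep failure at (s, t)."""
    return 10 ** (coef @ [1.0, s, t, s * t])

# 2) Life-fraction rule over a varying operating history:
#    damage = sum of (hours spent) / (life at that condition).
history = [(-0.5, 0.2, 1000.0), (0.8, 0.6, 200.0)]   # (stress, temp, hours)
damage = sum(h / rse_life(s, t) for s, t, h in history)
remaining_ok = damage < 1.0   # failure is predicted when damage reaches 1
```

Because each operating segment contributes its own life fraction, the accumulated damage tracks changes in real operating conditions, which is precisely what the fixed-conditions traditional method misses.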
Comparison of 1-step and 2-step methods of fitting microbiological models.
Jewell, Keith
2012-11-15
Previous conclusions that a 1-step fitting method gives more precise coefficients than the traditional 2-step method are confirmed by application to three different data sets. It is also shown that, in comparison to 2-step fits, the 1-step method gives better fits to the data (often substantially) with directly interpretable regression diagnostics and standard errors. The improvement is greatest at extremes of environmental conditions and it is shown that 1-step fits can indicate inappropriate functional forms when 2-step fits do not. 1-step fits are better at estimating primary parameters (e.g. lag, growth rate) as well as concentrations, and are much more data efficient, allowing the construction of more robust models on smaller data sets. The 1-step method can be straightforwardly applied to any data set for which the 2-step method can be used and additionally to some data sets where the 2-step method fails. A 2-step approach is appropriate for visual assessment in the early stages of model development, and may be a convenient way to generate starting values for a 1-step fit, but the 1-step approach should be used for any quantitative assessment. Copyright © 2012 Elsevier B.V. All rights reserved.
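The 1-step versus 2-step distinction can be illustrated with a toy growth model in which the primary growth rate depends linearly on temperature. The model form, parameter values, and noise level below are invented; the point is only that the 1-step fit pools all observations into a single regression, whereas the 2-step fit first estimates a rate per condition and then regresses those rates on temperature.

```python
import numpy as np

rng = np.random.default_rng(1)
temps = [5.0, 15.0, 25.0]                 # storage temperatures (assumed)
times = np.arange(0.0, 10.0, 2.0)         # sampling times
a_true, c0, c1 = 2.0, 0.1, 0.02           # invented "growth" parameters

# Simulated log-counts: y = a + (c0 + c1*T) * time + noise
rows = [(T, t, a_true + (c0 + c1 * T) * t + rng.normal(0, 0.05))
        for T in temps for t in times]
T_, t_, y_ = map(np.array, zip(*rows))

# 1-step: fit primary and secondary models simultaneously on all data.
X = np.column_stack([np.ones_like(t_), t_, T_ * t_])
a1, c0_1, c1_1 = np.linalg.lstsq(X, y_, rcond=None)[0]

# 2-step: fit a growth rate per temperature, then regress rates on T.
slopes = [np.polyfit(times, y_[T_ == T], 1)[0] for T in temps]
c1_2, c0_2 = np.polyfit(temps, slopes, 1)
```

Both routes recover the secondary-model coefficients here, but the 1-step fit uses every observation to constrain every parameter, which is what makes it more data efficient on small or unbalanced data sets.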
Neural net controlled tag gas sampling system for nuclear reactors
Gross, Kenneth C.; Laug, Matthew T.; Lambert, John D. B.; Herzog, James P.
1997-01-01
A method and system for providing a tag gas identifier to a nuclear fuel rod and analyzing escaped tag gas to identify a particular failed nuclear fuel rod. The method and system include disposing a unique tag gas composition into the plenum of a nuclear fuel rod, monitoring gamma ray activity, analyzing gamma ray signals to assess whether a nuclear fuel rod has failed and is emitting tag gas, activating a tag gas sampling and analysis system upon sensing tag gas emission from a failed rod, and evaluating the escaped tag gas to identify the particular failed nuclear fuel rod.
The Causal Effects of Father Absence
McLanahan, Sara; Tach, Laura; Schneider, Daniel
2014-01-01
The literature on father absence is frequently criticized for its use of cross-sectional data and methods that fail to take account of possible omitted variable bias and reverse causality. We review studies that have responded to this critique by employing a variety of innovative research designs to identify the causal effect of father absence, including studies using lagged dependent variable models, growth curve models, individual fixed effects models, sibling fixed effects models, natural experiments, and propensity score matching models. Our assessment is that studies using more rigorous designs continue to find negative effects of father absence on offspring well-being, although the magnitude of these effects is smaller than what is found using traditional cross-sectional designs. The evidence is strongest and most consistent for outcomes such as high school graduation, children’s social-emotional adjustment, and adult mental health. PMID:24489431
NASA Astrophysics Data System (ADS)
Wistisen, Michele
There has been limited success teaching elementary students about the phases of the moon using diagrams, personal observations, and manipulatives. One possible reason for this is that instruction has failed to apply Gestalt principles of perceptual organization to the lesson materials. To see if fourth grade students' understanding could be improved, four lessons were designed and taught using the Gestalt laws of Figure-Ground, Symmetry, and Similarity. Students (n = 54) who were taught lessons applying the Gestalt principles scored 12% higher on an assessment than students (n = 51) who only were taught lessons using the traditional methods. Though scores showed significant improvement, it is recommended to follow the American Association for the Advancement of Science guidelines and wait until 9th grade to instruct students about the phases.
eHealth recruitment challenges.
Thompson, Debbe; Canada, Ashanti; Bhatt, Riddhi; Davis, Jennifer; Plesko, Lisa; Baranowski, Tom; Cullen, Karen; Zakeri, Issa
2006-11-01
Little is known about effective eHealth recruitment methods. This paper presents recruitment challenges associated with enrolling African-American girls aged 8-10 years in an eHealth obesity prevention program, their effect on the recruitment plan, and potential implications for eHealth research. Although the initial recruitment strategy was literature-informed, it failed to enroll the desired number of girls within a reasonable time period. Therefore, the recruitment strategy was reformulated to incorporate principles of social marketing and traditional marketing techniques. The resulting plan included both targeted, highly specific strategies (e.g., selected churches), and more broad-based approaches (e.g., media exposure, mass mailings, radio advertisements). The revised plan enabled recruitment goals to be attained. Media appeared to be particularly effective at reaching the intended audience. Future research should identify the most effective recruitment strategies for reaching potential eHealth audiences.
Walking and talking the tree of life: Why and how to teach about biodiversity.
Ballen, Cissy J; Greene, Harry W
2017-03-01
Taxonomic details of diversity are an essential scaffolding for biology education, yet outdated methods for teaching the tree of life (TOL), as implied by textbook content and usage, are still commonly employed. Here, we show that the traditional approach only vaguely represents evolutionary relationships, fails to denote major events in the history of life, and relies heavily on memorizing near-meaningless taxonomic ranks. Conversely, a clade-based strategy (focused on common ancestry, monophyletic groups, and derived functional traits) is explicitly based on Darwin's "descent with modification," provides students with a rational system for organizing the details of biodiversity, and readily lends itself to active learning techniques. We advocate for a phylogenetic classification that mirrors the TOL, a pedagogical format of increasingly complex but always hierarchical presentations, and the adoption of active learning technologies and tactics.
Mabry, John H.
1993-01-01
The strong tradition of “school room” grammars may have had a negative influence on the reception given a functional analysis of verbal behavior, both within and without the field of behavior analysis. Some of the failings of those traditional grammars, and their largely prescriptive nature were outlined through reference to other critics, and conflicting views. Skinner's own treatment of grammatical issues was presented, emphasizing his view of a functional unit and his use of the autoclitic and intraverbal functions to describe alternatives to a formal or structural analysis. Finally, the relevance of stimulus control variables to some recurring questions about verbal behavior and, specifically grammar, were mentioned. PMID:22477082
Stroope, Samuel
2011-01-01
Feeling that you belong in a group is an important and powerful need. The ability to foster a sense of belonging can also determine whether groups survive. Organizational features of groups cultivate feelings of belonging, yet prior research fails to investigate the idea that belief systems also play a major role. Using multilevel data, this study finds that church members' traditional beliefs, group-level belief unity, and their interaction associate positively with members' sense of belonging. In fact, belief unity can be thought of as a “sacred canopy” under which the relationship between traditional beliefs and feelings of belonging thrives.
Likelihood ratio meta-analysis: New motivation and approach for an old method.
Dormuth, Colin R; Filion, Kristian B; Platt, Robert W
2016-03-01
A 95% confidence interval (CI) in an updated meta-analysis may not have the expected 95% coverage. If a meta-analysis is simply updated with additional data, then the resulting 95% CI will be wrong because it will not have accounted for the fact that the earlier meta-analysis failed or succeeded in excluding the null. This situation can be avoided by using the likelihood ratio (LR) as a measure of evidence that does not depend on type-1 error. We show how an LR-based approach, first advanced by Goodman, can be used in a meta-analysis to pool data from separate studies to quantitatively assess where the total evidence points. The method works by estimating the log-likelihood ratio (LogLR) function from each study. Those functions are then summed to obtain a combined function, which is then used to retrieve the total effect estimate and a corresponding 'intrinsic' confidence interval. Using as illustrations the CAPRIE trial of clopidogrel versus aspirin in the prevention of ischemic events, and our own meta-analysis of higher-potency statins and the risk of acute kidney injury, we show that the LR-based method yields the same point estimate as the traditional analysis, but with an intrinsic confidence interval that is appropriately wider than the traditional 95% CI. The LR-based method can be used to conduct both fixed-effect and random-effects meta-analyses; it can be applied to old and new meta-analyses alike, and results can be presented in a format that is familiar to a meta-analytic audience.
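The pooling recipe described here (estimate each study's LogLR curve, sum the curves, then read off the combined maximum and an interval) can be made concrete with a minimal sketch, assuming a normal approximation to each study's likelihood, a grid search, and a 1/8 support threshold for the interval; the two example studies are invented and none of the numbers come from the paper.

```python
import numpy as np

def combined_loglr(theta_grid, estimates, ses):
    """Sum per-study log-likelihood-ratio (LogLR) curves on a grid.

    Each study's log-likelihood is approximated as normal:
    logL_i(theta) = -0.5 * ((theta - est_i) / se_i) ** 2, up to a constant.
    Each curve is shifted so its own maximum is 0 before summing.
    """
    total = np.zeros_like(theta_grid)
    for est, se in zip(estimates, ses):
        ll = -0.5 * ((theta_grid - est) / se) ** 2
        total += ll - ll.max()
    return total - total.max()  # combined LogLR, maximum 0 at the pooled MLE

def support_interval(theta_grid, loglr, k=8.0):
    """All theta whose likelihood ratio versus the pooled MLE is >= 1/k."""
    inside = theta_grid[loglr >= -np.log(k)]
    return inside.min(), inside.max()

# Two hypothetical studies reporting log odds ratios with standard errors.
grid = np.linspace(-1.0, 1.0, 20001)
loglr = combined_loglr(grid, estimates=[0.10, 0.30], ses=[0.10, 0.15])
mle = grid[np.argmax(loglr)]
lo, hi = support_interval(grid, loglr)
```

Under this normal approximation the pooled estimate coincides with the inverse-variance-weighted mean, and the 1/8 support interval is slightly wider than a 95% Wald interval (about 2.04 vs. 1.96 combined standard errors).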
NASA Astrophysics Data System (ADS)
Jobin, Benoît; Labrecque, Sandra; Grenier, Marcelle; Falardeau, Gilles
2008-01-01
The traditional method of identifying wildlife habitat distribution over large regions consists of pixel-based classification of satellite images into a suite of habitat classes used to select suitable habitat patches. Object-based classification is a newer method that can achieve the same objective based on the segmentation of spectral bands of the image, creating homogeneous polygons with regard to spatial or spectral characteristics. The segmentation algorithm does not rely solely on single pixel values, but also on shape, texture, and pixel spatial continuity. Object-based classification is a knowledge-based process in which an interpretation key is developed using ground control points, and objects are assigned to specific classes according to threshold values of determined spectral and/or spatial attributes. We developed a model using the eCognition software to identify suitable habitats for the Grasshopper Sparrow, a rare and declining species found in southwestern Québec. The model was developed in a region with known breeding sites and applied to other images covering adjacent regions where potential breeding habitats may be present. We were successful in locating potential habitats in areas where dairy farming prevailed but failed in an adjacent region covered by a distinct Landsat scene and dominated by annual crops. We discuss the added value of this method, such as the possibility of using the contextual information associated with objects and the ability to eliminate unsuitable areas in the segmentation and land cover classification processes, as well as technical and logistical constraints. A series of recommendations on the use of this method and on conservation issues of Grasshopper Sparrow habitat is also provided.
Hwang, Jessica P; Roundtree, Aimee K; Suarez-Almazor, Maria E
2012-10-01
We explored attitudes about prevention, screening and treatment of hepatitis B virus (HBV) infection in Chinese, Korean and Vietnamese communities. We used qualitative methods in 12 focus groups (n = 113) of adults who self-reported their ethnicity to be Chinese, Korean, or Vietnamese. We used grounded theory (i.e., consensus-building between co-coders about recurring, emerging themes) for analysis. Diet, nutrition, fatigue and stress were misidentified as HBV causes. Improving hygiene, diet, exercise, and holistic methods were misidentified as viable HBV prevention methods. Common screening problems included not being able to afford the test and not understanding test results. Participants shared reasons for using complementary and alternative medicine: when Western medicine fails or becomes unaffordable. Participants sought information from medical providers and fellow community members, but also from the internet. Many of the attitudes and opinions that emerged may deter participation in HBV screening, prevention and treatment, insofar as community members may factor them into healthcare decision-making, choose alternative but ineffective methods of prevention and treatment, and undervalue the benefits of screening. More patient education in both traditional and new media is necessary for clarifying transmission, screening and treatment misunderstandings.
NASA Astrophysics Data System (ADS)
Patel, Ravi; Kong, Bo; Capecelatro, Jesse; Fox, Rodney; Desjardins, Olivier
2017-11-01
Particle-laden turbulent flows are important features of many environmental and industrial processes. Euler-Euler (EE) simulations of these flows are more computationally efficient than Euler-Lagrange (EL) simulations. However, traditional EE methods, such as the two-fluid model, cannot faithfully capture dilute regions of flow with finite Stokes number particles. For this purpose, the multi-valued nature of the particle velocity field must be treated with a polykinetic description. Various quadrature-based moment methods (QBMM) can be used to approximate the full kinetic description by solving for a set of moments of the particle velocity distribution function (VDF) and providing closures for the higher-order moments. Early QBMM fail to maintain the strict hyperbolicity of the kinetic equations, producing unphysical delta shocks (i.e., mass accumulation at a point). In previous work, a 2-D conditional hyperbolic quadrature method of moments (CHyQMOM) was proposed as a fourth-order QBMM closure that maintains strict hyperbolicity. Here, we present the 3-D extension of CHyQMOM. We compare results from CHyQMOM to other QBMM and EL in the context of particle trajectory crossing, cluster-induced turbulence, and particle-laden channel flow. NSF CBET-1437903.
Nevo, Daniel; Nishihara, Reiko; Ogino, Shuji; Wang, Molin
2017-08-04
In the analysis of time-to-event data with multiple causes using a competing risks Cox model, often the cause of failure is unknown for some of the cases. The probability of a missing cause is typically assumed to be independent of the cause given the time of the event and covariates measured before the event occurred. In practice, however, the underlying missing-at-random assumption does not necessarily hold. Motivated by colorectal cancer molecular pathological epidemiology analysis, we develop a method to conduct valid analysis when additional auxiliary variables are available for cases only. We consider a weaker missing-at-random assumption, with missing pattern depending on the observed quantities, which include the auxiliary covariates. We use an informative likelihood approach that will yield consistent estimates even when the underlying model for missing cause of failure is misspecified. The superiority of our method over naive methods in finite samples is demonstrated by simulation study results. We illustrate the use of our method in an analysis of colorectal cancer data from the Nurses' Health Study cohort, where, apparently, the traditional missing-at-random assumption fails to hold.
Ang, H H
2008-06-01
The Drug Control Authority (DCA) of Malaysia implemented the phase three registration of traditional medicines on 1 January 1992. A total of 100 products in various pharmaceutical dosage forms of a herbal preparation containing Eugenia dyeriana, either single or combined preparations (more than one medicinal plant), were analyzed for the presence of lead contamination, using atomic absorption spectrophotometry. These samples were bought from different commercial sources in the Malaysian market, using simple random sampling. Results showed that 22% of the above products failed to comply with the quality requirement for traditional medicines in Malaysia. Although 78% of the products fully complied with the quality requirement pertaining to lead, they cannot be assumed safe from lead contamination because of batch-to-batch inconsistency.
Closing Intelligence Gaps: Synchronizing the Collection Management Process
information flow. The US military divides the world into six distinct geographic areas with corresponding commanders managing risk and weighing...analyzed information, creating a mismatch between supply and demand. The result is a burden on all facets of the intelligence process. However, if the target...system, or problem requiring analysis is not collected, intelligence fails. Executing collection management under the traditional tasking process
Maintaining research traditions on place: diversity of thought and scientific progress
Michael E. Patterson; Daniel R. Williams
2005-01-01
Since the 1990s, numerous authors have expressed concerns about lack of conceptual clarity in research on place. Some authors suggest that place research has failed to evolve into a systematic and coherent body of knowledge. We believe recent critiques do not adequately characterize the state of knowledge in place research, but responding to the issues raised requires...
ERIC Educational Resources Information Center
Slover, Ed; Mandernach, Jean
2018-01-01
While it is well-established that nontraditional students are more likely to take online courses than their traditional-age counterparts, investigations of the learning equivalence between online and campus-based instruction typically fail to consider student age as a mediating factor in the learning experience. To examine learning outcomes as a…
ERIC Educational Resources Information Center
National Center for Educational Achievement, 2010
2010-01-01
Many policymakers and education leaders have embraced the Advanced Placement (AP) Program as a tool to strengthen the high school curriculum and prepare students for college. The popularity of the AP program among these policy leaders reflects their belief that the traditional high school curriculum has often failed to provide rigorous courses…
NCI, FNLCR Help Launch Pediatric MATCH Precision Medicine Trial | Poster
The National Cancer Institute and Children’s Oncology Group recently opened enrollment for a new Phase II trial of personalized precision cancer therapies. Called the Pediatric Molecular Analysis for Therapy Choice (Pediatric MATCH), the trial seeks to treat children and adolescents aged 1–21 whose solid tumors have failed to respond to or re-emerged after traditional cancer treatments.
School Crisis Management: A Model of Dynamic Responsiveness to Crisis Life Cycle
ERIC Educational Resources Information Center
Liou, Yi-Hwa
2015-01-01
Purpose: This study aims to analyze a school's crisis management and explore emerging aspects of its response to a school crisis. Traditional linear modes of analysis often fail to address complex crisis situations. The present study applied a dynamic crisis life cycle model that draws on chaos and complexity theory to a crisis management case,…
ERIC Educational Resources Information Center
Aston, Candice N.
2017-01-01
Traditionally, many of the problems experienced by Black girls were overshadowed by the ongoing crises facing Black Males. Although important, the focus on Blackness and masculinity often implicitly leaves young Black girls on the sidelines and fails to recognize their unique obstacles. Fortunately, there has been a new surge of social concern…
The Experience of Knowledge. Essays on American Education.
ERIC Educational Resources Information Center
Gardner, John Fentress
Part one of this book of essays concerns the need to bring thinking to life and the need for a spiritual concept of man. It argues that both traditional and progressive educators fail children by not giving them a full sense of their humanity. It denies a conception of the universe divorced from idealism and founded only in scientific inquiry.…
Shape Distributions of Nonlinear Dynamical Systems for Video-Based Inference.
Venkataraman, Vinay; Turaga, Pavan
2016-12-01
This paper presents a shape-theoretic framework for dynamical analysis of nonlinear dynamical systems that appear frequently in several video-based inference tasks. Traditional approaches to dynamical modeling have included linear and nonlinear methods with their respective drawbacks. A novel approach we propose is the use of descriptors of the shape of the dynamical attractor as a feature representation of the nature of the dynamics. The proposed framework has two main advantages over traditional approaches: a) representation of the dynamical system is derived directly from the observational data, without any inherent assumptions, and b) the proposed features show stability under different time-series lengths where traditional dynamical invariants fail. We illustrate our idea using nonlinear dynamical models such as Lorenz and Rossler systems, where our feature representations (shape distribution) support our hypothesis that the local shape of the reconstructed phase space can be used as a discriminative feature. Our experimental analyses on these models also indicate that the proposed framework shows stability for different time-series lengths, which is useful when the available number of samples is small or variable. The specific applications of interest in this paper are: 1) activity recognition using motion capture and RGBD sensors, 2) activity quality assessment for applications in stroke rehabilitation, and 3) dynamical scene classification. We provide experimental validation through action and gesture recognition experiments on motion capture and Kinect datasets. In all these scenarios, we show experimental evidence of the favorable properties of the proposed representation.
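One way to make the idea concrete: reconstruct the attractor from a scalar series by time-delay embedding, then summarize its shape by a histogram of pairwise distances between sampled points. This is a minimal sketch of one simple shape-distribution variant; the embedding parameters, bin count, and test signals are our choices, not the paper's.

```python
import numpy as np

def delay_embed(x, dim=3, tau=25):
    """Time-delay (Takens) embedding of a scalar series into R^dim."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def shape_distribution(points, bins=32, n_pairs=5000, seed=0):
    """Histogram of pairwise distances between randomly sampled points
    of the reconstructed attractor (a D2-style shape descriptor)."""
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(points), n_pairs)
    j = rng.integers(0, len(points), n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    hist, _ = np.histogram(d, bins=bins, density=True)
    return hist

# Signals with different dynamics yield different shape histograms.
t = np.linspace(0, 40 * np.pi, 4000)
h_periodic = shape_distribution(delay_embed(np.sin(t)))
h_quasi = shape_distribution(delay_embed(np.sin(t) + 0.5 * np.sin(1.7 * t)))
```

The histograms can then be fed to any standard classifier as fixed-length feature vectors, which is the sense in which the attractor's shape becomes a discriminative feature.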
Arthrodesis Using Pedicled Fibular Flap After Failed Infected Knee Arthroplasty
Minear, Steve C.; Lee, Gordon; Kahn, David; Goodman, Stuart
2011-01-01
Objective: Severe bone loss associated with failed revision total knee arthroplasty is a challenging scenario. The pedicled fibular flap is a method to obtain vascularized bone for use in knee arthrodesis after failure of a total knee arthroplasty, with substantial loss of bone. Methods: We report 2 successful knee arthrodeses using this method in patients with infected, failed multiply revised total knee arthroplasties. The failed prosthesis was removed, and the bones were aligned and stabilized. The fibular flap was then harvested, fed through a subcutaneous tunnel, and placed within the medullary canal at the arthrodesis site. The soft tissue was closed over the grafts and flaps. Results: Two elderly women presented with pain and drainage from previous total knee arthroplasties after multiple revisions. Arthrodeses were performed as described, and both patients were pain-free with the knee fused at 1 year. Conclusions: Thus, pedicled vascularized flaps are a viable alternative in the treatment of failed revision arthroplasty with large segmental bone loss. PMID:22132250
A success paradigm for project managers in the aerospace industry
NASA Astrophysics Data System (ADS)
Bauer, Barry Jon
Within the aerospace industry, project managers traditionally have been selected based on their technical competency. While this may lead to brilliant technical solutions to customer requirements, a lack of management ability can result in failed programs that overrun costs, miss critical-path schedules, fail to fully utilize the diversity of talent available within the program team, and otherwise disappoint key stakeholders. This research study identifies the key competencies that a project manager should possess in order to successfully lead and manage a project in the aerospace industry. The research presents evidence that, within the aerospace industry, management competency is perceived as more important to project management success than technical competence alone.
Confusing placebo effect with natural history in epilepsy: A big data approach.
Goldenholz, Daniel M; Moss, Robert; Scott, Jonathan; Auh, Sungyoung; Theodore, William H
2015-09-01
For unknown reasons, placebos reduce seizures in clinical trials in many patients. It is also unclear why some drugs showing statistical superiority to placebo in one trial may fail to do so in another. Using Seizuretracker.com, a patient-centered database of 684,825 seizures, we simulated "placebo" and "drug" trials. These simulations were employed to clarify the sources of placebo effects in epilepsy, and to identify methods of diminishing placebo effects. Simulation 1 included 9 trials with a 6-week baseline and 6-week test period, starting at time 0, 3, 6…24 months. Here, "placebo" reduced seizures regardless of study start time. Regression-to-the-mean persisted only for 3 to 6 months. Simulation 2 comprised a 6-week baseline and then 2 years of follow-up. Seizure frequencies continued to improve throughout follow-up. Although the group improved, individuals switched from improvement to worsening and back. Simulation 3 involved a placebo-controlled "drug" trial, to explore methods of placebo response reduction. An efficacious "drug" failed to demonstrate a significant effect compared with "placebo" (p = 0.12), although modifications either in study start time (p = 0.025) or baseline population reduction (p = 0.0028) allowed the drug to achieve a statistically significant effect compared with placebo. In epilepsy clinical trials, some seizure reduction traditionally attributed to placebo effect may reflect the natural course of the disease itself. Understanding these dynamics will allow future investigations into optimal clinical trial design and may lead to identification of more effective therapies. Ann Neurol 2015;78:329-336. © 2015 American Neurological Association.
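The regression-to-the-mean component of this apparent "placebo response" is easy to reproduce in a toy simulation: give each simulated patient a fixed seizure rate, enroll only those with a high baseline count, and the enrolled group improves on average even though nothing was administered. The cohort size, rate distribution, and enrollment cutoff below are invented for illustration, not taken from the SeizureTracker data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cohort: each patient has a fixed true weekly seizure rate
# drawn from a skewed distribution; nothing changes between periods.
n_patients = 20000
true_rate = rng.gamma(shape=1.0, scale=2.0, size=n_patients)  # seizures/week

weeks = 6
baseline = rng.poisson(true_rate * weeks)  # 6-week baseline diary counts
test = rng.poisson(true_rate * weeks)      # 6-week "treatment" period, no drug

# Trials typically require a minimum baseline frequency (here >= 2/week).
enrolled = baseline >= 12

# Enrolled patients improve on average purely by regression to the mean:
# selection on a high baseline picks out upward count fluctuations.
reduction = 1.0 - test[enrolled].mean() / baseline[enrolled].mean()
```

Because the true rates never change, any average improvement in the enrolled group is pure selection artifact, which is exactly the confound the abstract attributes to natural history rather than placebo effect.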
Ventilation in the patient with unilateral lung disease.
Thomas, A R; Bryce, T L
1998-10-01
Severe ULD presents a challenge in ventilator management because of the marked asymmetry in the mechanics of the two lungs. The asymmetry may result from significant decreases or increases in the compliance of the involved lung. Traditional ventilator support may fail to produce adequate gas exchange in these situations and has the potential to cause further deterioration. Fortunately, conventional techniques can be safely and effectively applied in the majority of cases without having to resort to less familiar and potentially hazardous forms of support. In those circumstances when conventional ventilation is unsuccessful in restoring adequate gas exchange, lateral positioning and ILV have proved effective at improving and maintaining gas exchange. Controlled trials to guide clinical decision making are lacking. In patients who have processes associated with decreased compliance in the involved lung, lateral positioning may be a simple method of improving gas exchange but is associated with many practical limitations. ILV in these patients is frequently successful when differential PEEP is applied with the higher pressure to the involved lung. In patients in whom the pathology results in distribution of ventilation favoring the involved lung, particularly BPF, ILV can be used to supply adequate support while minimizing flow through the fistula and allowing it to close. The application of these techniques should be undertaken with an understanding of the pathophysiology of the underlying process; the reported experience with these techniques, including indications and successfully applied methods; and the potential problems encountered with their use. Fortunately, these modalities are infrequently required, but they provide a critical means of support when conventional techniques fail.
Neural net controlled tag gas sampling system for nuclear reactors
Gross, K.C.; Laug, M.T.; Lambert, J.B.; Herzog, J.P.
1997-02-11
A method and system are disclosed for providing a tag gas identifier to a nuclear fuel rod and analyzing escaped tag gas to identify a particular failed nuclear fuel rod. The method and system include disposing a unique tag gas composition into the plenum of a nuclear fuel rod, monitoring gamma ray activity, analyzing gamma ray signals to assess whether a nuclear fuel rod has failed and is emitting tag gas, activating a tag gas sampling and analysis system upon sensing tag gas emission from a failed nuclear rod, and evaluating the escaped tag gas to identify the particular failed nuclear fuel rod. 12 figs.
Classification between Failed Nodes and Left Nodes in Mobile Asset Tracking Systems †
Kim, Kwangsoo; Jin, Jae-Yeon; Jin, Seong-il
2016-01-01
Medical asset tracking systems track a medical device with a mobile node and determine its status as either in or out, because the device can leave the monitoring area. Due to a failed node, such a system may decide that a mobile asset is outside the area even though it is within it. In this paper, an efficient classification method is proposed to separate mobile nodes disconnected from a wireless sensor network into nodes with faults and nodes that have actually left the monitoring region. The proposed scheme uses two trends extracted from the neighboring nodes of a disconnected mobile node. The first is the trend in the series of neighbor counts; the second is the trend in the ratios of boundary nodes included among the neighbors. Based on these trends, the proposed method separates failed nodes from mobile nodes that are disconnected from the wireless sensor network without failures. The proposed method is evaluated using both real data generated from a medical asset tracking system and simulations with the network simulator (ns-2). The experimental results show that the proposed method correctly differentiates between failed nodes and nodes that are no longer in the monitoring region, including cases that conventional methods fail to detect. PMID:26901200
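The two trends lend themselves to a simple heuristic sketch: fit a slope to the recent neighbor counts and to the boundary-node ratios, and classify from the signs of the slopes (a node heading out of the area loses neighbors while those it keeps are increasingly boundary nodes). The slope thresholds and readings below are invented for illustration and are not the paper's algorithm.

```python
import numpy as np

def classify_disconnected_node(neighbor_counts, boundary_ratios):
    """Heuristic sketch: decide whether a disconnected mobile node
    failed inside the area or left it.

    neighbor_counts : neighbor counts over the last readings
    boundary_ratios : fraction of those neighbors that are boundary
                      nodes, at the same time points

    A node leaving the region should show a falling neighbor count and
    a rising boundary ratio; a failed node should show neither trend.
    """
    t = np.arange(len(neighbor_counts))
    count_slope = np.polyfit(t, neighbor_counts, 1)[0]   # least-squares slope
    ratio_slope = np.polyfit(t, boundary_ratios, 1)[0]
    if count_slope < -0.5 and ratio_slope > 0.05:        # illustrative cutoffs
        return "left"
    return "failed"

# Node drifting toward the boundary, then out of range:
print(classify_disconnected_node([9, 8, 6, 4, 2], [0.1, 0.2, 0.4, 0.6, 0.9]))  # left
# Node that dies in place: stable readings until it stops reporting:
print(classify_disconnected_node([8, 9, 8, 8, 9], [0.2, 0.1, 0.2, 0.2, 0.1]))  # failed
```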
Enhanced capital-asset pricing model for the reconstruction of bipartite financial networks.
Squartini, Tiziano; Almog, Assaf; Caldarelli, Guido; van Lelyveld, Iman; Garlaschelli, Diego; Cimini, Giulio
2017-09-01
Reconstructing patterns of interconnections from partial information is one of the most important issues in the statistical physics of complex networks. A paramount example is provided by financial networks. In fact, the spreading and amplification of financial distress in capital markets are strongly affected by the interconnections among financial institutions. Yet, while the aggregate balance sheets of institutions are publicly disclosed, information on single positions is mostly confidential and, as such, unavailable. Standard approaches to reconstructing the network of financial interconnections produce unrealistically dense topologies, leading to a biased estimation of systemic risk. Moreover, reconstruction techniques are generally designed for monopartite networks of bilateral exposures between financial institutions, thus failing to reproduce bipartite networks of security holdings (e.g., investment portfolios). Here we propose a reconstruction method based on constrained entropy maximization, tailored for bipartite financial networks. Such a procedure enhances the traditional capital-asset pricing model (CAPM) and allows us to reproduce the correct topology of the network. We test this enhanced CAPM (ECAPM) method on a dataset, collected by the European Central Bank, of detailed security holdings of European institutional sectors over a period of six years (2009-2015). Our approach outperforms the traditional CAPM and the recently proposed maximum-entropy CAPM both in reproducing the network topology and in estimating systemic risk due to fire-sale spillovers. In general, ECAPM can be applied to the whole class of weighted bipartite networks described by the fitness model.
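The final sentence refers to the fitness model for bipartite networks, in which the probability of a link between institution i and security j grows with their "fitnesses" (e.g., total holdings). Below is a minimal sketch of just the topology-calibration step under that ansatz: a single parameter z is tuned so the expected number of links matches an observed total. The fitness values are made up, and the ECAPM weighting of link values is not included.

```python
import numpy as np

def expected_links(z, x, y):
    """Expected link count under p_ij = z*x_i*y_j / (1 + z*x_i*y_j)."""
    p = z * np.outer(x, y)
    return (p / (1.0 + p)).sum()

def calibrate_z(x, y, target_links, lo=1e-12, hi=1e12, iters=200):
    """Bisect (in log space, since z can span many decades) for the z
    that makes the expected number of links equal the observed total."""
    for _ in range(iters):
        mid = np.sqrt(lo * hi)
        if expected_links(mid, x, y) < target_links:
            lo = mid
        else:
            hi = mid
    return np.sqrt(lo * hi)

# Hypothetical fitnesses: e.g. total holdings of 4 investors, 3 securities.
x = np.array([10.0, 5.0, 2.0, 1.0])
y = np.array([8.0, 3.0, 1.0])
L = 6.0  # observed number of links the reconstruction must match on average

z = calibrate_z(x, y, L)
P = z * np.outer(x, y) / (1.0 + z * np.outer(x, y))  # link probabilities
```

Because every p_ij stays strictly between 0 and 1, sampled networks are sparse rather than fully connected, which is the property the abstract contrasts with standard dense reconstructions.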
Integrity of Disposable Nitrile Exam Gloves Exposed to Simulated Movement
Phalen, Robert N.; Wong, Weng Kee
2011-01-01
Every year, millions of health care, first responder, and industry workers are exposed to chemical and biological hazards. Disposable nitrile gloves are a common choice as both a chemical and physical barrier to these hazards, especially as an alternative to natural latex gloves. However, glove selection is complicated by the availability of several types or formulations of nitrile gloves, such as low-modulus, medical-grade, low-filler, and cleanroom products. This study evaluated the influence of simulated movement on the physical integrity (i.e., holes) of different nitrile exam glove brands and types. Thirty glove products were evaluated out-of-box and after exposure to simulated whole-glove movement for 2 hr. In lieu of the traditional 1-L water-leak test, a modified water-leak test, standardized to detect a 0.15 ± 0.05 mm hole in different regions of the glove, was developed. A specialized air inflation method simulated bidirectional stretching and whole-glove movement. A worst-case scenario with maximum stretching was evaluated. On average, movement did not have a significant effect on glove integrity (chi-square; p=0.068). The average effect was less than 1% between no movement (1.5%) and movement (2.1%) exposures. However, there was significant variability in glove integrity between different glove types (p ≤ 0.05). Cleanroom gloves, on average, had the highest percentage of leaks, and 50% failed the water-leak test. Low-modulus and medical-grade gloves had the lowest percentages of leaks, and no products failed the water-leak test. Variability in polymer formulation was suspected to account for the observed discrepancies, as well as the inability of the traditional 1-L water-leak test to detect holes in finger/thumb regions. Unexpectedly, greater than 80% of the glove defects were observed in the finger and thumb regions. It is recommended that existing water-leak tests be re-evaluated and standardized to account for product variability. PMID:21476169
Why conventional detection methods fail in identifying the existence of contamination events.
Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han
2016-04-15
Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading.
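As a point of reference for how a Euclidean-distance (MED-style) detector operates, here is a minimal sketch that flags a reading when its normalized distance from a moving baseline window exceeds a threshold; such detectors respond well to the sudden spike-like variation described above. The window size, threshold, and synthetic data are illustrative assumptions, not the calibrated method from the study.

```python
import numpy as np

def med_detector(samples, window=24, threshold=4.0):
    """Flag a reading when its Euclidean distance from the mean of the
    previous `window` readings, normalized per parameter by that
    window's standard deviation, exceeds `threshold`."""
    samples = np.asarray(samples, dtype=float)
    flags = np.zeros(len(samples), dtype=bool)
    for t in range(window, len(samples)):
        hist = samples[t - window:t]
        mu, sd = hist.mean(axis=0), hist.std(axis=0) + 1e-9
        flags[t] = np.linalg.norm((samples[t] - mu) / sd) > threshold
    return flags

# Synthetic readings (e.g. chlorine, turbidity, pH) with a spike at t = 40.
rng = np.random.default_rng(1)
data = rng.normal([0.8, 1.0, 7.2], 0.02, size=(60, 3))
data[40] += [0.3, -0.4, 0.5]  # sudden multi-parameter deviation
flags = med_detector(data)
```

A slow drift spread over many samples would shift the window mean along with the signal and produce no large single-step distance, illustrating why spike-oriented detectors can miss gradual real-world contamination.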
Zhang, Wenping; Yie, Shangmian; Yue, Bisong; Zhou, Jielong; An, Renxiong; Yang, Jiangdong; Chen, Wangli; Wang, Chengdong; Zhang, Liang; Shen, Fujun; Yang, Guangyou; Hou, Rong; Zhang, Zhihe
2012-01-01
It has been recognized that other than habitat loss, degradation and fragmentation, the infection of the roundworm Baylisascaris schroederi (B. schroederi) is one of the major causes of death in wild giant pandas. However, the prevalence and intensity of the parasite infection has been inconsistently reported through a method that uses sedimentation-floatation followed by a microscope examination. This method fails to accurately determine infection because there are many bamboo residues and/or few B. schroederi eggs in the examined fecal samples. In the present study, we adopted a method that uses PCR and capillary electrophoresis combined with a single-strand conformation polymorphism analysis (PCR/CE-SSCP) to detect B. schroederi infection in wild giant pandas at a nature reserve, and compared it to the traditional microscope approach. The PCR specifically amplified a single band of 279-bp from both fecal samples and positive controls, which was confirmed by sequence analysis to correspond to the mitochondrial COII gene of B. schroederi. Moreover, it was demonstrated that the amount of genomic DNA was linearly correlated with the peak area of the CE-SSCP analysis. Thus, our adopted method can reliably detect the infectious prevalence and intensity of B. schroederi in wild giant pandas. The prevalence of B. schroederi was found to be 54% in the 91 fecal samples examined, and 48% in the fecal samples of 31 identified individual giant pandas. Infectious intensities of the 91 fecal samples were detected to range from 2.8 to 959.2 units/gram, and from 4.8 to 959.2 units/gram in the fecal samples of the 31 identified giant pandas. For comparison, by using the traditional microscope method, the prevalence of B. schroederi was found to be only 33% in the 91 fecal samples, 32% in the fecal samples of the 31 identified giant pandas, and no reliable infectious intensity was observed. PMID:22911871
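The reported linear correlation between genomic DNA amount and CE-SSCP peak area is what allows peak areas to be converted into infection intensities. A minimal sketch of such a standard-curve calibration, with invented calibration points rather than data from the study:

```python
import numpy as np

# Hypothetical calibration: peak areas measured for known amounts of
# B. schroederi genomic DNA (values invented for illustration).
dna_units = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
peak_area = np.array([10.5, 21.0, 41.8, 84.1, 167.9])

# Fit the linear standard curve: area = slope * dna + intercept.
slope, intercept = np.polyfit(dna_units, peak_area, 1)

def dna_from_peak(area):
    """Invert the standard curve to estimate DNA amount (units/gram)."""
    return (area - intercept) / slope
```

Once calibrated, the peak area of an unknown fecal sample can be read off the curve to report infection intensity in units/gram, which is how quantitative results in the range reported above could be obtained.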
ERIC Educational Resources Information Center
White-Johnson, Adair F.
2001-01-01
Explored the perceptions of African American male students who chose to leave a traditional academic setting for an alternative education program within the same setting. Data from students, parents, and teachers indicated that teachers and educational leaders failed to incorporate the relevancy of African American male students into the classroom…
ERIC Educational Resources Information Center
Wilson, Vicki A.; Onwuegbuzie, Anthony J.
The traditional "black box" approach to evaluation of assignments in educational research courses has at least two effects: (1) products that fail to meet the expectations of the instructor; and (2) frustration on the part of students who do not know exactly what is expected, and who are consequently confused about or disappointed in the grades…
ERIC Educational Resources Information Center
Herrera-Ruiz, Octavio
2012-01-01
Peer-to-Peer (P2P) technology has emerged as an important alternative to the traditional client-server communication paradigm to build large-scale distributed systems. P2P enables the creation, dissemination and access to information at low cost and without the need of dedicated coordinating entities. However, existing P2P systems fail to provide…
Visual Purple, the Next Generation Crisis Management Decision Training Tool
2001-09-01
talents of professional Hollywood screenwriters during the scripting and writing process of the simulations. Additionally, cinematic techniques learned...cultural, and language experts for research development. Additionally, GTA provides country specific support in script writing and cinematic resources as...The result is an entirely new dimension of realism that traditional exercises often fail to capture. The scenario requires the participant to make the
ERIC Educational Resources Information Center
Rodwell, G. W.
2009-01-01
In 2000, Paula Wriedt, the Tasmanian Minister for Education, gave instructions for her department to begin the development of a K to 10 statewide curriculum, soon to become known as the Essential Learnings Framework, or simply, ELs. The curriculum was an integrated one, doing away with traditional subjects, or disciplines, such as mathematics,…
A New Ghost Cell/Level Set Method for Moving Boundary Problems: Application to Tumor Growth
Macklin, Paul
2011-01-01
In this paper, we present a ghost cell/level set method for the evolution of interfaces whose normal velocity depends upon the solutions of linear and nonlinear quasi-steady reaction-diffusion equations with curvature-dependent boundary conditions. Our technique includes a ghost cell method that accurately discretizes normal derivative jump boundary conditions without smearing jumps in the tangential derivative; a new iterative method for solving linear and nonlinear quasi-steady reaction-diffusion equations; an adaptive discretization to compute the curvature and normal vectors; and a new discrete approximation to the Heaviside function. We present numerical examples that demonstrate better than 1.5-order convergence for problems where traditional ghost cell methods either fail to converge or attain at best sub-linear accuracy. We apply our techniques to a model of tumor growth in complex, heterogeneous tissues that consists of a nonlinear nutrient equation and a pressure equation with geometry-dependent jump boundary conditions. We simulate the growth of glioblastoma (an aggressive brain tumor) into a large, 1 cm square of brain tissue that includes heterogeneous nutrient delivery and varied biomechanical characteristics (white matter, gray matter, cerebrospinal fluid, and bone), and we observe growth morphologies that are highly dependent upon the variations of the tissue characteristics—an effect observed in real tumor growth. PMID:21331304
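The abstract does not reproduce the paper's new discrete Heaviside approximation, but the role such an approximation plays can be illustrated with the classical smoothed Heaviside commonly used in level set methods: it transitions from 0 to 1 across a band of width 2*eps around the interface (the zero level set). A minimal sketch, not Macklin's construction:

```python
import math

def smoothed_heaviside(phi, eps):
    """Classical smoothed Heaviside H_eps(phi): 0 well outside the
    interface, 1 well inside, with a smooth transition of width 2*eps."""
    if phi <= -eps:
        return 0.0
    if phi >= eps:
        return 1.0
    return 0.5 * (1.0 + phi / eps + math.sin(math.pi * phi / eps) / math.pi)

# eps is usually a small multiple of the grid spacing, e.g. 1.5 cells.
eps = 1.5
samples = [smoothed_heaviside(phi, eps) for phi in (-3.0, -1.5, 0.0, 1.5, 3.0)]
```

The smoothed Heaviside is what lets quantities defined on one side of a moving interface (e.g., nutrient sources inside the tumor) be represented on a fixed grid without tracking the boundary explicitly.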
Religio-Spiritual Participation in Two American Indian Populations
Garroutte, Eva Marie; Anderson, Heather Orton; Nez-Henderson, Patricia; Croy, Calvin; Beals, Janette; Henderson, Jeffrey A.; Thomas, Jacob; Manson, Spero M.
2015-01-01
Following a previous investigation of religio-spiritual beliefs in American Indians, this article examined prevalence and correlates of religio-spiritual participation in two tribes in the Southwest and Northern Plains (N = 3,084). Analysis suggested a “religious profile” characterized by strong participation across three traditions: aboriginal, Christian, and Native American Church. However, sociodemographic variables that have reliably predicted participation in the general American population, notably gender and age, frequently failed to achieve significance in multivariate analyses for each tradition. Religio-spiritual participation was strongly and significantly related to belief salience for all traditions. Findings suggest that correlates of religious participation may be unique among American Indians, consistent with their distinctive religious profile. Results promise to inform researchers’ efforts to understand and theorize about religio-spiritual behavior. They also provide tribal communities with practical information that might assist them in harnessing social networks to confront collective challenges through community-based participatory research collaborations. PMID:26582964
The Influence of Culture on the International Management of Shark Finning
NASA Astrophysics Data System (ADS)
Dell'Apa, Andrea; Chad Smith, M.; Kaneshiro-Pineiro, Mahealani Y.
2014-08-01
Shark finning is prohibited in many countries, but high prices for fins from the Asian market help maintain the international black market and poaching. Traditional shark fin bans fail to recognize that the main driver of fin exploitation is linked to cultural beliefs about sharks in traditional Chinese culture. Therefore, shark finning should be addressed through a social science approach as part of the fishery management scheme. This paper investigates the cultural significance of sharks in traditional Chinese and Hawaiian cultures, as valuable examples of how specific differences in cultural beliefs can drive individuals' attitudes toward the practice of shark finning. We suggest the use of a social science approach that can be useful in the design of successful education campaigns to help change individuals' attitudes toward shark fin consumption. Finally, alternative management strategies for commercial fishers are provided to maintain self-sustainability of local coastal communities.
The History of the Internet Search Engine: Navigational Media and the Traffic Commodity
NASA Astrophysics Data System (ADS)
van Couvering, E.
This chapter traces the economic development of the search engine industry over time, beginning with the earliest Web search engines and ending with the domination of the market by Google, Yahoo! and MSN. Specifically, it focuses on the ways in which search engines are similar to and different from traditional media institutions, and how the relations between traditional and Internet media have changed over time. In addition to its historical overview, a core contribution of this chapter is the analysis of the industry using a media value chain based on audiences rather than on content, and the development of traffic as the core unit of exchange. It shows that traditional media companies failed when they attempted to create vertically integrated portals in the late 1990s, based on the idea of controlling Internet content, while search engines succeeded in creating huge "virtually integrated" networks based on control of Internet traffic rather than Internet content.
How to reconcile the multiculturalist and universalist approaches to science education
NASA Astrophysics Data System (ADS)
Hansson, Sven Ove
2017-06-01
The "multiculturalist" and "universalist" approaches to science education both fail to recognize the strong continuities between modern science and its forerunners in traditional societies. Various fact-finding practices in indigenous cultures exhibit the hallmarks of scientific investigations, such as collectively achieved rationality, a careful distinction between facts and values, a search for shared, well-founded judgments in empirical matters, and strivings for continuous improvement of these judgments. Prominent examples are hunters' discussions when tracking a prey, systematic agricultural experiments performed by indigenous farmers, and remarkably advanced experiments performed by craftspeople long before the advent of modern science. When the continuities between science and these prescientific practices are taken into account, it becomes obvious that the traditional forms of both multiculturalism and universalism should be replaced by a new approach that dissolves the alleged conflict between adherence to modern science and respect for traditional cultures.
Wang, Yanjie; Wang, Yanli; Sun, Xiaodong; Caiji, Zhuoma; Yang, Jingbiao; Cui, Di; Cao, Guilan; Ma, Xiaoding; Han, Bing; Xue, Dayuan; Han, Longzhi
2016-10-27
Crop genetic resources are important components of biodiversity. However, with the large-scale promotion of mono-cropping, genetic diversity has largely been lost. Ex-situ conservation approaches have been widely used to protect traditional crop varieties worldwide. However, this method fails to maintain the dynamic evolutionary processes of crop genetic resources in their original habitats, leading to genetic diversity reduction and even loss of the capacity to resist new diseases and pests. Therefore, on-farm conservation has been considered a crucial complement to ex-situ conservation. This study aimed to clarify the genetic diversity differences between ex-situ conservation and on-farm conservation and to explore the influence of traditional cultures on the genetic diversity of rice landraces under on-farm conservation. The conservation status of rice landrace varieties, including Indica and Japonica, non-glutinous rice (Oryza sativa) and glutinous rice (Oryza sativa var. glutinosa Matsum), was obtained through an ethno-biological investigation in 12 villages of ethnic groups from Guizhou, Yunnan and Guangxi provinces of China. The genetic diversity between 24 pairs of the same rice landraces from different times was compared using simple sequence repeat (SSR) molecular marker technology. The landrace pairs studied were collected in 1980 and maintained ex-situ, while the 2014 samples were collected on-farm in southwest China. The results showed that many varieties of rice landraces have been preserved on-farm by local farmers for hundreds or thousands of years. The number of alleles (Na), effective number of alleles (Ne), Nei genetic diversity index (He) and Shannon information index (I) of rice landraces were 12.3-30.4% higher under on-farm conservation than under ex-situ conservation. Compared with the ex-situ conservation approach, rice landraces under on-farm conservation programs had more alleles and higher genetic diversity.
At every site we investigated, ethnic traditional cultures have a positive influence on rice landrace variety diversity and genetic diversity. Most of China's rice landraces are conserved in the ethnic areas of southwest China. On-farm conservation has effectively promoted allelic variation and increased the genetic diversity of rice landraces over the past 35 years. Moreover, ethnic traditional culture practices are a crucial foundation for increasing the genetic diversity of rice landraces and implementing on-farm conservation.
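The diversity statistics named above (Nei's genetic diversity He and the Shannon information index I) are simple functions of allele frequencies at a locus. A minimal sketch with hypothetical allele counts (not the study's data) shows why an allele-rich, more evenly distributed on-farm sample scores higher on both:

```python
import math

def shannon_index(allele_counts):
    """Shannon information index I = -sum(p_i * ln p_i) over allele
    frequencies p_i at one locus."""
    total = sum(allele_counts)
    freqs = [c / total for c in allele_counts if c > 0]
    return -sum(p * math.log(p) for p in freqs)

def nei_diversity(allele_counts):
    """Nei's gene diversity He = 1 - sum(p_i^2)."""
    total = sum(allele_counts)
    return 1.0 - sum((c / total) ** 2 for c in allele_counts)

# Hypothetical locus: the on-farm sample keeps more, and more even, alleles.
ex_situ = [40, 8, 2]       # fewer alleles, one dominant
on_farm = [25, 15, 6, 4]   # more alleles, more even frequencies
```

Both indices are zero for a fixed locus (a single allele) and rise with the number and evenness of alleles, which is why they are natural summaries of the on-farm vs. ex-situ comparison.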
Dynamic Blowout Risk Analysis Using Loss Functions.
Abimbola, Majeed; Khan, Faisal
2018-02-01
Most risk analysis approaches are static and fail to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability models to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.
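The abstract's core idea, risk as the product of an evolving failure probability and a consequence expressed as a loss function of bottom-hole pressure, can be sketched as follows. The quadratic loss and all numbers here are hypothetical illustrations, not the authors' calibrated models:

```python
def quadratic_loss(pressure, target, k=1.0):
    """Hypothetical Taguchi-style loss: consequence grows with the squared
    deviation of bottom-hole pressure from the safe operating target."""
    return k * (pressure - target) ** 2

def operational_risk(failure_prob, pressure, target, k=1.0):
    """Risk = probability of the blowout scenario x consequence (loss)."""
    return failure_prob * quadratic_loss(pressure, target, k)

# As drilling progresses, pressure drifts from target and probability
# rises, so the real-time risk estimate increases.
target = 5000.0  # psi, illustrative
timeline = [(0.001, 5000.0), (0.002, 5150.0), (0.004, 5400.0)]
risks = [operational_risk(p, bhp, target) for p, bhp in timeline]
```

Recomputing this product at each time step is what turns a static risk number into the dynamic, real-time analysis the abstract describes.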
Ruohonen, Toni; Ennejmy, Mohammed
2013-01-01
Making reliable and justified operational and strategic decisions is a challenging task in the health care domain. So far, decisions have been made based on the experience of managers and staff, or evaluated with traditional methods using inadequate data. As a result of this kind of decision-making process, attempts to improve operations have usually failed or led to only local improvements. Health care organizations have a lot of operational data, in addition to clinical data, which is the key element for making reliable and justified decisions. However, it is increasingly difficult to access and make use of this data. In this paper we discuss how operational data can be exploited most efficiently in the decision-making process. We share our vision of the future and propose a conceptual framework for automating the decision-making process.
Walking and talking the tree of life: Why and how to teach about biodiversity
Ballen, Cissy J.; Greene, Harry W.
2017-01-01
Taxonomic details of diversity are an essential scaffolding for biology education, yet outdated methods for teaching the tree of life (TOL), as implied by textbook content and usage, are still commonly employed. Here, we show that the traditional approach only vaguely represents evolutionary relationships, fails to denote major events in the history of life, and relies heavily on memorizing near-meaningless taxonomic ranks. Conversely, a clade-based strategy—focused on common ancestry, monophyletic groups, and derived functional traits—is explicitly based on Darwin’s “descent with modification,” provides students with a rational system for organizing the details of biodiversity, and readily lends itself to active learning techniques. We advocate for a phylogenetic classification that mirrors the TOL, a pedagogical format of increasingly complex but always hierarchical presentations, and the adoption of active learning technologies and tactics. PMID:28319149
NASA Astrophysics Data System (ADS)
Afeyan, Bedros; Larson, David; Shadwick, Bradley; Sydora, Richard
2017-10-01
We compare various ways of solving the Vlasov-Poisson and Vlasov-Maxwell equations on rather demanding nonlinear kinetic phenomena associated with KEEN and KEEPN waves. KEEN stands for Kinetic, Electrostatic, Electron Nonlinear, and KEEPN for their electron-positron (pair) plasma analogs. Because these self-organized phase space structures are not steady-state, single mode, or fluid or low-order moment equation limited, typical techniques with low resolution or too much noise will distort the answer too much, too soon, and fail. This will be shown via Penrose criteria triggers for instability at the formation stage as well as particle orbit statistics in fully formed KEEN waves and KEEN-KEEN and KEEN-EPW interacting states. We will argue that PASTEL is a viable alternative to traditional methods with reasonable chances of success in higher dimensions. Work supported by a Grant from AFOSR PEEP.
Accurate multiple sequence-structure alignment of RNA sequences using combinatorial optimization.
Bauer, Markus; Klau, Gunnar W; Reinert, Knut
2007-07-27
The discovery of functional non-coding RNA sequences has led to an increasing interest in algorithms related to RNA analysis. Traditional sequence alignment algorithms, however, fail at computing reliable alignments of low-homology RNA sequences. The spatial conformation of RNA sequences largely determines their function, and therefore RNA alignment algorithms have to take structural information into account. We present a graph-based representation for sequence-structure alignments, which we model as an integer linear program (ILP). We sketch how we compute an optimal or near-optimal solution to the ILP using methods from combinatorial optimization, and present results on a recently published benchmark set for RNA alignments. The implementation of our algorithm yields better alignments in terms of two published scores than the other programs that we tested: This is especially the case with an increasing number of input sequences. Our program LARA is freely available for academic purposes from http://www.planet-lisa.net.
Teaching the nature of physics through art: a new art of teaching
NASA Astrophysics Data System (ADS)
Colletti, Leonardo
2018-01-01
Science and art are traditionally represented as two disciplines with completely divergent goals, methods, and publics. It has been claimed that, if rightly addressed, science and art education could mutually support each other. In this paper I propose the recurrent reference to certain famous paintings during the ordinary progress of physics courses in secondary schools, in order to convey, in a memorable way, some basic features of physics methodology. An understanding of the overall characteristics of science should be regarded as one of the crucial goals of physics education. As part of a general education, forgetting physics concepts may be acceptable, but failing to grasp the very nature of science is not. Images may help in conveying the nature of science, especially for humanities-oriented students. Moreover, famous paintings, with their familiarity and availability, are a valid tool for facilitating this.
Narrowing the scope of failure prediction using targeted fault load injection
NASA Astrophysics Data System (ADS)
Jordan, Paul L.; Peterson, Gilbert L.; Lin, Alan C.; Mendenhall, Michael J.; Sellers, Andrew J.
2018-05-01
As society becomes more dependent upon computer systems to perform increasingly critical tasks, ensuring that those systems do not fail becomes increasingly important. Many organizations depend heavily on desktop computers for day-to-day operations. Unfortunately, the software that runs on these computers is written by humans and, as such, is still subject to human error and consequent failure. A natural solution is to use statistical machine learning to predict failure. However, since failure is still a relatively rare event, obtaining labelled training data to train these models is not a trivial task. This work presents new simulated fault-inducing loads that extend the focus of traditional fault injection techniques to predict failure in the Microsoft enterprise authentication service and Apache web server. These new fault loads were successful in creating failure conditions that were identifiable using statistical learning methods, with fewer irrelevant faults being created.
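As a rough illustration of the statistical-learning step described above, one could label simulated fault loads and fit a simple classifier to separate failing from healthy conditions. The features, data, and model below are invented for the sketch; the study's actual fault loads and learning methods are not reproduced:

```python
import math
import random

random.seed(1)

def simulate_fault_load(faulty):
    """Hypothetical feature vector: (memory pressure, handle-leak rate).
    Fault-induced loads shift both features upward."""
    if faulty:
        return (random.gauss(0.7, 0.1), random.gauss(0.6, 0.1))
    return (random.gauss(0.3, 0.1), random.gauss(0.2, 0.1))

data = [(simulate_fault_load(y == 1), y) for y in [0, 1] * 200]

def sigmoid(z):
    # numerically stable logistic function
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

# Logistic regression trained by stochastic gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(100):
    for (x1, x2), y in data:
        g = sigmoid(w[0] * x1 + w[1] * x2 + b) - y  # gradient of log-loss
        w[0] -= lr * g * x1
        w[1] -= lr * g * x2
        b -= lr * g

acc = sum((sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5) == (y == 1)
          for (x1, x2), y in data) / len(data)
```

Because the simulated fault loads create clearly separable conditions, even this simple model identifies the failure class reliably, which mirrors the paper's finding that injected fault loads produce failure conditions identifiable by statistical learning.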
New directions in analyses of parenting contributions to children's acquisition of values.
Grusec, J E; Goodnow, J J; Kuczynski, L
2000-01-01
Traditional theories of how children acquire values or standards of behavior have emphasized the importance of specific parenting techniques or styles and have acknowledged the importance of a responsive parent-child relationship, but they have failed to differentiate among forms of responsiveness, have stressed internalization of values as the desired outcome, and have limited their scope to a small set of parenting strategies or methods. This paper outlines new directions for research. It acknowledges the central importance of parents and argues for research that (1) demonstrates that parental understanding of a particular child's characteristics and situation rather than use of specific strategies or styles is the mark of effective parenting; (2) traces the differential impact of varieties of parent responsiveness; (3) assesses the conditions surrounding the fact that parents have goals other than internalization when socializing their children, and evaluates the impact of that fact; and (4) considers a wider range of parenting strategies.
The bingo model of survivorship: 1. probabilistic aspects.
Murphy, E A; Trojak, J E; Hou, W; Rohde, C A
1981-01-01
A "bingo" model is one in which the pattern of survival of a system is determined by whichever of several components, each with its own particular distribution for survival, fails first. The model is motivated by the study of lifespan in animals. A number of properties of such systems are discussed in general. They include the use of a special criterion of skewness that probably corresponds more closely than traditional measures to what the eye observes in casually inspecting data. This criterion is the ratio, r(h), of the probability density at a point an arbitrary distance, h, above the mode to that an equal distance below the mode. If this ratio exceeds one for all positive arguments, the distribution is considered positively asymmetrical, and conversely. Details of the bingo model are worked out for several types of base distributions: the rectangular, the triangular, the logistic, and by numerical methods, the normal, lognormal, and gamma.
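The two ideas in this abstract, the first-failure ("bingo") lifetime and the skewness ratio r(h), can be sketched directly. The component distributions below are hypothetical; the gamma density with shape 2 is used because its r(h) exceeds one, marking positive asymmetry:

```python
import math
import random

def bingo_lifetime(component_samplers):
    """System survives until whichever component fails first."""
    return min(sampler() for sampler in component_samplers)

# Hypothetical components, each with its own survival distribution.
random.seed(0)
components = [lambda: random.expovariate(0.2),      # e.g., random shocks
              lambda: abs(random.gauss(8.0, 2.0)),  # e.g., wear-out
              lambda: random.uniform(2.0, 20.0)]    # e.g., defect-driven
lifetimes = [bingo_lifetime(components) for _ in range(5000)]

# Skewness criterion r(h) = f(mode + h) / f(mode - h) for a known density,
# here the gamma(shape=2) density f(x) = x * exp(-x), whose mode is 1.
def r(h, f, mode):
    return f(mode + h) / f(mode - h)

gamma2 = lambda x: x * math.exp(-x)
ratio = r(0.5, gamma2, 1.0)  # > 1: positively asymmetrical
```

For this gamma density, r(h) = ((1+h)/(1-h)) * exp(-2h), which is greater than one for 0 < h < 1, consistent with the right-skewed shape the eye sees.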
"Heroes" and "villains" of world history across cultures.
Hanke, Katja; Liu, James H; Sibley, Chris G; Paez, Dario; Gaines, Stanley O; Moloney, Gail; Leong, Chan-Hoong; Wagner, Wolfgang; Licata, Laurent; Klein, Olivier; Garber, Ilya; Böhm, Gisela; Hilton, Denis J; Valchev, Velichko; Khan, Sammyh S; Cabecinhas, Rosa
2015-01-01
Emergent properties of global political culture were examined using data from the World History Survey (WHS) involving 6,902 university students in 37 countries evaluating 40 figures from world history. Multidimensional scaling and factor analysis techniques found only limited forms of universality in evaluations across Western, Catholic/Orthodox, Muslim, and Asian country clusters. The highest consensus across cultures involved scientific innovators, with Einstein having the most positive evaluation overall. Peaceful humanitarians like Mother Theresa and Gandhi followed. There was much less cross-cultural consistency in the evaluation of negative figures, led by Hitler, Osama bin Laden, and Saddam Hussein. After more traditional empirical methods (e.g., factor analysis) failed to identify meaningful cross-cultural patterns, Latent Profile Analysis (LPA) was used to identify four global representational profiles: Secular and Religious Idealists were overwhelmingly prevalent in Christian countries, and Political Realists were common in Muslim and Asian countries. We discuss possible consequences and interpretations of these different representational profiles.
PATENTS IN GENOMICS AND HUMAN GENETICS
Cook-Deegan, Robert; Heaney, Christopher
2010-01-01
Genomics and human genetics are scientifically fundamental and commercially valuable. These fields grew to prominence in an era of growth in government and nonprofit research funding, and of even greater growth of privately funded research and development in biotechnology and pharmaceuticals. Patents on DNA technologies are a central feature of this story, illustrating how patent law adapts---and sometimes fails to adapt---to emerging genomic technologies. In instrumentation and for therapeutic proteins, patents have largely played their traditional role of inducing investment in engineering and product development, including expensive postdiscovery clinical research to prove safety and efficacy. Patents on methods and DNA sequences relevant to clinical genetic testing show less evidence of benefits and more evidence of problems and impediments, largely attributable to university exclusive licensing practices. Whole-genome sequencing will confront uncertainty about infringing granted patents but jurisprudence trends away from upholding the broadest and potentially most troublesome patent claims. PMID:20590431
An efficient smolt trap for sandy and debris-laden streams
Scace, J.G.; Letcher, B.H.; Noreika, J.
2007-01-01
Tripod weir and box traps are traditionally used to capture and enumerate out-migrating salmonid smolts in short-term studies and in streams where temporary or portable traps are the only practical option. Although traditional traps can be effective when conditions are ideal, they are often unable to withstand high-discharge events in streams containing a large amount of debris or sandy substrates. We created a rotary-screw trap and resistance board weir hybrid design that we evaluated along with a tripod weir and box trap, both in a 6.1-m-wide flume and in the field. The new design outperformed the tripod weir in both situations. The tripod weir failed in 10 min in the flume trial, whereas the new design was still operating at the conclusion of an 8-h trial under the same conditions. The new design operated continuously in the field during a high-discharge event that caused the tripod weir to fail. The new design also required less frequent cleaning than the tripod weir. The trap efficiency of the new design was estimated by using passive integrated transponder (PIT) tag antennas and radiotelemetry. The trap was 80% efficient (n = 40) in capturing migrating PIT-tagged individuals detected at an antenna upstream of the trap and 87.5% efficient (n = 48) at recapturing fish that had been tagged and released upstream. With its high efficiency and increased resiliency over the tripod weir, the new trap design will benefit management and research efforts in streams where traditional traps are unsuitable. © Copyright by the American Fisheries Society 2007.
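The reported efficiencies are binomial proportions, so approximate uncertainty bounds follow directly. In this sketch the success counts are inferred from the reported percentages and sample sizes (32 of 40 and 42 of 48), and the Wald interval is one common, if rough, choice:

```python
import math

def efficiency_ci(successes, n, z=1.96):
    """Point estimate and Wald 95% confidence interval for trap
    efficiency, clipped to [0, 1]."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# 80% efficient (n = 40) at the upstream antenna check;
# 87.5% efficient (n = 48) at recapturing tagged-and-released fish.
eff_antenna = efficiency_ci(32, 40)
eff_release = efficiency_ci(42, 48)
```

At these sample sizes the intervals are fairly wide (roughly ±0.1 around each estimate), which is worth keeping in mind when comparing trap designs.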
Coping Styles of Failing Brunei Vocational Students
ERIC Educational Resources Information Center
Mundia, Lawrence; Salleh, Sallimah
2017-01-01
Purpose: The purpose of this paper is to determine the prevalence of two types of underachieving students (n = 246) (active failing (AF) and passive failing (PF)) in Brunei vocational and technical education (VTE) institutions and their patterns of coping. Design/methodology/approach: The field survey method was used to directly reach many…
Why good projects fail anyway.
Matta, Nadim F; Ashkenas, Ronald N
2003-09-01
Big projects fail at an astonishing rate--more than half the time, by some estimates. It's not hard to understand why. Complicated long-term projects are customarily developed by a series of teams working along parallel tracks. If managers fail to anticipate everything that might fall through the cracks, those tracks will not converge successfully at the end to reach the goal. Take a companywide CRM project. Traditionally, one team might analyze customers, another select the software, a third develop training programs, and so forth. When the project's finally complete, though, it may turn out that the salespeople won't enter in the requisite data because they don't understand why they need to. This very problem has, in fact, derailed many CRM programs at major organizations. There is a way to uncover unanticipated problems while the project is still in development. The key is to inject into the overall plan a series of miniprojects, or "rapid-results initiatives," which each have as their goal a miniature version of the overall goal. In the CRM project, a single team might be charged with increasing the revenues of one sales group in one region by 25% within four months. To reach that goal, team members would have to draw on the work of all the parallel teams. But in just four months, they would discover the salespeople's resistance and probably other unforeseen issues, such as, perhaps, the need to divvy up commissions for joint-selling efforts. The World Bank has used rapid-results initiatives to great effect to keep a sweeping 16-year project on track and deliver visible results years ahead of schedule. In taking an in-depth look at this project, and others, the authors show why this approach is so effective and how the initiatives are managed in conjunction with more traditional project activities.
Mining Social Media and Web Searches For Disease Detection
Yang, Y. Tony; Horneffer, Michael; DiLisio, Nicole
2013-01-01
Web-based social media is increasingly being used across different settings in the health care industry. The increased frequency in the use of the Internet via computer or mobile devices provides an opportunity for social media to be the medium through which people can be provided with valuable health information quickly and directly. While traditional methods of detection relied predominately on hierarchical or bureaucratic lines of communication, these often failed to yield timely and accurate epidemiological intelligence. New web-based platforms promise increased opportunities for more timely and accurate spreading of information and analysis. This article aims to provide an overview and discussion of these new sources of timely and accurate information, which is especially useful for the rapid identification of an infectious disease outbreak, a prerequisite for prompt and effective public health responses. These web-based platforms include search queries, data mining of web and social media, processing and analysis of blogs containing epidemic key words, text mining, and geographical information system data analyses. These new sources of analysis and information are intended to complement traditional sources of epidemic intelligence. Despite the attractiveness of these new approaches, further study is needed to determine the accuracy of blogger statements, as increases in public participation may not necessarily mean the information provided is more accurate. PMID:25170475
Palta, Jatinder R; Liu, Chihray; Li, Jonathan G
2008-01-01
The traditional prescriptive quality assurance (QA) programs that attempt to ensure the safety and reliability of traditional external beam radiation therapy are limited in their applicability to such advanced radiation therapy techniques as three-dimensional conformal radiation therapy, intensity-modulated radiation therapy, inverse treatment planning, stereotactic radiosurgery/radiotherapy, and image-guided radiation therapy. The conventional QA paradigm, illustrated by the American Association of Physicists in Medicine Radiation Therapy Committee Task Group 40 (TG-40) report, consists of developing a consensus menu of tests and device performance specifications from a generic process model that is assumed to apply to all clinical applications of the device. The complexity, variation in practice patterns, and level of automation of high-technology radiotherapy render this "one-size-fits-all" prescriptive QA paradigm ineffective or cost prohibitive if the high-probability error pathways of all possible clinical applications of the device are to be covered. The current approaches to developing comprehensive prescriptive QA protocols can be prohibitively time consuming and cost ineffective and may sometimes fail to adequately safeguard patients. It therefore is important to evaluate more formal error mitigation and process analysis methods of industrial engineering to more optimally focus available QA resources on process components that have a significant likelihood of compromising patient safety or treatment outcomes.
A Glimpse in the Third Dimension for Electrical Resistivity Profiles
NASA Astrophysics Data System (ADS)
Robbins, A. R.; Plattner, A.
2017-12-01
We present an electrode layout strategy designed to enhance the popular two-dimensional electrical resistivity profile. Offsetting electrodes from the traditional linear layout and using 3-D inversion software allows for mapping the three-dimensional electrical resistivity close to the profile plane. We established a series of synthetic tests using simulated data generated from chosen resistivity distributions with a three-dimensional target feature. All inversions and simulations were conducted using freely-available ERT software, BERT and E4D. Synthetic results demonstrate the effectiveness of the offset electrode approach, whereas the linear layout failed to resolve the three-dimensional character of our subsurface feature. A field survey using trench backfill as a known resistivity contrast confirmed our synthetic tests. As we show, 3-D inversions of linear layouts for starting models without previously known structure are futile ventures because they generate symmetric resistivity solutions with respect to the profile plane. This is a consequence of the layout's inherent symmetrical sensitivity patterns. An offset electrode layout is not subject to the same limitation, as the collective measurements do not share a common sensitivity symmetry. For practitioners, this approach presents a low-cost improvement of a traditional geophysical method which is simple to use yet may provide critical information about the three dimensional structure of the subsurface close to the profile.
Wang, Jingang; Gao, Can; Yang, Jie
2014-01-01
Currently available traditional electromagnetic voltage sensors fail to meet the measurement requirements of the smart grid, because of low accuracy in the static and dynamic ranges and the occurrence of ferromagnetic resonance attributed to overvoltage and output short circuit. This work develops a new non-contact high-bandwidth voltage measurement system for power equipment, aimed at miniaturized, non-contact measurement for the smart grid. After traditional D-dot voltage probe analysis, an improved method is proposed. For the sensor to work in a self-integrating pattern, the differential input pattern is adopted for circuit design, and grounding is removed. To verify the structural design, circuit component parameters, and insulation characteristics, Ansoft Maxwell software was used for simulation. Moreover, the new probe was tested on a 10 kV high-voltage test platform for steady-state error and transient behavior. Experimental results confirm that the root mean square values of measured voltage are precise and that the phase error is small. The D-dot voltage sensor not only meets the requirement of high accuracy but also exhibits satisfactory transient response. This sensor can meet the intelligence, miniaturization, and convenience requirements of the smart grid. PMID:25036333
Alastruey, Jordi; Hunt, Anthony A E; Weinberg, Peter D
2014-01-01
We present a novel analysis of arterial pulse wave propagation that combines traditional wave intensity analysis with identification of Windkessel pressures to account for the effect on the pressure waveform of peripheral wave reflections. Using haemodynamic data measured in vivo in the rabbit or generated numerically in models of human compliant vessels, we show that traditional wave intensity analysis identifies the timing, direction and magnitude of the predominant waves that shape aortic pressure and flow waveforms in systole, but fails to identify the effect of peripheral reflections. These reflections persist for several cardiac cycles and make up most of the pressure waveform, especially in diastole and early systole. Ignoring peripheral reflections leads to an erroneous indication of a reflection-free period in early systole and additional error in the estimates of (i) pulse wave velocity at the ascending aorta given by the PU–loop method (9.5% error) and (ii) transit time to a dominant reflection site calculated from the wave intensity profile (27% error). These errors decreased to 1.3% and 10%, respectively, when accounting for peripheral reflections. Using our new analysis, we investigate the effect of vessel compliance and peripheral resistance on wave intensity, peripheral reflections and reflections originating in previous cardiac cycles. PMID:24132888
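The traditional wave intensity analysis discussed in this record rests on the linear (water-hammer) separation of pressure and velocity increments into forward and backward components. A minimal sketch follows, assuming a constant density rho and wave speed c; the values and the function name are illustrative, not taken from the paper's rabbit data:

```python
# Sketch of linear wave separation used in traditional wave intensity
# analysis. Forward/backward components: dP+- = (dP +- rho*c*dU) / 2;
# wave intensity dI = dP * dU (positive => forward-dominant wave).
# Illustrative, hedged: constant wave speed, no peripheral-reflection term.

def separate_waves(P, U, rho=1050.0, c=5.0):
    """Split pressure/velocity increments into forward/backward parts."""
    dP = [b - a for a, b in zip(P, P[1:])]
    dU = [b - a for a, b in zip(U, U[1:])]
    dP_f = [(dp + rho * c * du) / 2 for dp, du in zip(dP, dU)]
    dP_b = [(dp - rho * c * du) / 2 for dp, du in zip(dP, dU)]
    dI = [dp * du for dp, du in zip(dP, dU)]
    return dP_f, dP_b, dI
```

For a purely forward-travelling wave (dP = rho*c*dU), the backward component vanishes and the wave intensity is positive, which is the signature the analysis uses to identify predominant waves in systole.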
NASA Astrophysics Data System (ADS)
Burton, Jason C.; Wang, Shang; Behringer, Richard R.; Larina, Irina V.
2016-03-01
Infertility is a known major health concern and is estimated to impact ~15% of couples in the U.S. The majority of failed pregnancies occur before or during implantation of the fertilized embryo into the uterus. Studying the mechanisms that regulate development in mouse reproductive organs could significantly improve understanding of normal reproductive organ development and of the developmental causes of infertility in humans. Towards this goal, we report a three-dimensional (3D) imaging study of the developing mouse reproductive organs (ovary, oviduct, and uterus) using optical coherence tomography (OCT). In our study, OCT was used for 3D imaging of reproductive organs without exogenous contrast agents and provides micro-scale spatial resolution. Experiments were conducted in vitro on mouse reproductive organs ranging from the embryonic day 14.5 to adult stages. Structural features of the ovary, oviduct, and uterus are presented. Additionally, a comparison with traditional histological analysis is illustrated. These results provide a basis for a wide range of infertility studies in mouse models. Through integration with traditional genetic and molecular biology approaches, this imaging method can improve understanding of ovary, oviduct, and uterus development and function, serving to further contribute to our understanding of fertility and infertility.
Effectiveness of using blended learning strategies for teaching and learning human anatomy.
Pereira, José A; Pleguezuelos, Eulogio; Merí, Alex; Molina-Ros, Antoni; Molina-Tomás, M Carmen; Masdeu, Carlos
2007-02-01
This study aimed to implement innovative teaching methods--blended learning strategies--that include the use of new information technologies in the teaching of human anatomy and to analyse both the impact of these strategies on academic performance, and the degree of user satisfaction. The study was carried out among students in Year 1 of the biology degree curriculum (human biology profile) at Pompeu Fabra University, Barcelona. Two groups of students were tested on knowledge of the anatomy of the locomotor system and results compared between groups. Blended learning strategies were employed in 1 group (BL group, n = 69); the other (TT group; n = 65) received traditional teaching aided by complementary material that could be accessed on the Internet. Both groups were evaluated using the same types of examination. The average marks presented statistically significant differences (BL 6.3 versus TT 5.0; P < 0.0001). The percentage pass rate for the subject in the first call was higher in the BL group (87.9% versus 71.4%; P = 0.02), reflecting a lower incidence of students who failed to sit the examination (BL 4.3% versus TT 13.8%; P = 0.05). There were no differences regarding overall satisfaction with the teaching received. Blended learning was more effective than traditional teaching for teaching human anatomy.
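The reported pass-rate comparison (BL 87.9% of n = 69 vs TT 71.4% of n = 65, P = 0.02) can be approximately reproduced with a standard pooled two-proportion z-test. The counts below are rounded back from the reported percentages, so this is a hedged reconstruction rather than the authors' exact computation:

```python
# Pooled two-proportion z-test; counts reconstructed (rounded) from the
# reported pass rates, so the p-value is approximate, not authoritative.
from math import sqrt, erf

def two_prop_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # pooled standard error
    z = (p1 - p2) / se
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_two_sided

z, p = two_prop_z(round(0.879 * 69), 69, round(0.714 * 65), 65)
```

With these rounded counts the two-sided p-value comes out near the reported 0.02, consistent with the significant advantage for the blended-learning group.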
Rosso, Edoardo; McGrath, Richard
2016-02-29
Issue addressed: Recently arrived migrants and refugees from a culturally and linguistically diverse background (CALD) may be particularly vulnerable to social exclusion. Participation in sport is endorsed as a vehicle to ease the resettlement process; however, in Australia, this is often thought of as a simple matter of integration into existing sport structures (e.g. clubs). This approach fails to place actual community needs at the centre of sport engagement efforts. Methods: A consultation framework was established with South Australian CALD community leaders and organisations to scope needs for community-based alternatives to participation in traditional sport (e.g. clubs), co-design a suitable community sport program and pilot it in five communities. Interviews and questionnaire surveys were conducted with participants, community representatives, stakeholders and volunteers. Results: Regular, free soccer activities engaged 263 young people from a great variety of nationalities, including over 50% refugees, in secondary state school and community-based sites. Conclusion: Alternative community sport programs can provide a basic but valuable forum to promote physical activity and associated wellbeing in CALD and refugee communities. So what?: Alternative approaches can extend the health benefits of sport participation to disadvantaged children and youth who are excluded from traditional sport participation opportunities.
Bayesian-network-based safety risk assessment for steel construction projects.
Leu, Sou-Sen; Chang, Ching-Miao
2013-05-01
There are four primary accident types at steel building construction (SC) projects: falls (tumbles), object falls, object collapse, and electrocution. Several systematic safety risk assessment approaches, such as fault tree analysis (FTA) and failure mode and effect criticality analysis (FMECA), have been used to evaluate safety risks at SC projects. However, these traditional methods ineffectively address dependencies among safety factors at various levels and thus fail to provide early warnings that could prevent occupational accidents. To overcome the limitations of traditional approaches, this study addresses the development of a safety risk-assessment model for SC projects by establishing Bayesian networks (BN) based on fault tree (FT) transformation. The BN-based safety risk-assessment model was validated against the safety inspection records of six SC building projects and nine projects in which site accidents occurred. The ranks of posterior probabilities from the BN model were highly consistent with the accidents that occurred at each project site. The model supports site safety management by calculating the probabilities of safety risks and further analyzing the causes of accidents based on their relationships in the BNs. In practice, based on the analysis of accident risks and significant safety factors, proper preventive safety management strategies can be established to reduce the occurrence of accidents on SC sites. Copyright © 2013 Elsevier Ltd. All rights reserved.
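The fault-tree-to-BN transformation can be illustrated on a toy scale: an OR gate over basic events becomes a deterministic node, and observing an accident lets us compute posteriors over the basic causes by enumeration. The event names and prior probabilities below are invented for illustration; they are not from the paper's inspection data:

```python
# Hedged toy example: a fault-tree OR gate recast as a Bayesian network.
# An "accident" occurs if any basic event occurs; given an observed
# accident, we rank basic events by their posterior probability.
from itertools import product

priors = {"unguarded_edge": 0.05, "missing_harness": 0.10}  # hypothetical

def posterior(target):
    """P(target event occurred | accident observed), by enumeration."""
    num = den = 0.0
    for vals in product([True, False], repeat=len(priors)):
        assign = dict(zip(priors, vals))
        p = 1.0
        for event, occurred in assign.items():
            p *= priors[event] if occurred else 1 - priors[event]
        if any(assign.values()):          # OR gate fires => accident
            den += p
            if assign[target]:
                num += p
    return num / den
```

Ranking these posteriors is the kind of cause analysis the abstract describes: here the higher-prior event dominates the posterior, so inspection effort would be directed there first.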
Professionally responsible malpractice reform.
Brody, Howard; Hermer, Laura D
2011-07-01
Medical malpractice reform is both necessary and desirable, yet certain types of reform are clearly preferable to others. We argue that "traditional" tort reform remedies such as stringent damage caps not only fail to address the root causes of negligence and the adverse effects that fear of suit can have on physicians, but also fail to address the needs of patients. Physicians ought to view themselves as professionals who are dedicated to putting patients' interests ahead of their own. Professionally responsible malpractice reform should therefore be at least as patient-centered as it is physician-centered. Examples of more professionally responsible malpractice reform exist where institutions take a pro-active approach to identification, investigation, and remediation of possible malpractice. Such programs should be implemented more generally, and state laws enacted to facilitate them.
Vercellotti, Tomaso; Stacchi, Claudio; Russo, Crescenzo; Rebaudi, Alberto; Vincenzi, Giampaolo; Pratella, Umberto; Baldi, Domenico; Mozzati, Marco; Monagheddu, Chiara; Sentineri, Rosario; Cuneo, Tommaso; Di Alberti, Luca; Carossa, Stefano; Schierano, Gianmario
2014-01-01
This multicenter case series introduces an innovative ultrasonic implant site preparation (UISP) technique as an alternative to the use of traditional rotary instruments. A total of 3,579 implants were inserted in 1,885 subjects, and the sites were prepared using a specific ultrasonic device with a 1- to 3-year follow-up. No surgical complications related to the UISP protocol were reported for any of the implant sites. Seventy-eight implants (59 maxillary, 19 mandibular) failed within 5 months of insertion, for an overall osseointegration percentage of 97.82% (97.14% maxilla, 98.75% mandible). Three maxillary implants failed after 3 years of loading, with an overall implant survival rate of 97.74% (96.99% maxilla, 98.75% mandible).
Zhou, Ronggang; Chan, Alan H S
2017-01-01
In order to compare existing usability data to ideal goals or to that for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. This paper presents a universal method of usability evaluation by combining the analytic hierarchical process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation technique model to characterize fuzzy human judgments. Then with the use of AHP, the weights of usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process.
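One layer of the fuzzy comprehensive evaluation can be sketched as a weighted combination of membership vectors: AHP-derived weights for the usability components multiply each component's membership degrees over the appraisal grades. The weights, grades, and membership values below are invented for illustration, not from the paper:

```python
# Hedged sketch of one layer of fuzzy comprehensive evaluation:
# B = W . R with the weighted-average operator. All numbers are assumed.

def fuzzy_eval(weights, R):
    """Combine component membership vectors R using weights W."""
    grades = len(R[0])
    return [sum(w * row[g] for w, row in zip(weights, R))
            for g in range(grades)]

# usability components: effectiveness, efficiency, satisfaction
W = [0.5, 0.3, 0.2]                  # AHP-style weights (hypothetical)
R = [[0.6, 0.3, 0.1],                # membership degrees per component
     [0.4, 0.4, 0.2],                # over grades (good, fair, poor)
     [0.2, 0.5, 0.3]]
B = fuzzy_eval(W, R)                 # overall fuzzy appraisal vector
```

Because each membership row sums to one and the weights sum to one, the resulting appraisal vector B is itself a distribution over the grades, which is what allows comparison against an ideal goal or another product.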
Efficient Agent-Based Cluster Ensembles
NASA Technical Reports Server (NTRS)
Agogino, Adrian; Tumer, Kagan
2006-01-01
Numerous domains ranging from distributed data acquisition to knowledge reuse need to solve the cluster ensemble problem of combining multiple clusterings into a single unified clustering. Unfortunately current non-agent-based cluster combining methods do not work in a distributed environment, are not robust to corrupted clusterings and require centralized access to all original clusterings. Overcoming these issues will allow cluster ensembles to be used in fundamentally distributed and failure-prone domains such as data acquisition from satellite constellations, in addition to domains demanding confidentiality such as combining clusterings of user profiles. This paper proposes an efficient, distributed, agent-based clustering ensemble method that addresses these issues. In this approach each agent is assigned a small subset of the data and votes on which final cluster its data points should belong to. The final clustering is then evaluated by a global utility, computed in a distributed way. This clustering is also evaluated using an agent-specific utility that is shown to be easier for the agents to maximize. Results show that agents using the agent-specific utility can achieve better performance than traditional non-agent based methods and are effective even when up to 50% of the agents fail.
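The voting step described above can be sketched concretely: each agent casts votes mapping its assigned data points to final clusters, and the ensemble assigns each point to the cluster with the most votes. This is a simplified illustration of the mechanism; the paper's agents choose votes by maximizing utilities, which is not modeled here:

```python
# Hedged sketch of the ensemble voting step: combine per-agent votes
# (point -> cluster) into a final clustering by majority vote.
from collections import Counter

def combine_votes(votes):
    """votes: iterable of (point_id, cluster_id) pairs from all agents."""
    ballots = {}
    for point, cluster in votes:
        ballots.setdefault(point, []).append(cluster)
    # majority vote per point (ties broken by first-seen cluster)
    return {p: Counter(c).most_common(1)[0][0] for p, c in ballots.items()}
```

Because each agent only sees its own small subset and votes independently, the scheme degrades gracefully when some agents fail or return corrupted clusterings, which is the robustness property the abstract emphasizes.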
Egocentric Mapping of Body Surface Constraints.
Molla, Eray; Debarba, Henrique Galvan; Boulic, Ronan
2018-07-01
The relative location of human body parts often materializes the semantics of on-going actions, intentions and even emotions expressed, or performed, by a human being. However, traditional methods of performance animation fail to correctly and automatically map the semantics of performer postures involving self-body contacts onto characters with different sizes and proportions. Our method proposes an egocentric normalization of the body-part relative distances to preserve the consistency of self contacts for a large variety of human-like target characters. Egocentric coordinates are character independent and encode the whole posture space, i.e., it ensures the continuity of the motion with and without self-contacts. We can transfer classes of complex postures involving multiple interacting limb segments by preserving their spatial order without depending on temporal coherence. The mapping process exploits a low-cost constraint relaxation technique relying on analytic inverse kinematics; thus, we can achieve online performance animation. We demonstrate our approach on a variety of characters and compare it with the state of the art in online retargeting with a user study. Overall, our method performs better than the state of the art, especially when the proportions of the animated character deviate from those of the performer.
An optimization framework for measuring spatial access over healthcare networks.
Li, Zihao; Serban, Nicoleta; Swann, Julie L
2015-07-17
Measurement of healthcare spatial access over a network involves accounting for demand, supply, and network structure. Popular approaches are based on floating catchment areas; however the methods can overestimate demand over the network and fail to capture cascading effects across the system. Optimization is presented as a framework to measure spatial access. Questions related to when and why optimization should be used are addressed. The accuracy of the optimization models compared to the two-step floating catchment area method and its variations is analytically demonstrated, and a case study of specialty care for Cystic Fibrosis over the continental United States is used to compare these approaches. The optimization models capture a patient's experience rather than their opportunities and avoid overestimating patient demand. They can also capture system effects due to change based on congestion. Furthermore, the optimization models provide more elements of access than traditional catchment methods. Optimization models can incorporate user choice and other variations, and they can be useful towards targeting interventions to improve access. They can be easily adapted to measure access for different types of patients, over different provider types, or with capacity constraints in the network. Moreover, optimization models allow differences in access in rural and urban areas.
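The two-step floating catchment area (2SFCA) method that the optimization models are benchmarked against is simple enough to sketch: step one computes each provider's supply-to-demand ratio within its catchment, step two sums those ratios over the providers reachable from each demand location. Locations, capacities, distances, and the threshold below are invented for illustration:

```python
# Hedged sketch of the two-step floating catchment area (2SFCA) method.
# "Reachable" means within the catchment threshold; all data are assumed.

def two_step_fca(demand, supply, dist, threshold):
    # Step 1: provider-to-population ratio within each provider's catchment
    ratio = {}
    for j, capacity in supply.items():
        pop = sum(demand[i] for i in demand if dist[i][j] <= threshold)
        ratio[j] = capacity / pop if pop else 0.0
    # Step 2: each demand location sums the ratios of reachable providers
    return {i: sum(ratio[j] for j in supply if dist[i][j] <= threshold)
            for i in demand}
```

Note how step one counts every demand location inside a provider's catchment against its capacity, whether or not patients would actually travel there; this is the demand overestimation the abstract says optimization-based measures avoid.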
A simplified method for identification of human cardiac myosin heavy-chain isoforms.
Piao, Shengfu; Yu, Fushun; Mihm, Michael J; Reiser, Peter J; McCarthy, Patrick M; Van Wagoner, David R; Bauer, John Anthony
2003-02-01
Cardiac myosin is a central participant in the cross-bridge cycling that mediates myocyte contraction and consists of multiple subunits that mediate both hydrolysis of ATP and mechanical production of contractile force. Two isoforms of myosin heavy chain (MHC-alpha and MHC-beta) are known to exist in mammalian cardiac tissue, and it is within this myosin subunit that ATPase activity resides. These isoforms differ by less than 0.2% in total molecular mass and amino acid sequence, but, strikingly, influence the rate and efficiency of energy utilization for generation of contractile force. Changes in the MHC-alpha/MHC-beta ratio have been classically viewed as an adaptation of a failing myocyte in both animal models and humans; however, their measurement has traditionally required specialized preparations and materials for sufficient resolution. Here we describe a greatly simplified method for routine assessments of myosin isoform composition in human cardiac tissues. The primary advantages of our approach include higher throughput and reduced supply costs with no apparent loss of statistical power, reproducibility or achieved results. Use of this more convenient method may provide enhanced access to an otherwise specialized technique and could provide additional opportunity for investigation of cardiac myocyte adaptive changes.
Metric learning with spectral graph convolutions on brain connectivity networks.
Ktena, Sofia Ira; Parisot, Sarah; Ferrante, Enzo; Rajchl, Martin; Lee, Matthew; Glocker, Ben; Rueckert, Daniel
2018-04-01
Graph representations are often used to model structured data at an individual or population level and have numerous applications in pattern recognition problems. In the field of neuroscience, where such representations are commonly used to model structural or functional connectivity between a set of brain regions, graphs have proven to be of great importance. This is mainly due to the capability of revealing patterns related to brain development and disease, which were previously unknown. Evaluating similarity between these brain connectivity networks in a manner that accounts for the graph structure and is tailored for a particular application is, however, non-trivial. Most existing methods fail to accommodate the graph structure, discarding information that could be beneficial for further classification or regression analyses based on these similarities. We propose to learn a graph similarity metric using a siamese graph convolutional neural network (s-GCN) in a supervised setting. The proposed framework takes into consideration the graph structure for the evaluation of similarity between a pair of graphs, by employing spectral graph convolutions that allow the generalisation of traditional convolutions to irregular graphs and operates in the graph spectral domain. We apply the proposed model on two datasets: the challenging ABIDE database, which comprises functional MRI data of 403 patients with autism spectrum disorder (ASD) and 468 healthy controls aggregated from multiple acquisition sites, and a set of 2500 subjects from UK Biobank. We demonstrate the performance of the method for the tasks of classification between matching and non-matching graphs, as well as individual subject classification and manifold learning, showing that it leads to significantly improved results compared to traditional methods. Copyright © 2017 Elsevier Inc. All rights reserved.
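The spectral graph convolutions mentioned above generalize convolution to irregular graphs; a common building block is one propagation step with the symmetrically normalized adjacency, D^{-1/2}(A + I)D^{-1/2} X. The toy pure-Python sketch below shows that single step on dense lists; it is a hedged illustration of the operation, not the paper's s-GCN architecture:

```python
# Hedged toy sketch of one spectral-style graph propagation step:
# multiply node features X by the renormalized adjacency
# D^{-1/2} (A + I) D^{-1/2}, computed on small dense lists.

def gcn_step(A, X):
    n = len(A)
    # add self-loops so each node keeps part of its own features
    A_hat = [[A[i][j] + (1 if i == j else 0) for j in range(n)]
             for i in range(n)]
    d = [sum(row) for row in A_hat]              # degrees incl. self-loop
    return [[sum(A_hat[i][j] * X[j][k] / (d[i] ** 0.5 * d[j] ** 0.5)
                 for j in range(n))
             for k in range(len(X[0]))]
            for i in range(n)]
```

Each node's output is a degree-normalized average over its neighborhood, which is how structural information from the connectivity graph enters the learned similarity metric.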
Assessment of three dead detector correction methods for cone-beam computed tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelms, David W.; Shukla, Hemant I.; Nixon, Earl
Purpose: Dead detectors due to manufacturing defects or radiation damage in the electronic portal imaging devices (EPIDs) used for cone-beam computed tomography (CBCT) can lead to image degradation and ring artifacts. In this work three dead detector correction methods were assessed using megavoltage CBCT (MVCBCT) as a test system, with the goals of assessing the relative effectiveness of the three methods and establishing the conditions for which they fail. Methods: MVCBCT projections acquired with four linacs at 8 and 60 MU (monitor units) were degraded with varying percentages (2%-95%) of randomly distributed dead single detectors (RDSs), randomly distributed dead detector clusters (RDCs) of 2 mm diameter, and nonrandomly distributed dead detector disks (NRDDs) of varying diameter (4-16 mm). Correction algorithms were bidirectional linear interpolation (BLI), quad-directional linear interpolation (QLI), and a Laplacian solution (LS) method. Correction method failure was defined to occur if ring artifacts were present in the reconstructed phantom images from any linac or if the modulation transfer function (MTF) for any linac dropped below baseline with a p value, calculated with the two sample t test, of less than 0.01. Results: All correction methods failed at the same or lower RDC/RDS percentages and NRDD diameters for the 60 MU as for the 8 MU cases. The LS method tended to outperform or match the BLI and QLI methods. If ring artifacts anywhere in the images were considered unacceptable, the LS method failed for 60 MU at >33% RDS, >2% RDC, and >4 mm NRDD. If ring artifacts within 4 mm longitudinally of the phantom section interfaces were considered acceptable, the LS method failed for 60 MU at >90% RDS, >80% RDC, and >4 mm NRDD. LS failed due to MTF drop for 60 MU at >50% RDS, >25% RDC, and >4 mm NRDD. Conclusions: The LS method is superior to the BLI and QLI methods, and correction algorithm effectiveness decreases as imaging dose increases. All correction methods failed first due to ring artifacts and second due to MTF drop. If ring artifacts in axial slices within a 4 mm longitudinal distance from phantom section interfaces are acceptable, statistically significant loss in spatial resolution does not occur until over 25% of the EPID is covered in randomly distributed dead detectors, or NRDDs of 4 mm diameter are present.
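The bidirectional linear interpolation (BLI) correction named above can be sketched for a single dead pixel: interpolate from the nearest live neighbors along the row and along the column, then average the two estimates. This is a hedged toy implementation assuming interior dead pixels (no border handling), not the study's production algorithm:

```python
# Hedged toy sketch of bidirectional linear interpolation (BLI) for dead
# detectors: row and column estimates from nearest live neighbors,
# averaged. Assumes dead pixels are interior (no image-border handling).

def bli_correct(img, dead):
    """img: 2D list of pixel values; dead: set of (r, c) dead positions."""
    def nearest(r, c, dr, dc):
        # walk outward until a live detector is found; return value, distance
        k = 1
        while (r + k * dr, c + k * dc) in dead:
            k += 1
        return img[r + k * dr][c + k * dc], k

    out = [row[:] for row in img]
    for r, c in dead:
        (lv, lk), (rv, rk) = nearest(r, c, 0, -1), nearest(r, c, 0, 1)
        (uv, uk), (dv, dk) = nearest(r, c, -1, 0), nearest(r, c, 1, 0)
        row_est = (lv * rk + rv * lk) / (lk + rk)   # linear in distance
        col_est = (uv * dk + dv * uk) / (uk + dk)
        out[r][c] = (row_est + col_est) / 2
    return out
```

For large clusters the nearest live neighbors sit far away in both directions, so the interpolated values flatten out; this is one intuition for why interpolation-based corrections degrade with RDC percentage and NRDD diameter sooner than with scattered single dead detectors.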
Nutrition and culture in professional football. A mixed method approach.
Ono, Mutsumi; Kennedy, Eileen; Reeves, Sue; Cronin, Linda
2012-02-01
An adequate diet is essential for the optimal performance of professional football (soccer) players. Existing studies have shown that players fail to consume such a diet, without interrogating the reasons for this. The aim of this study was to explore the difficulties professional football players experience in consuming a diet for optimal performance. It utilized a mixed method approach, combining nutritional intake assessment with qualitative interviews, to ascertain both what was consumed and the wider cultural factors that affect consumption. The study found a high variability in individual intake which ranged widely from 2648 to 4606 kcal/day. In addition, the intake of carbohydrate was significantly lower than that recommended. The study revealed that the main food choices for carbohydrate and protein intake were pasta and chicken respectively. Interview results showed the importance of tradition within the world of professional football in structuring the players' approach to nutrition. In addition, the players' personal eating habits that derived from their class and national habitus restricted their food choice by conflicting with the dietary choices promoted within the professional football clubs. Copyright © 2011 Elsevier Ltd. All rights reserved.
Feuillie, Cécile; Merheb, Maxime M.; Gillet, Benjamin; Montagnac, Gilles; Daniel, Isabelle; Hänni, Catherine
2014-01-01
The analysis of ancient or processed DNA samples is often a great challenge, because traditional Polymerase Chain Reaction-based amplification is impeded by DNA damage. Blocking lesions such as abasic sites are known to block the bypass of DNA polymerases, thus stopping primer elongation. In the present work, we applied the SERRS-hybridization assay, a fully non-enzymatic method, to the detection of DNA refractory to PCR amplification. This method combines specific hybridization with detection by Surface Enhanced Resonant Raman Scattering (SERRS). It allows the detection of a series of double-stranded DNA molecules containing a varying number of abasic sites on both strands, when PCR failed to detect the most degraded sequences. Our SERRS approach can quickly detect DNA molecules without any need for DNA repair. This assay could be applied as a prerequisite analysis prior to enzymatic repair or amplification. A whole new set of samples, both forensic and archaeological, could then deliver information that was not yet available due to a high degree of DNA damage. PMID:25502338
Jaeger, Jason O.; Oakley, Paul A.; Moore, Robert R.; Ruggeroli, Edward P.; Harrison, Deed E.
2018-01-01
[Purpose] To present the case of the resolution of right temporomandibular joint dysfunction (TMJD) following the correction of a right lateral head translation posture. [Subject and Methods] A 24-year-old female reported facial pain and jaw clicking in the right TMJ. Radiography revealed a 19 mm right head (shift) translation posture. TMJ vibration analysis showed characteristic abnormalities for the right TMJ. The patient was treated with CBP® technique mirror image® left-sided exercises and traction methods, as well as spinal manipulative therapy (SMT). [Results] After 36 treatments over a 12-week period, a complete correction of the lateral head posture was achieved, corresponding with a complete resolution of jaw pain and clicking. TMJ vibration analysis demonstrated normal right-side TMJ characteristics following treatment. [Conclusion] Abnormal head/neck postures, such as lateral head translation, may be an unrealized source of TMJD and may be explained through the 'regional interdependence' model, or by how seemingly unrelated anatomy may be associated with a primary complaint. PMID:29410576
The Swiss cheese model of adverse event occurrence--Closing the holes.
Stein, James E; Heiss, Kurt
2015-12-01
Traditional surgical attitude regarding error and complications has focused on individual failings. Human factors research has brought new and significant insights into the occurrence of error in healthcare, helping us identify systemic problems that injure patients while enhancing individual accountability and teamwork. This article introduces human factors science and its applicability to teamwork, surgical culture, medical error, and individual accountability. Copyright © 2015 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Abel, Gillian; Fitzgerald, Lisa
2006-01-01
Traditionally, school-based sex education has provided information-based programmes, with the assumption that young people make rational decisions with regard to the use of condoms. However, these programmes fail to take into account contextual issues and developing subjectivities. This paper presents the talk of 42 young people from a New Zealand…
ERIC Educational Resources Information Center
Hynes, Michelle
2014-01-01
"Don't Call Them Dropouts" adds to the large and growing body of research about why some young people fail to complete high school on the traditional four-year timeline. While a high school diploma is only a starting line for adult success, it has become increasingly clear that it is crucial for taking the next steps in college and…
Survival through Adaptation: The Chinese Red Army and the Encirclement Campaigns, 1927-1936
2012-06-08
the Soviet Union and communism shifted after his suppression of a communist ... period. The movement encompassed not only reforms, but also a fundamental shift from traditional Chinese culture and norms. For more information on the
Jet Engines - The New Masters of Advanced Flight Control
NASA Astrophysics Data System (ADS)
Gal-Or, Benjamin
2018-05-01
ANTICIPATED UNITED STATES CONGRESS ACT should lead to reversing a neglected duty to the people by supporting an FAA-induced bill to civilize classified military air combat technology to maximize flight safety of airliners and cargo jet transports, in addition to FAA-certifying pilots to master Jet-Engine Steering ("JES") as automatic or pilot recovery when Traditional Aerodynamic-only Flight Control ("TAFC") fails to prevent a crash and other related damage.
Operational Risk and the American Way of Warfare
2011-12-01
tactical level. That conclusion, however, fails to account for the entire context. The cumulative effect is one of operational complacency. The ... largely come at the cost of any operational thinking about risk. The operational risk in the current fight is not easily discernible because it seems ... traditional American answers to risk. This addiction to annihilation through firepower has come with a high cost. The culture of annihilation through
ERIC Educational Resources Information Center
Hughes, John; Zhou, Chengfu; Petscher, Yaacov
2015-01-01
This report describes the results of a REL Southeast study comparing student success in online credit recovery and general courses taken online compared to traditional face-to-face courses. Credit recovery occurs when a student fails a course and then retakes the same course to earn high school credit. This research question was motivated by the…
Wilson, M R; Zimmermann, L L; Crawford, E D; Sample, H A; Soni, P R; Baker, A N; Khan, L M; DeRisi, J L
2017-03-01
Solid organ transplant patients are vulnerable to suffering neurologic complications from a wide array of viral infections and can be sentinels in the population who are first to get serious complications from emerging infections like the recent waves of arboviruses, including West Nile virus, Chikungunya virus, Zika virus, and Dengue virus. The diverse and rapidly changing landscape of possible causes of viral encephalitis poses great challenges for traditional candidate-based infectious disease diagnostics that already fail to identify a causative pathogen in approximately 50% of encephalitis cases. We present the case of a 14-year-old girl on immunosuppression for a renal transplant who presented with acute meningoencephalitis. Traditional diagnostics failed to identify an etiology. RNA extracted from her cerebrospinal fluid was subjected to unbiased metagenomic deep sequencing, enhanced with the use of a Cas9-based technique for host depletion. This analysis identified West Nile virus (WNV). Convalescent serum serologies subsequently confirmed WNV seroconversion. These results support a clear clinical role for metagenomic deep sequencing in the setting of suspected viral encephalitis, especially in the context of the high-risk transplant patient population. © 2016 The Authors. American Journal of Transplantation published by Wiley Periodicals, Inc. on behalf of American Society of Transplant Surgeons.
Beiersmann, Claudia; Sanou, Aboubakary; Wladarsch, Evelyn; De Allegri, Manuela; Kouyaté, Bocar; Müller, Olaf
2007-08-08
The literature on health care seeking behaviour in sub-Saharan Africa for children suffering from malaria is quite extensive. This literature, however, is predominantly quantitative and, inevitably, fails to explore how the local concepts of illness may affect people's choices. Understanding local concepts of illness and their influence on health care-seeking behaviour can complement existing knowledge and lead to the development of more effective malaria control interventions. In a rural area of Burkina Faso, four local concepts of illness resembling the biomedical picture of malaria were described according to symptoms, aetiology, and treatment. Data were collected through eight focus group discussions, 17 semi-structured interviews with key informants, and through the analysis of 100 verbal autopsy questionnaires of children under-five diagnosed with malaria. Sumaya, dusukun yelema, kono, and djoliban were identified as the four main local illness concepts resembling respectively uncomplicated malaria, respiratory distress syndrome, cerebral malaria, and severe anaemia. The local disease categorization was found to affect both treatment and provider choice. While sumaya is usually treated by a mix of traditional and modern methods, dusukun yelema and kono are preferably treated by traditional healers, and djoliban is preferably treated in modern health facilities. Besides the conceptualization of illness, poverty was found to be another important influencing factor of health care-seeking behaviour. The findings complement previous evidence on health care-seeking behaviour, by showing how local concepts of illness strongly influence treatment and choice of provider. Local concepts of illness need to be considered when developing specific malaria control programmes.
Saruta, Masayuki; Papadakis, Konstantinos A
2009-01-01
Wireless capsule endoscopy (WCE) has emerged as an important diagnostic tool for the evaluation of patients with suspected small intestinal (SI) disease, including obscure gastrointestinal bleeding, Crohn's disease (CD), malabsorptive disorders and SI tumors. Since a great number of patients with CD have small-bowel (SB) involvement, it is important for newly diagnosed patients to undergo an evaluation of the SB, which has traditionally been performed using a radiographic study such as a SB follow-through. The greatest utility of WCE in the evaluation of SB CD has been observed in cases of suspected CD, where the initial evaluation with upper and lower endoscopy as well as traditional radiographic techniques have failed to establish the diagnosis. WCE can detect SB involvement in CD, particularly early lesions that can be overlooked by traditional radiological studies. The sensitivity of diagnosing SB CD by WCE is superior to other endoscopic or radiological methods such as push enteroscopy, computed tomography or magnetic resonance enteroclysis. The utility of WCE in patients with known CD, indeterminate colitis and a select group of patients with ulcerative colitis can help to better define the diagnosis and extent of the disease, and assist in the management of patients with persistent symptoms. A disadvantage of WCE is that the device may be retained in a strictured area of the SB, which may often be present in patients with CD, in addition to a lower specificity. WCE may replace classical studies and become the gold standard for diagnosing SB involvement in patients with suspected, or known CD, in the absence of strictures and fistulae.
Rural women in Africa and technological change: some issues.
Date-bah, E; Stevens, Y
1981-01-01
The attempt is made in this discussion to highlight some of the important sociological and technical issues relating to rural women in Africa and technological change which appear to have been underplayed, misconceived or overlooked in the past. Attention is directed to the rural woman as a member of the family unit, the image of the rural man, rural women as a diversified group, community and national governmental commitment to rural technology innovations, the use of already existing traditional groups and institutions to effect rural technological change, and design specifications and shortcomings of equipment and tools (manufacturing costs, exploitation of locally available energy resources, the simplicity of the devices), and infrastructural and marketing problems. Numerous projects aimed at improving the lot of women in the rural areas have focused only on women, rather than the woman as a member of an extended as well as a nuclear family unit. Consequently, they have failed, for rural women do not exist or operate in isolation. It is difficult to believe the overall image in much of the literature that the husbands of rural women show no sympathy or regard for their wives. In the effort to attract investment to improve upon the position of rural women, reality should not be distorted with this one-sided view. Men should be involved in the technology planned for rural women, and the technological change should be planned and implemented in such a way that it results in an improvement in the relationship between the rural couple and generally between members of the rural family and between males and females in the village. Another problem is overgeneralization, and it must be recognized that considerable differentiation exists between rural women themselves. The importance of community, governmental and political commitment to rural technology innovations in order to ensure their success is neglected in the literature.
The government and political leadership can do much to introduce improved technologies in the rural areas. The use of existing traditional institutions to bring about technological change in the rural areas needs to be stressed. Primary reasons why some of the improved devices introduced for use by rural women have been rejected include the following: the devices fail to meet the priority needs of the women and socioeconomic and cultural factors are not considered in their design. Most developing countries are without the required industries to produce the needed basic components. Exploitation of some of the available natural resources would make life much easier for rural women. As rural societies are usually imperfectly linked to badly organized markets, infrastructural facilities, such as feeder roads, would have to be improved. The following are among the hypotheses suggested by this review: technological innovations linked to existing traditional skills and methods are likely to have easier acceptance in the rural areas than those divorced from these skills and methods; and technological innovations which are disseminated through existing traditional institutions and groups are likely to have easier acceptance. Guidelines for future research are included.
Application of phyto-indication and radiocesium indicative methods for microrelief mapping
NASA Astrophysics Data System (ADS)
Panidi, E.; Trofimetz, L.; Sokolova, J.
2016-04-01
Remote sensing technologies are widely used for production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broadly used applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis, and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size). In this case high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis only. In our study, we investigate the possibilities and specific techniques for allocation of erosion microrelief structures, and mapping techniques for the microrelief derivatives (e.g. quantitative parameters of microrelief). Our toolset includes the analysis of spatial redistribution of the soil pollutants and phyto-indication analysis, which complement the common DEM modelling and geomorphometric analysis. We use field surveys produced at the test area, which is arable territory with high erosion risks. Our main conclusion at the current stage is that the indicative methods (i.e. radiocesium and phyto-indication methods) are effective for allocation of the erosion microrelief structures. Also, these methods need to be formalized for convenient use.
Complementary Health Approaches Used in the Intensive Care Unit.
Erdoğan, Zeynep; Atik, Derya
Intensive care units are care centers where, in order to provide the maximum benefit to individuals whose lives are in danger, many lifesaving technological tools and devices are present, and morbidity and mortality rates are high. In the intensive care unit, when classic treatments fail or become unbearable because of side effects, complementary methods have been suggested as the best alternative. Complementary health approaches are methods that are used both for the continuation and improvement of an individual's well-being and as additions to medical treatments, based on a holistic approach. These applications are especially helpful for unstable intensive care patients who do not tolerate traditional treatment methods well, easing their stress, anxiety, and other symptoms, increasing their psychological and physiological well-being, and helping them sleep and rest. In intensive care patients, mind-body interventions such as massage, reflexology, acupressure, aromatherapy, music therapy, energy therapies (healing touch, therapeutic touch, the Yakson method), and prayer are used as complementary health approaches to decrease the incidence of postoperative atrial fibrillation, antiemetic and other medication needs, the duration of mechanical ventilation, and disease severity, as well as to help patients cope with pain, anxiety, dyspnea, sleep problems, and unstable physiological parameters.
Sparse deconvolution for the large-scale ill-posed inverse problem of impact force reconstruction
NASA Astrophysics Data System (ADS)
Qiao, Baijie; Zhang, Xingwu; Gao, Jiawei; Liu, Ruonan; Chen, Xuefeng
2017-01-01
Most previous regularization methods for solving the inverse problem of force reconstruction minimize the l2-norm of the desired force. However, traditional regularization methods such as Tikhonov regularization and truncated singular value decomposition commonly fail to solve the large-scale ill-posed inverse problem at moderate computational cost. In this paper, taking into account the sparse characteristic of impact force, the idea of sparse deconvolution is first introduced to the field of impact force reconstruction and a general sparse deconvolution model of impact force is constructed. Second, a novel impact force reconstruction method based on the primal-dual interior point method (PDIPM) is proposed to solve such a large-scale sparse deconvolution model, where minimizing the l2-norm is replaced by minimizing the l1-norm. Meanwhile, the preconditioned conjugate gradient algorithm is used to compute the search direction of PDIPM with high computational efficiency. Finally, two experiments, covering small- to medium-scale single impact force reconstruction and relatively large-scale consecutive impact force reconstruction, are conducted on a composite wind turbine blade and a shell structure to illustrate the advantage of PDIPM. Compared with Tikhonov regularization, PDIPM is more efficient, accurate, and robust in both single and consecutive impact force reconstruction.
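The l1-for-l2 swap the abstract describes can be illustrated with a much simpler solver than the authors' PDIPM: the sketch below uses ISTA (iterative soft-thresholding) on a toy impact-force deconvolution. The impulse response, problem sizes, and regularization weight are all invented for illustration, not taken from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Elementwise soft-thresholding: the proximal operator of the l1-norm.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_deconv_ista(H, y, lam=0.1, n_iter=500):
    """Solve min_f 0.5*||H f - y||^2 + lam*||f||_1 by ISTA.
    H: convolution (transfer) matrix, y: measured response."""
    f = np.zeros(H.shape[1])
    L = np.linalg.norm(H, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = H.T @ (H @ f - y)
        f = soft_threshold(f - grad / L, lam / L)
    return f

# Toy example: a decaying impulse response and two sparse impacts.
n = 60
h = np.exp(-0.3 * np.arange(10))           # assumed impulse response
H = np.zeros((n, n))
for i in range(n):
    for j in range(max(0, i - 9), i + 1):
        H[i, j] = h[i - j]                 # causal convolution matrix
f_true = np.zeros(n); f_true[5] = 2.0; f_true[30] = 1.0
y = H @ f_true + 0.01 * np.random.default_rng(0).normal(size=n)
f_hat = sparse_deconv_ista(H, y, lam=0.05)
```

The l1 penalty drives most entries of `f_hat` to exactly zero, so the two impact instants stand out; an l2 penalty would instead smear energy across all samples.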
Gálvez, Akemi; Iglesias, Andrés
2013-01-01
Fitting spline curves to data points is a very important issue in many applied fields. It is also challenging, because these curves typically depend on many continuous variables in a highly interrelated nonlinear way. In general, it is not possible to compute these parameters analytically, so the problem is formulated as a continuous nonlinear optimization problem, for which traditional optimization techniques usually fail. This paper presents a new bioinspired method to tackle this issue. In this method, optimization is performed through a combination of two techniques. Firstly, we apply the indirect approach to the knots, in which they are not initially the subject of optimization but precomputed with a coarse approximation scheme. Secondly, a powerful bioinspired metaheuristic technique, the firefly algorithm, is applied to optimization of data parameterization; then, the knot vector is refined by using De Boor's method, thus yielding a better approximation to the optimal knot vector. This scheme converts the original nonlinear continuous optimization problem into a convex optimization problem, solved by singular value decomposition. Our method is applied to some illustrative real-world examples from the CAD/CAM field. Our experimental results show that the proposed scheme can solve the original continuous nonlinear optimization problem very efficiently.
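The two-stage structure the abstract describes — a nonlinear search over the data parameterization, then a linear least-squares solve (via SVD) for the spline coefficients — can be sketched for the linear stage alone. The sketch below assumes the parameterization is already known and fixes a coarse clamped knot vector (both illustrative assumptions), then solves the least-squares subproblem with SciPy's `make_lsq_spline`:

```python
import numpy as np
from scipy.interpolate import make_lsq_spline

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)                 # data parameterization (assumed given here)
y = np.sin(2 * np.pi * x) + 0.05 * rng.normal(size=x.size)

k = 3                                      # cubic B-spline
interior = np.linspace(0, 1, 7)[1:-1]      # coarse interior knots
t = np.r_[[0] * (k + 1), interior, [1] * (k + 1)]   # clamped knot vector
spl = make_lsq_spline(x, y, t, k=k)        # linear least-squares subproblem
resid = np.sqrt(np.mean((spl(x) - y) ** 2))
```

In the paper's full scheme, the parameterization `x` and knots are what the firefly metaheuristic and De Boor refinement optimize; only the final coefficient solve is this convex subproblem.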
GPU-accelerated Kernel Regression Reconstruction for Freehand 3D Ultrasound Imaging.
Wen, Tiexiang; Li, Ling; Zhu, Qingsong; Qin, Wenjian; Gu, Jia; Yang, Feng; Xie, Yaoqin
2017-07-01
The volume reconstruction method plays an important role in improving reconstructed volumetric image quality for freehand three-dimensional (3D) ultrasound imaging. By utilizing the capability of a programmable graphics processing unit (GPU), we can achieve real-time incremental volume reconstruction at a speed of 25-50 frames per second (fps). After incremental reconstruction and visualization, hole-filling is performed on the GPU to fill the remaining empty voxels. However, traditional pixel-nearest-neighbor-based hole-filling fails to reconstruct volumes with high image quality. By contrast, kernel regression provides an accurate volume reconstruction method for 3D ultrasound imaging, but at the cost of heavy computational complexity. In this paper, a GPU-based fast kernel regression method is proposed for high-quality volume reconstruction after the incremental reconstruction of freehand ultrasound. The experimental results show that improved image quality for speckle reduction and detail preservation can be obtained with the parameter setting of a kernel window size of [Formula: see text] and a kernel bandwidth of 1.0. The computational performance of the proposed GPU-based method can be over 200 times faster than that on a central processing unit (CPU), and the volume with a size of 50 million voxels in our experiment can be reconstructed within 10 seconds.
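A CPU-only sketch of the kernel-regression idea (not the authors' GPU implementation): each empty voxel is filled with a Nadaraya-Watson Gaussian-weighted average of the observed voxels in a local window. The 2D grid size, bandwidth, and window radius below are illustrative assumptions.

```python
import numpy as np

def kernel_fill(grid, mask, bandwidth=1.0, window=3):
    """Fill empty pixels (mask == False) with a Nadaraya-Watson
    Gaussian-kernel estimate over the known neighbors in a local window."""
    filled = grid.copy()
    H, W = grid.shape
    ys, xs = np.where(~mask)
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - window), min(H, y + window + 1)
        x0, x1 = max(0, x - window), min(W, x + window + 1)
        sub = grid[y0:y1, x0:x1]
        known = mask[y0:y1, x0:x1]
        if not known.any():
            continue                       # no observed neighbors in window
        yy, xx = np.mgrid[y0:y1, x0:x1]
        d2 = (yy - y) ** 2 + (xx - x) ** 2
        w = np.exp(-d2 / (2 * bandwidth ** 2)) * known  # zero weight on holes
        filled[y, x] = (w * sub).sum() / w.sum()
    return filled

# Toy slice: a smooth field with ~30% of samples missing at random.
rng = np.random.default_rng(2)
g = np.fromfunction(lambda i, j: np.sin(i / 8.0) + np.cos(j / 8.0), (32, 32))
mask = rng.random(g.shape) > 0.3           # True where a voxel was observed
observed = np.where(mask, g, 0.0)
result = kernel_fill(observed, mask, bandwidth=1.0)
err = np.abs(result - g)[~mask].max()
```

Unlike a nearest-neighbor fill, the Gaussian weighting averages over several observed neighbors, which is what gives the speckle reduction the abstract reports; the cost is the per-voxel window loop that the paper offloads to the GPU.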
Gureje, Oye; Nortje, Gareth; Makanjuola, Victor; Oladeji, Bibilola D; Seedat, Soraya; Jenkins, Rachel
2015-02-01
Traditional and complementary systems of medicine include a broad range of practices, which are commonly embedded in cultural milieus and reflect community beliefs, experiences, religion, and spirituality. Two major components of this system are discernible: complementary alternative medicine and traditional medicine, with different clientele and correlates of patronage. Evidence from around the world suggests that a traditional or complementary system of medicine is commonly used by a large number of people with mental illness. Practitioners of traditional medicine in low-income and middle-income countries fill a major gap in mental health service delivery. Although some overlap exists in the diagnostic approaches of traditional and complementary systems of medicine and conventional biomedicine, some major differences exist, largely in the understanding of the nature and cause of mental disorders. Treatments used by providers of traditional and complementary systems of medicine, especially traditional and faith healers in low-income and middle-income countries, might sometimes fail to meet widespread understandings of human rights and humane care. Nevertheless, collaborative engagement between traditional and complementary systems of medicine and conventional biomedicine might be possible in the care of people with mental illness. The best model to bring about that collaboration will need to be established by the needs of the extant mental health system in a country. Research is needed to provide an empirical basis for the feasibility of such collaboration, to clearly delineate its boundaries, and to test its effectiveness in bringing about improved patient outcomes. Copyright © 2015 Elsevier Ltd. All rights reserved.
Estimating Measures of Pass-Fail Reliability from Parallel Half-Tests.
ERIC Educational Resources Information Center
Woodruff, David J.; Sawyer, Richard L.
Two methods for estimating measures of pass-fail reliability are derived, by which both theta and kappa may be estimated from a single test administration. The methods require only a single test administration and are computationally simple. Both are based on the Spearman-Brown formula for estimating stepped-up reliability. The non-distributional…
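The stepped-up reliability the abstract mentions comes from the Spearman-Brown formula, which predicts full-length-test reliability from the correlation between two parallel half-tests; a minimal sketch (the 0.6 half-test correlation is an invented example value):

```python
# Spearman-Brown step-up: reliability of a test lengthened by `factor`,
# predicted from the reliability (correlation) of the shorter form.
def spearman_brown(r_half, factor=2.0):
    return factor * r_half / (1.0 + (factor - 1.0) * r_half)

r_full = spearman_brown(0.6)   # two parallel halves correlating 0.6
```

With a half-test correlation of 0.6, the predicted full-test reliability is 2(0.6)/(1 + 0.6) = 0.75.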
Optimizing spacecraft design - optimization engine development : progress and plans
NASA Technical Reports Server (NTRS)
Cornford, Steven L.; Feather, Martin S.; Dunphy, Julia R; Salcedo, Jose; Menzies, Tim
2003-01-01
At JPL and NASA, a process has been developed to perform life cycle risk management. This process requires users to identify: goals and objectives to be achieved (and their relative priorities), the various risks to achieving those goals and objectives, and options for risk mitigation (prevention, detection ahead of time, and alleviation). Risks are broadly defined to include the risk of failing to design a system with adequate performance, compatibility and robustness in addition to more traditional implementation and operational risks. The options for mitigating these different kinds of risks can include architectural and design choices, technology plans and technology back-up options, test-bed and simulation options, engineering models and hardware/software development techniques and other more traditional risk reduction techniques.
CNV-TV: a robust method to discover copy number variation from short sequencing reads.
Duan, Junbo; Zhang, Ji-Gang; Deng, Hong-Wen; Wang, Yu-Ping
2013-05-02
Copy number variation (CNV) is an important structural variation (SV) in the human genome. Various studies have shown that CNVs are associated with complex diseases. Traditional CNV detection methods such as fluorescence in situ hybridization (FISH) and array comparative genomic hybridization (aCGH) suffer from low resolution. The next generation sequencing (NGS) technique promises higher resolution detection of CNVs and several methods were recently proposed for realizing such a promise. However, the performances of these methods are not robust under some conditions, e.g., some of them may fail to detect CNVs of short sizes. There has been a strong demand for reliable detection of CNVs from high resolution NGS data. A novel and robust method to detect CNV from short sequencing reads is proposed in this study. The detection of CNV is modeled as change-point detection from the read depth (RD) signal derived from the NGS, which is fitted with a total variation (TV) penalized least squares model. The performance (e.g., sensitivity and specificity) of the proposed approach is evaluated by comparison with several recently published methods on both simulated and real data from the 1000 Genomes Project. The experimental results showed that both the true positive rate and false positive rate of the proposed detection method do not change significantly for CNVs with different copy numbers and lengths, when compared with several existing methods. Therefore, our proposed approach results in a more reliable detection of CNVs than the existing methods.
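A rough sketch of the TV-penalized least-squares idea, using a generic ADMM solver rather than the authors' CNV-TV implementation: the fitted read-depth signal is approximately piecewise constant, so a copy-number gain shows up as a raised segment between two change-points. The simulated read depths, segment layout, and penalty weight are all invented for illustration.

```python
import numpy as np

def soft(x, t):
    # Soft-thresholding, the proximal operator of the l1-norm.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def tv_denoise(y, lam=3.0, rho=1.0, n_iter=300):
    """TV-penalized least squares: min_x 0.5*||x - y||^2 + lam*||Dx||_1,
    solved with ADMM on the splitting z = Dx."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)          # first-difference operator
    A = np.eye(n) + rho * D.T @ D
    x = y.copy(); z = np.zeros(n - 1); u = np.zeros(n - 1)
    for _ in range(n_iter):
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))
        z = soft(D @ x + u, lam / rho)
        u += D @ x - z
    return x

# Simulated read-depth: diploid baseline with a one-copy gain segment.
rng = np.random.default_rng(3)
rd = np.r_[np.full(100, 2.0), np.full(50, 3.0), np.full(100, 2.0)]
rd += 0.3 * rng.normal(size=rd.size)
fit = tv_denoise(rd, lam=3.0)
jumps = np.where(np.abs(np.diff(fit)) > 0.3)[0]   # candidate change-points
```

The TV penalty keeps the fit flat within segments while allowing sharp level changes, which is why segment boundaries survive even for short CNVs that window-averaging methods smooth away.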
Establishing pass/fail criteria for bronchoscopy performance.
Konge, Lars; Clementsen, Paul; Larsen, Klaus Richter; Arendrup, Henrik; Buchwald, Christian; Ringsted, Charlotte
2012-01-01
Several tools have been created to assess competence in bronchoscopy. However, educational guidelines still use an arbitrary number of performed procedures to decide when basic competency is acquired. The purpose of this study was to define pass/fail scores for two bronchoscopy assessment tools, and investigate how these scores relate to physicians' experience regarding the number of bronchoscopy procedures performed. We studied two assessment tools and used two standard setting methods to create cut scores: the contrasting-groups method and the extended Angoff method. In the first we compared bronchoscopy performance scores of 14 novices with the scores of 14 experienced consultants to find the score that best discriminated between the two groups. In the second we asked an expert group of 7 experienced bronchoscopists to judge how a borderline trainee would perform on each item of the test. Using the contrasting-groups method we found a standard that would fail all novices and pass all consultants. A clear pass related to prior experience of 75 procedures. The consequences of using the extended Angoff method were also acceptable: all trainees who had performed less than 50 bronchoscopies failed the test and all consultants passed. A clear pass related to 80 procedures. Our proposed pass/fail scores for these two methods seem appropriate in terms of consequences. Prior experience with the performance of 75 and 80 bronchoscopies, respectively, seemed to ensure basic competency. In the future objective assessment tools could become an important aid in the certification of physicians performing bronchoscopies. Copyright © 2011 S. Karger AG, Basel.
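The contrasting-groups method the abstract uses can be sketched as choosing the cut score that best separates the two groups' performance scores (minimum total misclassification). The scores below are fabricated for illustration; they are not the study's data.

```python
import numpy as np

def contrasting_groups_cut(novice_scores, expert_scores):
    """Contrasting-groups standard setting: return the cut score that
    minimizes (novices passing) + (experts failing)."""
    candidates = np.unique(np.r_[novice_scores, expert_scores])
    best_cut, best_err = None, np.inf
    for c in candidates:
        err = (novice_scores >= c).sum() + (expert_scores < c).sum()
        if err < best_err:
            best_cut, best_err = c, err
    return best_cut

# Hypothetical assessment-tool scores for 14 novices and 14 consultants.
novices = np.array([42, 47, 50, 51, 55, 58, 60, 61, 63, 64, 66, 68, 70, 72])
experts = np.array([74, 76, 78, 80, 81, 83, 85, 86, 88, 89, 90, 92, 94, 95])
cut = contrasting_groups_cut(novices, experts)
```

Because these invented groups do not overlap, the chosen cut fails every novice and passes every consultant, mirroring the zero-misclassification standard the study reports.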
Systems and methods for circuit lifetime evaluation
NASA Technical Reports Server (NTRS)
Heaps, Timothy L. (Inventor); Sheldon, Douglas J. (Inventor); Bowerman, Paul N. (Inventor); Everline, Chester J. (Inventor); Shalom, Eddy (Inventor); Rasmussen, Robert D. (Inventor)
2013-01-01
Systems and methods for estimating the lifetime of an electrical system in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes iteratively performing Worst Case Analysis (WCA) on a system design with respect to different system lifetimes using a computer to determine the lifetime at which the worst case performance of the system indicates the system will pass with zero margin or fail within a predetermined margin for error, given the environment experienced by the system during its lifetime. In addition, performing WCA on a system with respect to a specific system lifetime includes: identifying subcircuits within the system; performing Extreme Value Analysis (EVA) with respect to each subcircuit to determine whether the subcircuit fails EVA for the specific system lifetime; when the subcircuit passes EVA, determining that the subcircuit does not fail WCA for the specified system lifetime; when a subcircuit fails EVA, performing at least one additional WCA process that provides a tighter bound on the WCA than EVA to determine whether the subcircuit fails WCA for the specified system lifetime; determining that the system passes WCA with respect to the specific system lifetime when all subcircuits pass WCA; and determining that the system fails WCA when at least one subcircuit fails WCA.
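A highly simplified sketch of the iterative-lifetime idea (the patent's WCA/EVA machinery is far richer): sweep candidate lifetimes and report the last one at which every subcircuit's worst-case margin stays positive. The linear drift model and all numbers are invented for illustration.

```python
# Hypothetical worst-case model: nominal margin eroded by a drift that is
# assumed linear in time (an EVA-style conservative bound, illustrative only).
def worst_case_margin(subcircuit, years):
    return subcircuit["margin0"] - subcircuit["drift_per_year"] * years

def qualified_lifetime(subcircuits, max_years=50):
    """Iterate candidate lifetimes; return the longest lifetime at which
    all subcircuits still pass worst-case analysis."""
    for years in range(1, max_years + 1):
        if any(worst_case_margin(s, years) <= 0.0 for s in subcircuits):
            return years - 1               # last lifetime at which all passed
    return max_years

subs = [{"margin0": 1.0, "drift_per_year": 0.04},
        {"margin0": 0.6, "drift_per_year": 0.05}]
life = qualified_lifetime(subs)
```

Here the weaker subcircuit's margin reaches zero at 12 years, so the qualified lifetime is 11; in the patented method a subcircuit that fails this coarse bound would then get a tighter (more expensive) WCA pass before the lifetime is rejected.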
Quali-quantitative analysis (QQA): why it could open new frontiers for holistic health practice.
Bell, Erica
2006-12-15
Holistic health practice is often described as being about understanding the larger contexts of patients, their health services, and their communities. Yet do traditional quantitative and qualitative health research methods produce the best possible evidence for the holistic practices of doctors, nurses, and allied health professionals? This paper argues "no", and examines the potential of a cutting-edge, social science research method--Quali-Quantitative Research (QQA)--for providing better evidence for holistic practice, particularly in small-N populations, such as rural and remote communities. It does so with reference to the international literature on holistic medicine, as well as three holistic health projects conducted in Tasmania: about prevention of falls in older people, adolescent substance abuse, and interventions for children aged 0-5 exposed to domestic violence. The findings suggest that much health research fails to capture rigorously the contextual complexity of holistic health challenges: the multiple different needs of individual patients, and the interprofessional approaches needed to deliver multidisciplinary and multiservice health interventions tailored to meet those needs in particular community contexts. QQA offers a "configurational", case-based, diversity-oriented approach to analysing data that combines qualitative and quantitative techniques to overcome the limitations of both research traditions. The author concludes that QQA could open new frontiers for holistic health by helping doctors, nurses, and allied health professionals answer a fundamental question presented by complex health challenges: "Given this set of whole-of-patient needs, what elements of which interventions in what services would work best in this particular community?"
Scoring and setting pass/fail standards for an essay certification examination in nurse-midwifery.
Fullerton, J T; Greener, D L; Gross, L J
1992-03-01
Examination for certification or licensure of health professionals (credentialing) in the United States is almost exclusively of the multiple choice format. The certification examination for entry into the practice of the profession of nurse-midwifery has, however, used a modified essay format throughout its twenty-year history. The examination has recently undergone a revision in the method for score interpretation and for pass/fail decision-making. The revised method, described in this paper, has important implications for all health professional credentialing agencies which use modified essay, oral or practical methods of competency assessment. This paper describes criterion-referenced scoring, the process of constructing the essay items, the methods for assuring validity and reliability for the examination, and the manner of standard setting. In addition, two alternative methods for increasing the validity of the pass/fail decision are evaluated, and the rationale for decision-making about marginal candidates is described.
Evaluation Measures and Methods: Some Intersections.
ERIC Educational Resources Information Center
Elliott, John
The literature is reviewed for four combinations of evaluation measures and methods: traditional methods with traditional measures (T-Meth/T-Mea), nontraditional methods with traditional measures (N-Meth/T-Mea), traditional methods with nontraditional measures (T-Meth/N-Mea), and nontraditional methods with nontraditional measures (N-Meth/N-Mea).…
NASA Astrophysics Data System (ADS)
Burns, Dana
Over the last two decades, online education has become a popular concept in universities as well as K-12 education. This generation of students has grown up using technology and has shown interest in incorporating technology into their learning. The idea of using technology in the classroom to enhance student learning and create higher achievement has become necessary for administrators, teachers, and policymakers. Although online education is a popular topic, there has been minimal research on the effectiveness of online and blended learning strategies compared to student learning in a traditional K-12 classroom setting. The purpose of this study was to investigate differences in standardized test scores from the Biology End of Course exam when at-risk students completed the course using three different educational models: online format, blended learning, and traditional face-to-face learning. Data were collected from over 1,000 students over a five-year period. Correlation analysis of standardized test scores of eighth-grade students was used to define students as "at-risk" for failing high school courses. The results indicated a high correlation between eighth grade standardized test scores and Biology End of Course exam scores. These students were deemed "at-risk" for failing high school courses. Standardized test scores were measured for the at-risk students when those students completed Biology in the different models of learning. Results indicated significant differences existed among the learning models. Students had the highest test scores when completing Biology in the traditional face-to-face model. Further evaluation of subgroup populations indicated statistical differences in learning models for African-American populations, female students, and for male students.
Bridge of Steel, US Merchant Shipping in World War II
2010-12-02
military equipment. The British entered the war earlier than the US and failed to maintain a foothold on the European continent. At Dunkirk in...1940, the British experienced the need for a vessel that could load and unload from unimproved beaches. Dunkirk lacked the port facilities traditionally...possession due to the absence of a port to load equipment. The Dunkirk experience, and observing Japanese exercises, led the British specialized vessels
Risky Business: The Global Threat Network and the Politics of Contraband
2014-05-01
of the traditional pillars of the anti-money-laundering platform known as Know Your Customer (KYC), which requires due diligence on heightened-risk... money in the illicit market also help to protect those gains from others. It is also important to recognize that many criminals have only modest control...wisdom suggests that criminal-terrorist connectivity is a phenomenon found in failed and economically poor states. This argument relies on four
ERIC Educational Resources Information Center
Goldberger, Susan
2008-01-01
One of the most persistent inequities in U.S. education is the gap in math achievement along income and race lines. Yet some secondary schools beat the odds, producing consistently strong math performance with students who likely would fail in traditional settings. This report argues that the math achievement gap is not the result of poor and…
NASA Astrophysics Data System (ADS)
Xu, Yu-Lin
The problem of computing the orbit of a visual binary from a set of observed positions is reconsidered. It is a least squares adjustment problem if the observational errors follow a bias-free multivariate Gaussian distribution and the covariance matrix of the observations is assumed to be known. The condition equations are constructed to satisfy both the conic section equation and the area theorem, which are nonlinear in both the observations and the adjustment parameters. The traditional least squares algorithm, which employs condition equations that are solved with respect to the uncorrelated observations and either linear in the adjustment parameters or linearized by developing them in Taylor series by first-order approximation, is inadequate in our orbit problem. D.C. Brown proposed an algorithm solving a more general least squares adjustment problem in which the scalar residual function, however, is still constructed by first-order approximation. Not long ago, a completely general solution was published by W. H. Jefferys, who proposed a rigorous adjustment algorithm for models in which the observations appear nonlinearly in the condition equations and may be correlated, and in which construction of the normal equations and the residual function involves no approximation. This method was successfully applied to our problem. The normal equations were first solved by Newton's scheme. Practical examples show that this converges quickly if the observational errors are sufficiently small and the initial approximate solution is sufficiently accurate, and that it fails otherwise. Newton's method was modified to yield a definitive solution in cases where the normal approach fails, by combination with the method of steepest descent and other sophisticated algorithms. Practical examples show that the modified Newton scheme can always lead to a final solution.
The weighting of observations, the orthogonal parameters and the efficiency of a set of adjustment parameters are also considered. The definition of efficiency is revised.
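The adjustment scheme sketched in this abstract (Newton iteration on the normal equations, falling back toward steepest descent when the Newton step fails) can be illustrated on a toy nonlinear least-squares problem. The code below is a generic Gauss-Newton sketch with a damped-gradient fallback, not the authors' orbit-fitting formulation; the test problem and all names are invented.

```python
import numpy as np

def gauss_newton(residual, jac, x0, tol=1e-10, max_iter=100):
    """Gauss-Newton for nonlinear least squares, falling back to a
    damped steepest-descent step when the normal-equation step fails
    to reduce the cost (a toy analogue of the modified Newton scheme)."""
    x = np.asarray(x0, float)
    cost = lambda p: 0.5 * np.sum(residual(p) ** 2)
    for _ in range(max_iter):
        r, J = residual(x), jac(x)
        g = J.T @ r                              # gradient of the cost
        try:
            step = np.linalg.solve(J.T @ J, -g)  # normal equations
        except np.linalg.LinAlgError:
            step = -g
        if cost(x + step) >= cost(x):            # Newton step failed: descend
            step = -g
            while cost(x + step) >= cost(x) and np.linalg.norm(step) > tol:
                step *= 0.5
        x = x + step
        if np.linalg.norm(g) < tol:
            break
    return x
```

On the toy system r1 = x0^2 - 4, r2 = x0*x1 - 2, starting near the root, the iteration converges to (2, 1).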
Zhou, Ronggang; Chan, Alan H. S.
2016-01-01
BACKGROUND: In order to compare existing usability data to ideal goals or to that for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. OBJECTIVE: This paper presents a universal method of usability evaluation by combining the analytic hierarchical process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. METHODS: With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation technique model to characterize fuzzy human judgments. Then with the use of AHP, the weights of usability components were elicited from these experts. RESULTS AND CONCLUSIONS: Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process. PMID:28035943
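The two-layer fuzzy comprehensive evaluation described here reduces to a matrix composition: a membership matrix R (usability components by appraisal grades) weighted by an AHP-derived vector w. A minimal numeric sketch, with every membership degree and weight invented for illustration:

```python
import numpy as np

# Rows of R are the usability components (effectiveness, efficiency,
# user satisfaction); columns are appraisal grades (poor, fair, good,
# excellent). Entries are membership degrees that would be pooled from
# an expert panel; the weight vector w would come from an AHP
# pairwise-comparison step. All numbers here are made up.
R = np.array([
    [0.1, 0.2, 0.5, 0.2],   # effectiveness
    [0.0, 0.3, 0.4, 0.3],   # efficiency
    [0.2, 0.3, 0.3, 0.2],   # user satisfaction
])
w = np.array([0.5, 0.3, 0.2])          # AHP-derived component weights

B = w @ R                              # fuzzy composition (weighted-average operator)
B = B / B.sum()                        # normalise the appraisal vector
grade = ["poor", "fair", "good", "excellent"][int(np.argmax(B))]
```

Under the maximum-membership principle, the product's overall appraisal is the grade with the largest entry of B; here the weighted-average operator is used, one of several standard fuzzy composition operators.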
Zhang, Min-Jian
2017-07-01
Research on the mechanisms underlying the therapeutic effects of the disease-syndrome combination approach in integrated traditional Chinese and Western medicine is becoming a hot spot in andrology, but many recent studies of this kind have failed to explain the connotation of integrated traditional Chinese and Western medicine in andrology. Related existing problems include repeated studies of the same indexes of action mechanisms of different therapeutic principles of traditional Chinese medicine (TCM), Chinese herbal compound and special prescriptions, studies focusing on individual diseases but ignoring symptoms, immature syndrome models for studies of mechanisms, and too much attention to uncertain or immature target mechanisms. The stress should be placed on the action mechanisms of Chinese herbal compound and special prescriptions on male diseases and, what is more important, on the clarification of the essential principles of differentiation and treatment of TCM syndromes. In recent years, proteomics, genomics, transcriptomics and metabolomics have shed some light upon research into the mechanisms underlying the therapeutic effects of the disease-syndrome combination approach in integrated traditional Chinese and Western medicine in andrology. An insight into the TCM syndrome, a macroscopic inductive analysis, and a comprehension of such microcosmic aspects as the gene, protein, metabolism and metagenome may contribute to some breakthroughs and new ideas in the studies of disease-syndrome combination in integrated traditional Chinese and Western medicine in andrology.
Arthrodesis of the knee after failed knee replacement.
Wade, P J; Denham, R A
1984-05-01
Arthrodesis of the knee is sometimes needed for failed total knee replacement, but fusion can be difficult to obtain. We describe a method of arthrodesis that uses the simple, inexpensive, Portsmouth external fixator. Bony union was obtained in all six patients treated with this technique. These results are compared with those obtained by other methods of arthrodesis.
Hartzell, S.; Liu, P.
1996-01-01
A method is presented for the simultaneous calculation of slip amplitudes and rupture times for a finite fault using a hybrid global search algorithm. The method we use combines simulated annealing with the downhill simplex method to produce a more efficient search algorithm than either of the two constituent parts. This formulation has advantages over traditional iterative or linearized approaches to the problem because it is able to escape local minima in its search through model space for the global optimum. We apply this global search method to the calculation of the rupture history for the Landers, California, earthquake. The rupture is modeled using three separate finite-fault planes to represent the three main fault segments that failed during this earthquake. Both the slip amplitude and the time of slip are calculated for a grid work of subfaults. The data used consist of digital, teleseismic P and SH body waves. Long-period, broadband, and short-period records are utilized to obtain a wideband characterization of the source. The results of the global search inversion are compared with a more traditional linear-least-squares inversion for only slip amplitudes. We use a multi-time-window linear analysis to relax the constraints on rupture time and rise time in the least-squares inversion. Both inversions produce similar slip distributions, although the linear-least-squares solution has a 10% larger moment (7.3 × 10^26 dyne-cm compared with 6.6 × 10^26 dyne-cm). Both inversions fit the data equally well and point out the importance of (1) using a parameterization with sufficient spatial and temporal flexibility to encompass likely complexities in the rupture process, (2) including suitable physically based constraints on the inversion to reduce instabilities in the solution, and (3) focusing on those robust rupture characteristics that rise above the details of the parameterization and data set.
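The hybrid global search idea (simulated annealing to escape local minima, plus local downhill refinement) can be sketched on a one-dimensional multimodal function. This is a generic illustration, not the authors' finite-fault inversion; the greedy polish at the end merely stands in for the downhill simplex step.

```python
import math, random

def anneal(f, x0, t0=2.0, cooling=0.95, n_steps=2000, seed=1):
    """Simulated annealing sketch: random perturbations accepted by the
    Metropolis rule, with the temperature t cooled geometrically. The
    final greedy polish stands in for the downhill-simplex refinement
    used in the hybrid algorithm."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(n_steps):
        cand = x + rng.gauss(0, 0.5)
        fc = f(cand)
        # Always accept improvements; accept uphill moves with
        # probability exp(-df/t), which shrinks as t cools.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    # Greedy local polish with a shrinking step size.
    step = 0.1
    while step > 1e-6:
        for cand in (best_x - step, best_x + step):
            if f(cand) < best_f:
                best_x, best_f = cand, f(cand)
                break
        else:
            step *= 0.5
    return best_x, best_f
```

On a double-well test function such as f(x) = (x^2 - 1)^2 + 0.3x, the search settles into one of the two minima near x = ±1 even when started far away, which a purely local descent from the same start cannot guarantee.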
NASA Astrophysics Data System (ADS)
Wang, Kai-Ping
2012-07-01
Purpose: The purpose of this study was to determine the effectiveness of competitive Student Team Achievement Division (STAD), non-competitive STAD, and traditional learning on chemistry learning and learning perceptions. Sample, design and methods: By adopting the STAD approach, this study examined 144 nursing students at a five-year junior college in northern Taiwan during the first semester (totaling 18 weeks) of the 2008 academic year. Results: The findings reveal that both a heterogeneous group with external pressure (involving competitive STAD) and a friendship group with affective pressure (involving traditional learning) enhance group cohesion and assist students' meaningful learning; the heterogeneous group without extra pressure (involving non-competitive STAD), by contrast, fails because of apathy and lassitude. Moreover, learning effectiveness will obviously predominate until the learning strategy continues for a long period or at least one semester. Conclusions: This study revealed that the learning performance level of the competitive STAD group is significantly different from that of the non-competitive STAD group; and the learning performance level of the traditional group is significantly different from that of the non-competitive STAD group. Both the competitive STAD group and traditional group of medium ability students are significantly different from the non-competitive STAD group. Low-ability students from the competitive STAD group are significantly different from those of the non-competitive STAD, though no significant differences were found in learning perception. However, both a lack of friendship and a lack of ability in using algorithms may affect students' chemistry learning. Furthermore, gender imbalance, educational culture, and group emotions are factors that may influence student learning performance. 
Further study should focus on the use of grouping, on improving responsibility in group discussion, and on investigating group interaction patterns to determine the factors that influence the learning performance of students working in groups.
Clinical review: improving the measurement of serum thyroglobulin with mass spectrometry.
Hoofnagle, Andrew N; Roth, Mara Y
2013-04-01
Serum thyroglobulin (Tg) measurements are central to the management of patients treated for differentiated thyroid carcinoma. For decades, Tg measurements have relied on methods that are subject to interference by commonly found substances in human serum and plasma, such as Tg autoantibodies. As a result, many patients need additional imaging studies to rule out cancer persistence or recurrence that could be avoided with more sensitive and specific testing methods. The aims of this review are to: 1) briefly review the interferences common to Tg immunoassays; 2) introduce readers to liquid chromatography-tandem mass spectrometry as a method for quantifying proteins in human serum/plasma; and 3) discuss the potential benefits and limitations of the method in the quantification of serum Tg. Mass spectrometric methods have traditionally lacked the sensitivity, robustness, and throughput to be useful clinical assays. These methods failed to meet the necessary clinical benchmarks due to the nature of the mass spectrometry workflow and instrumentation. Over the past few years, there have been major advances in reagents, automation, and instrumentation for the quantification of proteins using mass spectrometry. More recently, methods using mass spectrometry to detect and quantify Tg have been developed and are of sufficient quality to be used in the management of patients. Novel serum Tg assays that use mass spectrometry may avoid the issue of autoantibody interference and other problems with currently available immunoassays for Tg. Prospective studies are needed to fully understand the potential benefits of novel Tg assays to patients and care providers.
USDA-ARS?s Scientific Manuscript database
Whether a required Salmonella test series is passed or failed depends not only on the presence of the bacteria, but also on the methods for taking samples, the methods for culturing samples, and the statistics associated with the sampling plan. The pass-fail probabilities of the two-class attribute...
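For a two-class attributes sampling plan of the kind referred to here, the pass probability is a binomial tail: with n samples, an acceptance number c, and a per-sample probability of detecting the bacteria, the lot passes when at most c samples test positive. A sketch (the parameter names are mine, and the independence assumption is an idealisation):

```python
from math import comb

def pass_probability(n, c, p_positive):
    """Probability of passing a two-class attributes sampling plan
    (sketch): n samples are cultured, the lot passes if at most c
    samples test positive, and each sample is positive independently
    with probability p_positive."""
    return sum(comb(n, k) * p_positive**k * (1 - p_positive)**(n - k)
               for k in range(c + 1))
```

For example, a zero-tolerance plan (c = 0) with two samples and a 10% per-sample detection probability still passes 81% of the time, which is why the sampling and culturing methods matter as much as the presence of the bacteria.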
Organism-level models: When mechanisms and statistics fail us
NASA Astrophysics Data System (ADS)
Phillips, M. H.; Meyer, J.; Smith, W. P.; Rockhill, J. K.
2014-03-01
Purpose: To describe the unique characteristics of models that represent the entire course of radiation therapy at the organism level and to highlight the uses to which such models can be put. Methods: At the level of an organism, traditional model-building runs into severe difficulties. We do not have sufficient knowledge to devise a complete biochemistry-based model. Statistical model-building fails due to the vast number of variables and the inability to control many of them in any meaningful way. Finally, building surrogate models, such as animal-based models, can result in excluding some of the most critical variables. Bayesian probabilistic models (Bayesian networks) provide a useful alternative that has the advantages of being mathematically rigorous, incorporating the knowledge that we do have, and being practical. Results: Bayesian networks representing radiation therapy pathways for prostate cancer and head & neck cancer were used to highlight the important aspects of such models and some techniques of model-building. A more specific model representing the treatment of occult lymph nodes in head & neck cancer was provided as an example of how such a model can inform clinical decisions. A model of the possible role of PET imaging in brain cancer was used to illustrate the means by which clinical trials can be modelled in order to come up with a trial design that will have meaningful outcomes. Conclusions: Probabilistic models are currently the most useful approach to representing the entire therapy outcome process.
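A Bayesian network of the kind advocated above can be tiny and still useful. The sketch below builds a two-arm toy network (dose influences tumour control and toxicity) and computes posteriors by direct enumeration; the structure and every probability are invented for illustration and are not taken from the paper's prostate or head & neck models.

```python
# A minimal discrete Bayesian network: Dose -> Control, Dose -> Toxicity.
# All probabilities are made up for illustration only.
p_dose = {"high": 0.5, "low": 0.5}       # prior over treatment arm
p_control = {"high": 0.8, "low": 0.6}    # P(tumour control | dose)
p_tox = {"high": 0.3, "low": 0.1}        # P(toxicity | dose)

def posterior_dose_given(control=None, tox=None):
    """P(dose | evidence) by direct enumeration over the tiny network.
    `control` and `tox` are True/False observations or None (unobserved)."""
    weights = {}
    for d, pd in p_dose.items():
        w = pd
        if control is not None:
            w *= p_control[d] if control else (1 - p_control[d])
        if tox is not None:
            w *= p_tox[d] if tox else (1 - p_tox[d])
        weights[d] = w
    z = sum(weights.values())                 # normalising constant
    return {d: w / z for d, w in weights.items()}
```

Observing toxicity shifts the posterior toward the high-dose arm (0.75 vs the 0.5 prior in this toy network), which is exactly the kind of evidence propagation that makes such models practical when mechanistic and statistical modelling both fail.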
Carey, Rachel; Caraher, Martin; Lawrence, Mark; Friel, Sharon
2016-01-01
The present article tracks the development of the Australian National Food Plan as a 'whole of government' food policy that aimed to integrate elements of nutrition and sustainability alongside economic objectives. The article uses policy analysis to explore the processes of consultation and stakeholder involvement in the development of the National Food Plan, focusing on actors from the sectors of industry, civil society and government. Existing documentation and submissions to the Plan were used as data sources. Models of health policy analysis and policy streams were employed to analyse policy development processes. Setting: Australia. Subjects: Australian food policy stakeholders. The development of the Plan was influenced by powerful industry groups and stakeholder engagement by the lead ministry favoured the involvement of actors representing the food and agriculture industries. Public health nutrition and civil society relied on traditional methods of policy influence, and the public health nutrition movement failed to develop a unified cross-sector alliance, while the private sector engaged in different ways and presented a united front. The National Food Plan failed to deliver an integrated food policy for Australia. Nutrition and sustainability were effectively sidelined due to the focus on global food production and positioning Australia as a food 'superpower' that could take advantage of the anticipated 'dining boom' as incomes rose in the Asia-Pacific region. New forms of industry influence are emerging in the food policy arena and public health nutrition will need to adopt new approaches to influencing public policy.
NASA Astrophysics Data System (ADS)
Li, Hui; Yu, Jun-Ling; Yu, Le-An; Sun, Jie
2014-05-01
Case-based reasoning (CBR) is one of the main forecasting methods in business forecasting, which performs well in prediction and holds the ability of giving explanations for the results. In business failure prediction (BFP), the number of failed enterprises is relatively small, compared with the number of non-failed ones. However, the loss is huge when an enterprise fails. Therefore, it is necessary to develop methods (trained on imbalanced samples) which forecast well for this small proportion of failed enterprises while also performing accurately on total accuracy. Commonly used methods constructed on the assumption of balanced samples do not perform well in predicting minority samples on imbalanced samples consisting of the minority/failed enterprises and the majority/non-failed ones. This article develops a new method called clustering-based CBR (CBCBR), which integrates clustering analysis, an unsupervised process, with CBR, a supervised process, to enhance the efficiency of retrieving information from both minority and majority in CBR. In CBCBR, various case classes are first generated through hierarchical clustering inside stored experienced cases, and class centres are calculated by integrating case information in the same clustered class. When predicting the label of a target case, its nearest clustered case class is first retrieved by ranking similarities between the target case and each clustered case class centre. Then, nearest neighbours of the target case in the determined clustered case class are retrieved. Finally, labels of the nearest experienced cases are used in prediction. In the empirical experiment with two imbalanced samples from China, the performance of CBCBR was compared with the classical CBR, a support vector machine, a logistic regression and a multivariate discriminant analysis. 
The results show that, compared with the other four methods, CBCBR performed significantly better in terms of sensitivity for identifying the minority samples while also generating high total accuracy. The proposed approach makes CBR useful in imbalanced forecasting.
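The retrieval pipeline of CBCBR (nearest cluster centre first, then k nearest neighbours inside that cluster, then a vote over their labels) can be sketched as follows. The clustering step itself is assumed to have been done beforehand (the paper uses hierarchical clustering); the data layout and function names are mine.

```python
import math

def cbcbr_predict(clusters, target, k=3):
    """Clustering-based CBR retrieval (sketch). `clusters` is a list of
    case classes produced beforehand by hierarchical clustering; each
    is a list of (features, label) pairs. Retrieval: nearest cluster
    centre first, then the k nearest neighbours inside that cluster,
    then a majority vote over their labels."""
    def centre(cluster):
        dims = len(cluster[0][0])
        return [sum(case[0][i] for case in cluster) / len(cluster)
                for i in range(dims)]
    nearest = min(clusters, key=lambda cl: math.dist(centre(cl), target))
    neighbours = sorted(nearest, key=lambda case: math.dist(case[0], target))[:k]
    votes = {}
    for _, label in neighbours:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)
```

Restricting the neighbour search to one clustered case class is what lets minority (failed-enterprise) cases dominate retrieval when the target falls in a minority-rich region, instead of being swamped by the majority class.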
ERIC Educational Resources Information Center
Lottero-Perdue, Pamela S.; Parry, Elizabeth A.
2017-01-01
This mixed-methods study examines how teachers who have taught one or two units of the Engineering is Elementary (EiE) curriculum for two years reported on: students' responses to design failure; the ways in which they, the teachers, supported these students and used fail words (e.g. fail, failure); and the teachers' broad perspectives and…
Extended Ponseti method for failed tenotomy in idiopathic clubfeet: a pilot study.
Agarwal, Anil; Agrawal, Nargesh; Barik, Sitanshu; Gupta, Neeraj
2018-01-29
We evaluated the outcome of a new protocol of an extended Ponseti method in the management of idiopathic club foot with residual equinus following failed Achilles tenotomy. We also compared the failed with a successful tenotomy group to analyze the parameters for failure. The Ponseti technique-treated idiopathic club foot patients with failed percutaneous Achilles tenotomy (failure to achieve <15° dorsiflexion) were treated by continued stretching casts, with a weekly change for a further 3 weeks. Final dorsiflexion of more than 15°, if achieved with the above protocol, was recorded as a success. Twenty-six (16%) patients with failed Achilles tenotomy and residual equinus out of a total of 161 patients with primary idiopathic club foot were tested with the protocol. Ten (38.5%) failed patients had bilateral foot involvement and 16 (61.5%) had unilateral foot involvement. A total of seven (26.9%) patients achieved the end point dorsiflexion of more than 15° in one further cast, 10 (38.5%) in two casts, and four (15.4%) in three casts, respectively. Overall success of the extended Ponseti protocol was achieved in 21/26 (80.8%) patients. The patient's age, precasting initial Pirani score, number of Ponseti casts, pretenotomy Pirani score, and pretenotomy ankle joint dorsiflexion were statistically different in the failed compared with the successful tenotomy group. The tested extended Ponseti protocol showed a success rate of 80.8% in salvaging failed tenotomy cases. The failed tenotomy group was relatively older at presentation, had higher precasting and pretenotomy Pirani scores, received a greater number of Ponseti casts, and showed less pretenotomy ankle joint dorsiflexion compared with successful feet.
Schuh, L A.; London, Z; Neel, R; Brock, C; Kissela, B M.; Schultz, L; Gelb, D J.
2009-01-01
Objective: The American Board of Psychiatry and Neurology (ABPN) has recently replaced the traditional, centralized oral examination with the locally administered Neurology Clinical Skills Examination (NEX). The ABPN postulated the experience with the NEX would be similar to the Mini-Clinical Evaluation Exercise, a reliable and valid assessment tool. The reliability and validity of the NEX has not been established. Methods: NEX encounters were videotaped at 4 neurology programs. Local faculty and ABPN examiners graded the encounters using 2 different evaluation forms: an ABPN form and one with a contracted rating scale. Some NEX encounters were purposely failed by residents. Cohen’s kappa and intraclass correlation coefficients (ICC) were calculated for local vs ABPN examiners. Results: Ninety-eight videotaped NEX encounters of 32 residents were evaluated by 20 local faculty evaluators and 18 ABPN examiners. The interrater reliability for a determination of pass vs fail for each encounter was poor (kappa 0.32; 95% confidence interval [CI] = 0.11, 0.53). ICC between local faculty and ABPN examiners for each performance rating on the ABPN NEX form was poor to moderate (ICC range 0.14-0.44), and did not improve with the contracted rating form (ICC range 0.09-0.36). ABPN examiners were more likely than local examiners to fail residents. Conclusions: There is poor interrater reliability between local faculty and American Board of Psychiatry and Neurology examiners. A bias was detected for favorable assessment locally, which is concerning for the validity of the examination. Further study is needed to assess whether training can improve interrater reliability and offset bias. 
GLOSSARY ABIM = American Board of Internal Medicine; ABPN = American Board of Psychiatry and Neurology; CI = confidence interval; HFH = Henry Ford Hospital; ICC = intraclass correlation coefficients; IM = internal medicine; mini-CEX = Mini-Clinical Evaluation Exercise; NEX = Neurology Clinical Skills Examination; RITE = residency inservice training examination; UC = University of Cincinnati; UM = University of Michigan; USF = University of South Florida. PMID:19605769
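Cohen's kappa, the interrater statistic reported above, corrects raw agreement for the agreement expected by chance: kappa = (p_o - p_e) / (1 - p_e). A minimal sketch for two raters scoring the same encounters:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same encounters (sketch).
    Ratings are category labels, e.g. 'pass'/'fail'."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    cats = set(rater_a) | set(rater_b)
    # Observed proportion of agreement.
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal rates per category.
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)
```

A kappa around 0.32, as reported for the pass/fail decisions, sits well below the conventional threshold for substantial agreement even when raw percent agreement looks respectable, which is the point of chance correction.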
Alpha Control and Its Mediating Effects on Pain and Anxiety
1976-03-01
their biological functions (hunger, thirst, dizziness, nausea, and their like). For Weber, pressure, warmth, and cold are true sensations because they...have their proper stimuli. Pain, on the other hand, seemed to have no proper stimulus but to represent a bodily need, like hunger or nausea. In 1840...process. The traditional view of the pain mechanism failed to account for the fact that pain represented the result of at least two neuropsychological
Novel Image-Guided Management of a Uterine Arteriovenous Malformation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Przybojewski, Stefan J., E-mail: drstefanp@hotmail.com; Sadler, David J.
The investigators present a novel image-guided embolization, not previously described, of a uterine arteriovenous malformation (AVM) resistant to endovascular management. The uterus was exposed surgically, and Histoacryl (Braun, Fulda, Germany) was injected directly into the nidus using ultrasound guidance and fluoroscopy. The patient had a successful full-term pregnancy after this procedure. This technique may be a useful alternative management strategy in patients with uterine AVM who fail traditional endovascular embolization and who still desire fertility.
[Microbiological rationale for using whey on salting salmon caviar].
Kim, I N; Shtan'ko, T I
2011-01-01
The paper provides a rationale for the use of whey to salt salmon fishes instead of traditional preservatives, including those exported from countries with low industrial potential, which do not undergo comprehensive sanitary and hygienic tests. On the basis of the performed studies, the authors recommend using whey to salt salmon caviar, which ensures the ecological purity of the product containing the minimum amount of preservatives and other substances that fail to affect its organoleptic properties.
NASA Astrophysics Data System (ADS)
Chu, Xiaoyu; Zhang, Jingrui; Lu, Shan; Zhang, Yao; Sun, Yue
2016-11-01
This paper presents a trajectory planning algorithm to optimise the collision avoidance of a chasing spacecraft operating in an ultra-close proximity to a failed satellite. The complex configuration and the tumbling motion of the failed satellite are considered. The two-spacecraft rendezvous dynamics are formulated based on the target body frame, and the collision avoidance constraints are detailed, particularly concerning the uncertainties. An optimisation solution of the approaching problem is generated using the Gauss pseudospectral method. A closed-loop control is used to track the optimised trajectory. Numerical results are provided to demonstrate the effectiveness of the proposed algorithms.
A Transperineal Approach to Hysterectomy of a Retained Didelphic Uterine Horn.
Mullen, Mary M; Kuroki, Lindsay M; Hunt, Steven R; Ratkowski, Kristy L; Mutch, David G
2017-09-01
Gynecologic surgeries are performed through abdominal, vaginal, laparoscopic, or robot-assisted laparoscopic routes. However, if the pelvis is not accessible by one of these routes, there are no published reports to guide pelvic surgeons. A 34-year-old conjoined twin, status post separation, with uterine didelphys and absence of her left colon and sacrum underwent hemihysterectomy, at which time her müllerian anomaly was unknown. She re-presented with vaginal bleeding and pain eventually attributed to a retained uterine horn. Conservative management failed. Given dense adhesions, traditional approaches to hysterectomy were not successful. She underwent a transperineal hemisupracervical hysterectomy. We propose a novel approach to the pelvis to guide surgeons when traditional approaches are not feasible. We also describe an instance of a retained uterine didelphys horn.
In with the new, out with the old? Auto-extraction for remote sensing archaeology
NASA Astrophysics Data System (ADS)
Cowley, David C.
2012-09-01
This paper explores aspects of the inter-relationships between traditional archaeological interpretation of remote sensed data (principally visual examination of aerial photographs/satellite) and those drawing on automated feature extraction and processing. Established approaches to archaeological interpretation of aerial photographs are heavily reliant on individual observation (eye/brain) in an experience and knowledge-based process. Increasingly, however, much more complex and extensive datasets are becoming available to archaeology and these require critical reflection on analytical and interpretative processes. Archaeological applications of Airborne Laser Scanning (ALS) are becoming increasingly routine, and as the spatial resolution of hyper-spectral data improves, its potentially massive implications for archaeological site detection may prove to be a sea-change. These complex datasets demand new approaches, as traditional methods based on direct observation by an archaeological interpreter will never do more than scratch the surface, and will fail to fully extend the boundaries of knowledge. Inevitably, changing analytical and interpretative processes can create tensions, especially, as has been the case in archaeology, when the innovations in data and analysis come from outside the discipline. These tensions often centre on the character of the information produced, and a lack of clarity on the place of archaeological interpretation in the workflow. This is especially true for ALS data and autoextraction techniques, and carries implications for all forms of remote sensed archaeological datasets, including hyperspectral data and aerial photographs.
Evaluation of Double-Vacuum-Bag Process For Composite Fabrication
NASA Technical Reports Server (NTRS)
Hou, T. H.; Jensen, B. J.
2004-01-01
A non-autoclave vacuum bag process using atmospheric pressure alone that eliminates the need for external pressure normally supplied by an autoclave or a press is an attractive method for composite fabrication. This type of process does not require large capital expenditures for tooling and processing equipment. In the molding cycle (temperature/pressure profile) for a given composite system, the vacuum application point has to be carefully selected to achieve the final consolidated laminate net shape and resin content without excessive resin squeeze-out. The traditional single-vacuum-bag (SVB) process is best suited for molding epoxy matrix based composites because of their superior flow and the absence of reaction by-products or other volatiles. Other classes of materials, such as polyimides and phenolics, generate water during cure. In addition, these materials are commonly synthesized as oligomers using solvents to facilitate processability. Volatiles (solvents and reaction byproducts) management therefore becomes a critical issue. SVB molding, without additional pressure, normally fails to yield void-free quality composites for these classes of resin systems. A double-vacuum-bag (DVB) process for volatile management was envisioned, designed and built at the NASA Langley Research Center. This experimental DVB process affords superior volatiles management compared to the traditional SVB process. Void-free composites are consistently fabricated as measured by C-scan and optical photomicroscopy for high performance polyimide and phenolic resins.
Arnulf, Jan Ketil; Larsen, Kai Rune; Martinsen, Øyvind Lund; Egeland, Thore
2018-01-12
The traditional understanding of data from Likert scales is that the quantifications involved result from measures of attitude strength. Applying a recently proposed semantic theory of survey response, we claim that survey responses tap two different sources: a mixture of attitudes plus the semantic structure of the survey. Exploring the degree to which individual responses are influenced by semantics, we hypothesized that in many cases, information about attitude strength is actually filtered out as noise in the commonly used correlation matrix. We developed a procedure to separate the semantic influence from attitude strength in individual response patterns, and compared these results to, respectively, the observed sample correlation matrices and the semantic similarity structures arising from text analysis algorithms. This was done with four datasets, comprising a total of 7,787 subjects and 27,461,502 observed item pair responses. As we argued, attitude strength seemed to account for much information about the individual respondents. However, this information did not seem to carry over into the observed sample correlation matrices, which instead converged around the semantic structures offered by the survey items. This is potentially disturbing for the traditional understanding of what survey data represent. We argue that this approach contributes to a better understanding of the cognitive processes involved in survey responses. In turn, this could help us make better use of the data that such methods provide.
Health Insurance Coverage and Take-Up: Lessons from Behavioral Economics
Baicker, Katherine; Congdon, William J; Mullainathan, Sendhil
2012-01-01
Context Millions of uninsured Americans ostensibly have insurance available to them—many at very low cost—but do not take it up. Traditional economic analysis is based on the premise that these are rational decisions, but it is hard to reconcile observed enrollment patterns with this view. The policy prescriptions that the traditional model generates may thus fail to achieve their goals. Behavioral economics, which integrates insights from psychology into economic analysis, identifies important deviations from the traditional assumptions of rationality and can thus improve our understanding of what drives health insurance take-up and improved policy design. Methods Rather than a systematic review of the coverage literature, this article is a primer for considering issues in health insurance coverage from a behavioral economics perspective, supplementing the standard model. We present relevant evidence on decision making and insurance take-up and use it to develop a behavioral approach to both the policy problem posed by the lack of health insurance coverage and possible policy solutions to that problem. Findings We found that evidence from behavioral economics can shed light on both the sources of low take-up and the efficacy of different policy levers intended to expand coverage. We then applied these insights to policy design questions for public and private insurance coverage and to the implementation of the recently enacted health reform, focusing on the use of behavioral insights to maximize the value of spending on coverage. Conclusions We concluded that the success of health insurance coverage reform depends crucially on understanding the behavioral barriers to take-up. The take-up process is likely governed by psychology as much as economics, and public resources can likely be used much more effectively with behaviorally informed policy design. PMID:22428694
Bourguignon, Thomas; Šobotník, Jan; Hanus, Robert; Krasulová, Jana; Vrkoslav, Vladimír; Cvačka, Josef; Roisin, Yves
2013-12-01
Species boundaries are traditionally inferred using morphological characters, although morphology sometimes fails to correctly delineate species. To overcome this limitation, researchers have widely taken advantage of alternative methods such as DNA barcoding or analysis of cuticular hydrocarbon (CH) profiles, but rarely use them simultaneously in an iterative taxonomic approach. Here, we follow such an approach using morphology, DNA barcoding and CH profiles to precisely discriminate species of soldierless termites, a diversified clade constituting about one-third of the Neotropical termite species richness, but poorly resolved taxonomically due to the paucity of useful characters. We sampled soldierless termites in various forest types of the Nouragues Nature Reserve, French Guiana. Our results show that morphological species determination generally matches DNA barcoding, which only suggests the existence of three cryptic species among the 31 morphological species. Among them, Longustitermes manni is the only species whose splitting is corroborated by ecological data, other widely distributed species being supported by DNA barcoding. By contrast, although CH profiles provide a certain taxonomic signal, they often suggest inconsistent groupings which are not supported by other methods. Overall, our data support DNA barcoding and morphology as two efficient methods to distinguish soldierless termite species. Copyright © 2013 Elsevier Inc. All rights reserved.
Managing Drawbacks in Unconventional Successful Glaucoma Surgery: A Case Report of Stent Exposure
Fea, Antonio; Cannizzo, Paola Maria Loredana; Consolandi, Giulia; Lavia, Carlo Alessandro; Pignata, Giulia; Grignolo, Federico M.
2015-01-01
Traditional options in managing failed trabeculectomy (bleb needling, revision, additional incisional surgery and tube surgery) have a relatively high failure and complication rate. The use of microinvasive glaucoma surgery (MIGS) has generally been reserved for mild to moderate glaucoma cases, demonstrating good safety profiles but significant limitations in terms of efficacy. We describe a patient who underwent MIGS (XEN Aquesys subconjunctival shunt implantation) after a prior failed trabeculectomy. After the surgery, the IOP was well controlled, but as the stent was close to an area of scarred conjunctiva from the previous trabeculectomy, it became partially exposed. As a complete success had been achieved, we decided to remove the conjunctiva over the exposed area and replace it with an amniotic membrane transplant and a conjunctival autograft. Six months after surgery, the unmedicated IOP is still well controlled with complete visual acuity recovery. PMID:26294994
Zuckerman, Jack M; Shekarriz, Bijan; Upadhyay, Jyoti
2013-02-01
Continuous urinary incontinence in female patients can be a diagnostic dilemma if traditional imaging fails to identify a source. Vaginography has been used to diagnose vaginal ectopic ureters in the past with mixed results. Institutional review board approval was obtained for a retrospective review. Five teenage females with continuous incontinence and prior negative imaging work ups underwent high pressure vaginography. Their findings and treatment outcomes are reviewed. A vaginal ectopic ureter was diagnosed in each of the five patients at a mean age 15.8 years. Each had undergone prior magnetic resonance urography that was non-diagnostic. Four of the five were managed surgically with resolution of their incontinence. One was lost to follow up. High pressure vaginogram should be considered during the work up of female patients with continuous urinary incontinence, especially when other imaging modalities fail to identify an etiology.
Intrinsic two-dimensional features as textons
NASA Technical Reports Server (NTRS)
Barth, E.; Zetzsche, C.; Rentschler, I.
1998-01-01
We suggest that intrinsic two-dimensional (i2D) features, computationally defined as the outputs of nonlinear operators that model the activity of end-stopped neurons, play a role in preattentive texture discrimination. We first show that for discriminable textures with identical power spectra the predictions of traditional models depend on the type of nonlinearity and fail for energy measures. We then argue that the concept of intrinsic dimensionality, and the existence of end-stopped neurons, can help us to understand the role of the nonlinearities. Furthermore, we show examples in which models without strong i2D selectivity fail to predict the correct ranking order of perceptual segregation. Our arguments regarding the importance of i2D features resemble the arguments of Julesz and co-workers regarding textons such as terminators and crossings. However, we provide a computational framework that identifies textons with the outputs of nonlinear operators that are selective to i2D features.
The reliability of the pass/fail decision for assessments comprised of multiple components
Möltner, Andreas; Tımbıl, Sevgi; Jünger, Jana
2015-01-01
Objective: The decision having the most serious consequences for a student taking an assessment is the one to pass or fail that student. For this reason, the reliability of the pass/fail decision must be determined for high quality assessments, just as the measurement reliability of the point values. Assessments in a particular subject (graded course credit) are often composed of multiple components that must be passed independently of each other. When “conjunctively” combining separate pass/fail decisions, as with other complex decision rules for passing, adequate methods of analysis are necessary for estimating the accuracy and consistency of these classifications. To date, very few papers have addressed this issue; a generally applicable procedure was published by Douglas and Mislevy in 2010. Using the example of an assessment comprised of several parts that must be passed separately, this study analyzes the reliability underlying the decision to pass or fail students and discusses the impact of an improved method for identifying those who do not fulfill the minimum requirements. Method: The accuracy and consistency of the decision to pass or fail an examinee in the subject cluster Internal Medicine/General Medicine/Clinical Chemistry at the University of Heidelberg’s Faculty of Medicine was investigated. This cluster requires students to separately pass three components (two written exams and an OSCE), whereby students may reattempt to pass each component twice. Our analysis was carried out using the method described by Douglas and Mislevy. Results: Frequently, when complex logical connections exist between the individual pass/fail decisions in the case of low failure rates, only a very low reliability for the overall decision to grant graded course credit can be achieved, even if high reliabilities exist for the various components. 
For the example analyzed here, the classification accuracy and consistency when conjunctively combining the three individual parts is relatively low with κ=0.49 or κ=0.47, despite the good reliability of over 0.75 for each of the three components. The option to repeat each component twice leads to a situation in which only about half of the candidates who do not satisfy the minimum requirements would fail the overall assessment, while the other half is able to continue their studies despite having deficient knowledge and skills. Conclusion: The method put forth by Douglas and Mislevy allows the analysis of the decision accuracy and consistency for complex combinations of scores from different components. Even in the case of highly reliable components, it is not necessarily so that a reliable pass/fail decision has been reached – for instance in the case of low failure rates. Assessments must be administered with the explicit goal of identifying examinees that do not fulfill the minimum requirements. PMID:26483855
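The phenomenon the abstract describes, components that are individually reliable yielding a much less reliable conjunctive pass/fail decision, can be illustrated with a small Monte Carlo sketch. This is not the Douglas and Mislevy procedure itself; the normal ability model, the cut score, and the function name are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conjunctive_kappa(n=100_000, reliability=0.75, cut=-1.0, n_components=3):
    """Classification consistency (Cohen's kappa) of a conjunctive
    pass/fail decision over several fallible components.
    True ability is standard normal; each observed component score mixes
    true ability with noise so that its reliability equals `reliability`."""
    theta = rng.normal(size=n)               # true ability, one value per examinee
    lam = np.sqrt(reliability)               # loading implied by the reliability
    def decide():
        noise = rng.normal(size=(n_components, n))
        scores = lam * theta + np.sqrt(1 - reliability) * noise
        return np.all(scores > cut, axis=0)  # must pass every component
    d1, d2 = decide(), decide()              # two parallel administrations
    p_agree = np.mean(d1 == d2)              # observed decision agreement
    p1, p2 = d1.mean(), d2.mean()
    p_chance = p1 * p2 + (1 - p1) * (1 - p2) # agreement expected by chance
    return (p_agree - p_chance) / (1 - p_chance)
```

With three components of reliability 0.75 and an arbitrary cut score, the conjunctive decision's kappa comes out well below the component reliabilities, mirroring the gap the study reports between component reliability above 0.75 and decision consistency near 0.5.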
Yu, Bin-Sheng; Yang, Zhan-Kun; Li, Ze-Min; Zeng, Li-Wen; Wang, Li-Bing; Lu, William Weijia
2011-08-01
An in vitro biomechanical cadaver study. To evaluate the pull-out strength after 5000 cycles of loading among 4 revision techniques for the loosened iliac screw using corticocancellous bone, a longer screw, traditional cement augmentation, and boring cement augmentation. Iliac screw loosening is still a clinical problem for lumbo-iliac fusion. Although many revision techniques using corticocancellous bone, a larger screw, and polymethylmethacrylate (PMMA) augmentation have been applied in repairing pedicle screw loosening, their biomechanical effects on the loosened iliac screw remain undetermined. Eight fresh human cadaver pelvises with bone mineral density values ranging from 0.83 to 0.97 g/cm² were used in this study. After testing the primary screw of 7.5 mm diameter and 70 mm length, 4 revision techniques were sequentially established and tested on the same pelvis as follows: corticocancellous bone, a longer screw of 100 mm length, traditional PMMA augmentation, and boring PMMA augmentation. The boring technique differs from traditional PMMA augmentation in that PMMA is injected into the screw tract through 3 boring holes in the outer cortical shell without removing the screw. On an MTS machine, after 5000 cycles of compressive loading from -200 to -500 N applied to the screw head, the axial maximum pull-out strengths of the 5 screws were measured and analyzed. The pull-out strengths of the primary screw and the 4 revised screws with corticocancellous bone, longer screw, and traditional and boring PMMA augmentation were 1167 N, 361 N, 854 N, 1954 N, and 1820 N, respectively. Although the longer-screw method obtained significantly higher pull-out strength than corticocancellous bone (P<0.05), the revised screws using these 2 techniques exhibited notably lower pull-out strength than the primary screw and the 2 PMMA-augmented screws (P<0.05). 
Both the traditional and the boring PMMA-augmented screws showed significantly higher pull-out strength than the primary screw (P<0.05); however, no significant difference in pull-out strength was detected between the 2 PMMA screws (P>0.05). Wadding corticocancellous bone and increasing screw length failed to provide sufficient anchoring strength for a loosened iliac screw; however, both the traditional and the boring PMMA-augmented techniques effectively increased the fixation strength. From the standpoint of minimal invasiveness, the boring PMMA augmentation may serve as a suitable salvage technique for iliac screw loosening.
Rossetti, Gianluca; del Genio, Gianmattia; Maffettone, Vincenzo; Fei, Landino; Brusciano, Luigi; Limongelli, Paolo; Pizza, Francesco; Tolone, Salvatore; Di Martino, Maria; del Genio, Federica; del Genio, Alberto
2009-01-01
Laparoscopic Heller myotomy with an antireflux procedure seems to be the procedure of choice in the treatment of patients with esophageal achalasia. Persistent or recurrent symptoms occur in 10% to 20% of patients. Few reports on reoperation after failed Heller myotomy have been published. No author has reported the realization of a total fundoplication in these patient groups. The aim of this study is to evaluate the efficacy of laparoscopic reoperation with the realization of a total fundoplication after failed Heller myotomy for esophageal achalasia. From 1992 to December 2007, 5 out of a series of 242 patients (2.1%), along with 2 patients operated on elsewhere, underwent laparoscopic reintervention for failed Heller myotomy. Symptoms leading to reoperation included persistent dysphagia in 3 patients, recurrent dysphagia in another 3, and heartburn in 1 patient. Mean time from the first to the second operation was 49.7 months (range, 4-180 months). In all cases, the intervention was completed laparoscopically, and a Nissen-Rossetti fundoplication was performed or left in place after a complete Heller myotomy. Mean operative time was 160 minutes (range, 60-245 minutes). Mean postoperative hospital stay was 3.1 +/- 1.5 days. No major morbidity or mortality occurred. At a mean follow-up of 16.1 months, reoperation was considered successful in 5 out of 7 patients (71.4%). The dysphagia DeMeester score fell from 2.71 +/- 0.22 to 0.91 +/- 0.38 postoperatively. The regurgitation score changed from 2.45 +/- 0.34 to 0.68 +/- 0.23. Laparoscopic reoperation for failed Heller myotomy with the realization of a total fundoplication is safe and is associated with good long-term results if performed by an experienced surgeon in a center with a long tradition of esophageal surgery.
Revision open Bankart surgery after arthroscopic repair for traumatic anterior shoulder instability.
Cho, Nam Su; Yi, Jin Woong; Lee, Bong Gun; Rhee, Yong Girl
2009-11-01
Only a few studies have provided homogeneous analysis of open revision surgery after a failed arthroscopic Bankart procedure. Open Bankart revision surgery will be effective in a failed arthroscopic anterior stabilization but inevitably results in a loss of range of motion, especially external rotation. Case series; Level of evidence, 4. Twenty-six shoulders that went through traditional open Bankart repair as revision surgery after a failed arthroscopic Bankart procedure for traumatic anterior shoulder instability were enrolled for this study. The mean patient age at the time of revision surgery was 24 years (range, 16-38 years), and the mean duration of follow-up was 42 months (range, 25-97 months). The preoperative mean range of motion was 173 degrees in forward flexion and 65 degrees in external rotation at the side. After revision surgery, the ranges measured 164 degrees and 55 degrees, respectively (P = .024 and .012, respectively). At the last follow-up, the mean Rowe score was 81 points, with 88.5% of the patients reporting good or excellent results. After revision surgery, redislocation developed in 3 shoulders (11.5%), all of which had an engaging Hill-Sachs lesion and associated hyperlaxity (2+ or greater laxity on the sulcus sign). Open revision Bankart surgery for a failed arthroscopic Bankart repair can provide a satisfactory outcome, including a low recurrence rate and reliable functional return. In open revision Bankart surgery after failed stabilization for traumatic anterior shoulder instability, the surgeon should keep in mind the possibility of a postoperative loss of range of motion and a thorough examination for not only a Bankart lesion but also other associated lesions, including a bone defect or hyperlaxity, to lower the risk of redislocation.
NASA Astrophysics Data System (ADS)
Chattopadhyay, Sudip; Pahari, Dola; Mukherjee, Debashis; Mahapatra, Uttam Sinha
2004-04-01
The traditional multireference (MR) coupled-cluster (CC) methods based on the effective Hamiltonian are often beset by the problem of intruder states, and are not suitable for studying potential energy surfaces (PES) involving real or avoided curve crossing. State-specific MR-based approaches obviate this limitation. The state-specific MRCC (SS-MRCC) method developed some years ago [Mahapatra et al., J. Chem. Phys. 110, 6171 (1999)] can handle quasidegeneracy of varying degrees over a wide range of the PES, including regions of real or avoided curve crossing. Motivated by its success, we have suggested and explored in this paper a suite of physically motivated coupled electron-pair approximation (SS-MRCEPA) methods, which are designed to capture the essential strength of the parent SS-MRCC method without significantly sacrificing its accuracy. These SS-MRCEPA theories, like their CC counterparts, are based on a complete active space, treat all the reference functions on the same footing and provide a description of potentially uniform precision for PES of states with varying MR character. The combining coefficients of the reference functions are self-consistently determined along with the cluster amplitudes themselves. The newly developed SS-MRCEPA methods are size-extensive, and are also size-consistent with localized orbitals. Among the various versions, there are two which are invariant with respect to restricted rotations among the doubly occupied and active orbitals separately. The similar performance of these invariant and noninvariant versions at the crossing points of the degenerate orbitals implies that all the methods presented are rather robust with respect to rotations among degenerate orbitals. 
Illustrative numerical applications are presented for PES of the ground state of a number of difficult test cases such as the model H4, H8 problems, the insertion of Be into H2, and Li2, where intruders exist and for a state of a molecule such as CH2, with pronounced MR character. Results obtained with SS-MRCEPA methods are found to be comparable in accuracy to the parent SS-MRCC and FCI/large scale CI results throughout the PES, which indicates the efficacy of our SS-MRCEPA methods over a wide range of geometries, despite their neglect of a host of complicated nonlinear terms, even when the traditional MR-based methods based on effective Hamiltonians fail due to intruders.
ERIC Educational Resources Information Center
Chiado, Wendy S.
2012-01-01
Too many of our nation's youth have failed to complete high school. Determining why so many of our nation's students fail to graduate is a complex, multi-faceted problem and beyond the scope of any one study. The study presented herein utilized a thirteen-step mixed methods model developed by Leech and Onwuegbuzie (2007) to demonstrate within a…
Jiang, Zhehan; Skorupski, William
2017-12-12
In many behavioral research areas, multivariate generalizability theory (mG theory) has been typically used to investigate the reliability of certain multidimensional assessments. However, traditional mG-theory estimation, namely via frequentist approaches, has limits, leading researchers to fail to take full advantage of the information that mG theory can offer regarding the reliability of measurements. Alternatively, Bayesian methods provide more information than frequentist approaches can offer. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples that are closely related to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.
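For the single-facet crossed design mentioned above, the frequentist baseline that the tutorial contrasts with a Bayesian treatment can be sketched via classical expected-mean-square algebra. This is a minimal illustration, not the article's BUGS models; the function name and the persons-by-items setup are assumptions.

```python
import numpy as np

def g_study_pxi(X):
    """Single-facet crossed p x i G-study: ANOVA-based variance-component
    estimates and the generalizability coefficient E(rho^2).
    X is a persons-by-items score matrix."""
    n_p, n_i = X.shape
    grand = X.mean()
    # Mean squares for persons, items, and the residual (p x i, e) interaction
    ms_p = n_i * np.sum((X.mean(axis=1) - grand) ** 2) / (n_p - 1)
    ms_i = n_p * np.sum((X.mean(axis=0) - grand) ** 2) / (n_i - 1)
    resid = X - X.mean(axis=1, keepdims=True) - X.mean(axis=0, keepdims=True) + grand
    ms_pi = np.sum(resid ** 2) / ((n_p - 1) * (n_i - 1))
    # Solve the expected-mean-square equations (negative estimates set to 0)
    var_pi = ms_pi
    var_p = max((ms_p - ms_pi) / n_i, 0.0)
    var_i = max((ms_i - ms_pi) / n_p, 0.0)
    e_rho2 = var_p / (var_p + var_pi / n_i)  # generalizability coefficient
    return var_p, var_i, var_pi, e_rho2
```

A Bayesian version, as in the article, would instead place priors on the variance components and return full posterior distributions rather than these point estimates.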
Differential dynamic microscopy of bidisperse colloidal suspensions.
Safari, Mohammad S; Poling-Skutvik, Ryan; Vekilov, Peter G; Conrad, Jacinta C
2017-01-01
Research tasks in microgravity include monitoring the dynamics of constituents of varying size and mobility in processes such as aggregation, phase separation, or self-assembly. We use differential dynamic microscopy, a method readily implemented with equipment available on the International Space Station, to simultaneously resolve the dynamics of particles of radius 50 nm and 1 μm in bidisperse aqueous suspensions. Whereas traditional dynamic light scattering fails to detect a signal from the larger particles at low concentrations, differential dynamic microscopy exhibits enhanced sensitivity in these conditions by accessing smaller wavevectors where scattering from the large particles is stronger. Interference patterns due to scattering from the large particles induce non-monotonic decay of the amplitude of the dynamic correlation function with the wavevector. We show that the position of the resulting minimum contains information on the vertical position of the particles. Together with the simple instrumental requirements, the enhanced sensitivity of differential dynamic microscopy makes it an appealing alternative to dynamic light scattering to characterize samples with complex dynamics.
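The core DDM computation, Fourier analysis of frame differences followed by azimuthal averaging over wavevector magnitude, can be sketched as follows. This is a minimal illustration assuming a square image stack; binning and normalization conventions vary across implementations, and the function name is an assumption.

```python
import numpy as np

def ddm_structure_function(frames, lags):
    """Image structure function D(q, tau) = <|FFT(I(t+tau) - I(t))|^2>,
    azimuthally averaged over wavevector magnitude q.
    `frames` is a (T, N, N) image stack; `lags` is a list of frame lags."""
    T, N, _ = frames.shape
    qx = np.fft.fftfreq(N)
    qmag = np.sqrt(qx[None, :] ** 2 + qx[:, None] ** 2)  # |q| on the FFT grid
    nbins = N // 2
    bins = np.linspace(0.0, qmag.max(), nbins + 1)
    which = np.digitize(qmag.ravel(), bins) - 1           # radial bin of each pixel
    D = np.zeros((len(lags), nbins))
    for k, tau in enumerate(lags):
        diffs = frames[tau:] - frames[:-tau]              # all frame pairs at this lag
        power = np.abs(np.fft.fft2(diffs)) ** 2           # per-pair power spectra
        mean_power = power.mean(axis=0).ravel()           # average over pairs
        radial = np.bincount(which, weights=mean_power, minlength=nbins)
        counts = np.bincount(which, minlength=nbins)
        D[k] = radial[:nbins] / np.maximum(counts[:nbins], 1)
    return D
```

Fitting D(q, tau) versus lag at each q then yields the intermediate scattering function and hence the dynamics of each particle population, which is how the method separates the 50 nm and 1 μm species.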
Improving estimates of air pollution exposure through ubiquitous sensing technologies
de Nazelle, Audrey; Seto, Edmund; Donaire-Gonzalez, David; Mendez, Michelle; Matamala, Jaume; Nieuwenhuijsen, Mark J; Jerrett, Michael
2013-01-01
Traditional methods of exposure assessment in epidemiological studies often fail to integrate important information on activity patterns, which may lead to bias, loss of statistical power, or both in health effects estimates. Novel sensing technologies integrated with mobile phones offer the potential to reduce exposure measurement error. We sought to demonstrate the usability and relevance of the CalFit smartphone technology to track person-level time, geographic location, and physical activity patterns for improved air pollution exposure assessment. We deployed CalFit-equipped smartphones in a free-living population of 36 subjects in Barcelona, Spain. Information obtained on physical activity and geographic location was linked to space-time air pollution mapping. For instance, we found that travel activities accounted, on average, for 6% of people’s time and 24% of their daily inhaled NO2. Due to the large number of mobile phone users, this technology potentially provides an unobtrusive means of collecting epidemiologic exposure data at low cost. PMID:23416743
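The disproportion between time spent and dose received, e.g. a commute occupying a small share of the day but contributing a much larger share of inhaled NO2, follows directly from time-weighted dose arithmetic. The sketch below illustrates the calculation; all concentrations and ventilation rates are invented for illustration, not taken from the study.

```python
def inhaled_dose(episodes):
    """Time-activity-weighted inhaled dose (micrograms): each episode is
    (hours, concentration in ug/m3, ventilation rate in m3/h), as might be
    derived from a CalFit-style activity and location log."""
    return sum(hours * conc * vent for hours, conc, vent in episodes)

# A hypothetical day: illustrative numbers only
day = [
    (8.0, 20.0, 0.5),   # sleep: low NO2, low ventilation
    (1.5, 60.0, 1.5),   # commute: high NO2, elevated breathing rate
    (14.5, 25.0, 0.6),  # remaining indoor/outdoor time
]
total = inhaled_dose(day)
commute_share = inhaled_dose([day[1]]) / total  # dose share of the commute
```

Here the commute is 1.5 of 24 hours (about 6% of the day) yet, because both concentration and breathing rate are elevated, it contributes roughly 30% of the daily inhaled dose, the same kind of asymmetry the study reports.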
An improved robust buffer allocation method for the project scheduling problem
NASA Astrophysics Data System (ADS)
Ghoddousi, Parviz; Ansari, Ramin; Makui, Ahmad
2017-04-01
Unpredictable uncertainties cause delays and additional costs for projects. With traditional approaches, optimization of the baseline project plan often fails, leading to delays. In this study, a two-stage multi-objective buffer allocation approach is applied for robust project scheduling. In the first stage, decisions are made on buffer sizes and their allocation to the project activities. A set of Pareto-optimal robust schedules is designed using the meta-heuristic non-dominated sorting genetic algorithm (NSGA-II) based on the decisions made in the buffer allocation step. In the second stage, the Pareto solutions are evaluated in terms of the deviation from the initial start times and due dates. The proposed approach was implemented on a real dam construction project. The outcomes indicated that the buffered schedule reduces the cost of disruptions by 17.7% compared with the baseline plan, with an increase of only about 0.3% in the project completion time.
“Heroes” and “Villains” of World History across Cultures
Hanke, Katja; Liu, James H.; Sibley, Chris G.; Paez, Dario; Gaines, Stanley O.; Moloney, Gail; Leong, Chan-Hoong; Wagner, Wolfgang; Licata, Laurent; Klein, Olivier; Garber, Ilya; Böhm, Gisela; Hilton, Denis J.; Valchev, Velichko; Khan, Sammyh S.; Cabecinhas, Rosa
2015-01-01
Emergent properties of global political culture were examined using data from the World History Survey (WHS) involving 6,902 university students in 37 countries evaluating 40 figures from world history. Multidimensional scaling and factor analysis techniques found only limited forms of universality in evaluations across Western, Catholic/Orthodox, Muslim, and Asian country clusters. The highest consensus across cultures involved scientific innovators, with Einstein having the most positive evaluation overall. Peaceful humanitarians like Mother Theresa and Gandhi followed. There was much less cross-cultural consistency in the evaluation of negative figures, led by Hitler, Osama bin Laden, and Saddam Hussein. After more traditional empirical methods (e.g., factor analysis) failed to identify meaningful cross-cultural patterns, Latent Profile Analysis (LPA) was used to identify four global representational profiles: Secular and Religious Idealists were overwhelmingly prevalent in Christian countries, and Political Realists were common in Muslim and Asian countries. We discuss possible consequences and interpretations of these different representational profiles. PMID:25651504
Photorefraction Screens Millions for Vision Disorders
NASA Technical Reports Server (NTRS)
2008-01-01
Who would have thought that stargazing in the 1980s would lead to hundreds of thousands of schoolchildren seeing more clearly today? Collaborating with research ophthalmologists and optometrists, Marshall Space Flight Center scientists Joe Kerr and the late John Richardson adapted optics technology for eye screening methods using a process called photorefraction. Photorefraction consists of delivering a light beam into the eyes where it bends in the ocular media, hits the retina, and then reflects as an image back to a camera. A series of refinements and formal clinical studies followed their highly successful initial tests in the 1980s. Evaluating over 5,000 subjects in field tests, Kerr and Richardson used a camera system prototype with a specifically angled telephoto lens and flash to photograph a subject's eye. They then analyzed the image, the cornea and pupil in particular, for irregular reflective patterns. Early tests of the system with 1,657 Alabama children revealed that, while only 111 failed the traditional chart test, Kerr and Richardson's screening system found 507 abnormalities.
A review on sludge dewatering indices.
To, Vu Hien Phuong; Nguyen, Tien Vinh; Vigneswaran, Saravanamuth; Ngo, Huu Hao
2016-01-01
Dewatering of sludge from sewage treatment plants is proving to be a significant challenge due to the large amounts of residual sludges generated annually. In recent years, research and development have focused on improving the dewatering process in order to reduce subsequent costs of sludge management and transport. To achieve this goal, it is necessary to establish reliable indices that reflect the efficiency of sludge dewatering. However, the evaluation of sludge dewaterability is not an easy task due to the highly complex nature of sewage sludge and variations in solid-liquid separation methods. Most traditional dewatering indices fail to predict the maximum cake solids content achievable during full-scale dewatering. This paper reviews the difficulties in assessing sludge dewatering performance, and the main techniques used to evaluate dewatering performance are compared and discussed in detail. Finally, the paper suggests a new dewatering index, namely the modified centrifugal index, which is demonstrated to be an appropriate indicator for estimating the final cake solids content as well as simulating the prototype dewatering process.
Recurrence-plot-based measures of complexity and their application to heart-rate-variability data.
Marwan, Norbert; Wessel, Niels; Meyerfeldt, Udo; Schirdewan, Alexander; Kurths, Jürgen
2002-08-01
The knowledge of transitions between regular, laminar or chaotic behaviors is essential to understand the underlying mechanisms behind complex systems. While linear approaches are often insufficient to describe such processes, nonlinear methods exist but typically require rather long observation times. To overcome these difficulties, we propose measures of complexity based on vertical structures in recurrence plots and apply them to the logistic map as well as to heart-rate-variability data. For the logistic map these measures enable us not only to detect transitions between chaotic and periodic states, but also to identify laminar states, i.e., chaos-chaos transitions. Traditional recurrence quantification analysis fails to detect the latter transitions. Applying our measures to the heart-rate-variability data, we are able to detect and quantify the laminar phases before a life-threatening cardiac arrhythmia occurs, thereby facilitating prediction of such an event. Our findings could be of importance for the therapy of malignant cardiac arrhythmias.
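A minimal pure-Python sketch of one vertical-structure measure, laminarity (LAM): the fraction of recurrence points that lie in vertical lines of at least a minimum length. The threshold `eps` and minimum line length are illustrative choices, and refinements such as excluding the line of identity are omitted:

```python
def recurrence_matrix(x, eps):
    """R[i][j] = 1 when states i and j are closer than eps (scalar series)."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(n)] for i in range(n)]

def laminarity(x, eps, lmin=2):
    """Fraction of recurrence points inside vertical lines of length >= lmin."""
    R = recurrence_matrix(x, eps)
    n = len(x)
    total = sum(sum(row) for row in R)
    in_lines = 0
    for j in range(n):                 # scan each column for vertical runs
        run = 0
        for i in range(n):
            if R[i][j]:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
        if run >= lmin:                # close a run ending at the last row
            in_lines += run
    return in_lines / total if total else 0.0
```

A constant (fully laminar) series gives LAM = 1, while a rapidly alternating series has only isolated recurrence points and gives LAM = 0.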
Microchip-Based Single-Cell Functional Proteomics for Biomedical Applications
Lu, Yao; Yang, Liu; Wei, Wei; Shi, Qihui
2017-01-01
Cellular heterogeneity has been widely recognized, but only recently have single-cell tools become available that allow characterizing heterogeneity at the genomic and proteomic levels. We review the technological advances in microchip-based toolkits for single-cell functional proteomics. Each of these tools has distinct advantages and limitations, and a few have advanced toward being applied to biological or clinical problems that traditional population-based methods fail to address. High-throughput single-cell proteomic assays generate high-dimensional data sets that contain new information and thus require the development of new analytical frameworks to extract new biology. In this review article, we highlight a few biological and clinical applications in which the microchip-based single-cell proteomic tools provide unique advantages. The examples include resolving functional heterogeneity and dynamics of immune cells, dissecting cell-cell interaction by creating well-controlled on-chip microenvironments, capturing high-resolution snapshots of immune system functions in patients for better immunotherapy, and elucidating phosphoprotein signaling networks in cancer cells for guiding effective molecularly targeted therapies. PMID:28280819
Nathaniel Hodges (1629-1688): Plague doctor.
Duffin, Christopher J
2016-02-01
Nathaniel Hodges was the son of Thomas Hodges (1605-1672), an influential Anglican preacher and reformer with strong connections in the political life of Caroline London. Educated at Westminster School, Trinity College Cambridge and Christ Church College, Oxford, Nathaniel established himself as a physician in Walbrook Ward in the City of London. Prominent as one of a handful of medical men who remained in London during the time of the Great Plague of 1665, he wrote the definitive work on the outbreak. His daily precautions against contracting the disease included fortifying himself with Théodore de Mayerne's antipestilential electuary and the liberal consumption of Sack. Hodges' approach to the treatment of plague victims was empathetic and based on the traditional Galenic method rather than Paracelsianism, although he was pragmatic in the rejection of formulae and simples which he judged from experience to be ineffective. Besieged by financial problems in later life, his practice began to fail in the 1680s and he eventually died in a debtor's prison. © The Author(s) 2014.
Shamim, Mohammad Tabrez Anwar; Anwaruddin, Mohammad; Nagarajaram, H A
2007-12-15
Fold recognition is a key step in the protein structure discovery process, especially when traditional sequence comparison methods fail to yield convincing structural homologies. Although many methods have been developed for protein fold recognition, their accuracies remain low. This can be attributed to insufficient exploitation of fold discriminatory features. We have developed a new method for protein fold recognition using structural information of amino acid residues and amino acid residue pairs. Since protein fold recognition can be treated as a protein fold classification problem, we have developed a Support Vector Machine (SVM) based classifier approach that uses secondary structural state and solvent accessibility state frequencies of amino acids and amino acid pairs as feature vectors. Among the individual properties examined, secondary structural state frequencies of amino acids gave an overall accuracy of 65.2% for fold discrimination, which is better than the accuracy of any method reported so far in the literature. Combination of secondary structural state frequencies with solvent accessibility state frequencies of amino acids and amino acid pairs further improved the fold discrimination accuracy to more than 70%, which is approximately 8% higher than the best available method. In this study we have also tested, for the first time, an all-together multi-class method known as the Crammer and Singer method for protein fold classification. Our studies reveal that the three multi-class classification methods, namely one-versus-all, one-versus-one and the Crammer and Singer method, yield similar predictions. Dataset and stand-alone program are available upon request.
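The feature construction can be sketched simply: per-protein frequencies of secondary-structural states, plus frequencies of consecutive state pairs as the residue-pair analogue. This assumes a three-letter state alphabet (helix/strand/coil) for illustration; the SVM classification stage itself is omitted:

```python
from collections import Counter

def state_frequencies(states, alphabet="HEC"):
    """Frequencies of secondary-structural states in a protein's state string."""
    counts = Counter(states)
    n = len(states)
    return [counts[s] / n for s in alphabet]

def pair_frequencies(states, alphabet="HEC"):
    """Frequencies of consecutive state pairs (residue-pair features)."""
    pairs = [states[i:i + 2] for i in range(len(states) - 1)]
    counts = Counter(pairs)
    total = len(pairs)
    return [counts[a + b] / total for a in alphabet for b in alphabet]

# Feature vector for one (toy) protein: 3 state features + 9 pair features
features = state_frequencies("HHEEC") + pair_frequencies("HHEEC")
```

Each protein thus maps to a fixed-length vector regardless of sequence length, which is what makes SVM training straightforward.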
Two-and-a-half-year-olds succeed at a traditional false-belief task with reduced processing demands.
Setoh, Peipei; Scott, Rose M; Baillargeon, Renée
2016-11-22
When tested with traditional false-belief tasks, which require answering a standard question about the likely behavior of an agent with a false belief, children perform below chance until age 4 y or later. When tested without such questions, however, children give evidence of false-belief understanding much earlier. Are traditional tasks difficult because they tap a more advanced form of false-belief understanding (fundamental-change view) or because they impose greater processing demands (processing-demands view)? Evidence that young children succeed at traditional false-belief tasks when processing demands are reduced would support the latter view. In prior research, reductions in inhibitory-control demands led to improvements in young children's performance, but often only to chance (instead of below-chance) levels. Here we examined whether further reductions in processing demands might lead to success. We speculated that: (i) young children could respond randomly in a traditional low-inhibition task because their limited information-processing resources are overwhelmed by the total concurrent processing demands in the task; and (ii) these demands include those from the response-generation process activated by the standard question. This analysis suggested that 2.5-y-old toddlers might succeed at a traditional low-inhibition task if response-generation demands were also reduced via practice trials. As predicted, toddlers performed above chance following two response-generation practice trials; toddlers failed when these trials either were rendered less effective or were used in a high-inhibition task. These results support the processing-demands view: Even toddlers succeed at a traditional false-belief task when overall processing demands are reduced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theis, T.; Feng, Y.; Wu, T.
2014-01-07
Hyperpolarization methods, which can enhance nuclear spin signals by orders of magnitude, open up important new opportunities in magnetic resonance. However, many of these applications are limited by spin lattice relaxation, which typically destroys the hyperpolarization in seconds. Significant lifetime enhancements have been found with “disconnected eigenstates” such as the singlet state between a pair of nearly equivalent spins, or the “singlet-singlet” state involving two pairs of chemically equivalent spins; the challenge is to populate these states (for example, from thermal equilibrium magnetization or hyperpolarization) and to later recall the population into observable signal. Existing methods for populating these states are limited by either excess energy dissipation or high sensitivity to inhomogeneities. Here we overcome the limitations by extending recent work using continuous-wave irradiation to include composite and adiabatic pulse excitations. Traditional composite and adiabatic pulses fail completely in this problem because the interactions driving the transitions are fundamentally different, but the new shapes we introduce can move population between accessible and disconnected eigenstates over a wide range of radio-frequency (RF) amplitudes and offsets while depositing insignificant amounts of power.
Infected total knee arthroplasty treated with arthrodesis using a modular nail.
Waldman, B J; Mont, M A; Payman, K R; Freiberg, A A; Windsor, R E; Sculco, T P; Hungerford, D S
1999-10-01
Failed treatment of infected total knee replacement presents few attractive surgical options. Knee arthrodesis is challenging surgically and can be complicated by nonunion, malunion, or recurrent infection. Recently, a modular titanium intramedullary nail has been used in an attempt to reduce the incidence of nonunion and the rate of complications. In the present study, a review of the results of knee arthrodesis after infected total knee arthroplasty in 21 patients at three large academic institutions was performed. All patients were followed up for a mean of 2.4 years (range, 2-7.5 years). The mean age of the patients was 64 years. The mean number of previous operations was four (range, 2-9 operations). A solid arthrodesis was achieved without additional surgical treatment in 20 of 21 patients (95%). The mean time to fusion was 6.3 months. The one patient who suffered a nonunion achieved fusion after a subsequent bone grafting procedure. Based on the present study, intramedullary arthrodesis with a coupled titanium nail is a reliable, effective method of achieving fusion after infection of a total knee arthroplasty. This procedure resulted in a high rate of fusion and a lower rate of complications when compared with traditional methods of arthrodesis.
Topic segmentation via community detection in complex networks
NASA Astrophysics Data System (ADS)
de Arruda, Henrique F.; Costa, Luciano da F.; Amancio, Diego R.
2016-06-01
Many real systems have been modeled in terms of network concepts, and written texts are a particular example of information networks. In recent years, the use of network methods to analyze language has allowed the discovery of several interesting effects, including the proposition of novel models to explain the emergence of fundamental universal patterns. While syntactical networks, one of the most prevalent networked models of written texts, display both scale-free and small-world properties, such a representation fails in capturing other textual features, such as the organization in topics or subjects. We propose a novel network representation whose main purpose is to capture the semantical relationships of words in a simple way. To do so, we link all words co-occurring in the same semantic context, which is defined in a threefold way. We show that the proposed representations favor the emergence of communities of semantically related words, and this feature may be used to identify relevant topics. The proposed methodology to detect topics was applied to segment selected Wikipedia articles. We found that, in general, our methods outperform traditional bag-of-words representations, which suggests that a high-level textual representation may be useful to study the semantical features of texts.
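A toy sketch of the word co-occurrence network idea, using sentence windows as one notion of "same semantic context" and connected components as a crude stand-in for the community detection the authors actually apply (the example sentences are invented):

```python
from itertools import combinations

def cooccurrence_edges(sentences):
    """Link every pair of words sharing a sentence (one of the three
    context definitions the paper explores is window-based like this)."""
    edges = set()
    for sent in sentences:
        words = sorted(set(sent.lower().split()))
        for a, b in combinations(words, 2):
            edges.add((a, b))
    return edges

def components(edges):
    """Connected components of the word graph -- a crude proxy for
    topical communities (the paper uses proper community detection)."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, comps = set(), []
    for node in sorted(adj):
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        comps.append(comp)
    return comps

doc = ["cats purr softly", "cats chase mice", "planets orbit stars", "stars emit light"]
topics = components(cooccurrence_edges(doc))  # one feline, one astronomical group
```

Real community detection (e.g. modularity maximization) can split a single connected graph into topical groups; components only separate words with no co-occurrence path at all, which is why this is only a sketch of the representation, not of the full method.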
A Hybrid Monte Carlo importance sampling of rare events in Turbulence and in Turbulent Models
NASA Astrophysics Data System (ADS)
Margazoglou, Georgios; Biferale, Luca; Grauer, Rainer; Jansen, Karl; Mesterhazy, David; Rosenow, Tillmann; Tripiccione, Raffaele
2017-11-01
Extreme and rare events are a challenging topic in the field of turbulence. Investigating such instances with traditional numerical tools is a notoriously difficult task, as they fail to systematically sample the fluctuations around these events. We propose instead that an importance sampling Monte Carlo method can selectively highlight extreme events in remote areas of the phase space and induce their occurrence. We present a new computational approach, based on the path integral formulation of stochastic dynamics, and employ an accelerated Hybrid Monte Carlo (HMC) algorithm for this purpose. Through the paradigm of the stochastic one-dimensional Burgers' equation, subjected to random noise that is white in time and power-law correlated in Fourier space, we prove our concept and benchmark our results against standard CFD methods. Furthermore, we present our first results on constrained sampling around saddle-point instanton configurations (optimal fluctuations). The research leading to these results has received funding from the EU Horizon 2020 research and innovation programme under Grant Agreement No. 642069, and from the EU Seventh Framework Programme (FP7/2007-2013) under ERC Grant Agreement No. 339032.
Testing for Depéret's Rule (Body Size Increase) in Mammals using Combined Extinct and Extant Data
Bokma, Folmer; Godinot, Marc; Maridet, Olivier; Ladevèze, Sandrine; Costeur, Loïc; Solé, Floréal; Gheerbrant, Emmanuel; Peigné, Stéphane; Jacques, Florian; Laurin, Michel
2016-01-01
Whether or not evolutionary lineages in general show a tendency to increase in body size has often been discussed. This tendency has been dubbed “Cope's rule” but because Cope never hypothesized it, we suggest renaming it after Depéret, who formulated it clearly in 1907. Depéret's rule has traditionally been studied using fossil data, but more recently a number of studies have used present-day species. While several paleontological studies of Cenozoic placental mammals have found support for increasing body size, most studies of extant placentals have failed to detect such a trend. Here, we present a method to combine information from present-day species with fossil data in a Bayesian phylogenetic framework. We apply the method to body mass estimates of a large number of extant and extinct mammal species, and find strong support for Depéret's rule. The tendency for size increase appears to be driven not by evolution toward larger size in established species, but by processes related to the emergence of new species. Our analysis shows that complementary data from extant and extinct species can greatly improve inference of macroevolutionary processes. PMID:26508768
Kranstauber, Bart; Kays, Roland; Lapoint, Scott D; Wikelski, Martin; Safi, Kamran
2012-07-01
1. The recently developed Brownian bridge movement model (BBMM) has advantages over traditional methods because it quantifies the utilization distribution of an animal based on its movement path rather than individual points and accounts for temporal autocorrelation and high data volumes. However, the BBMM assumes unrealistic homogeneous movement behaviour across all data. 2. Accurate quantification of the utilization distribution is important for identifying the way animals use the landscape. 3. We improve the BBMM by allowing for changes in behaviour, using likelihood statistics to determine change points along the animal's movement path. 4. This novel extension outperforms the current BBMM, as indicated by simulations and examples of a territorial mammal and a migratory bird. The unique ability of our model to work with tracks that are not sampled regularly is especially important for GPS tags that have frequent failed fixes or dynamic sampling schedules. Moreover, our model extension provides a useful one-dimensional measure of behavioural change along animal tracks. 5. This new method provides a more accurate utilization distribution that better describes the space use of realistic, behaviourally heterogeneous tracks. © 2012 The Authors. Journal of Animal Ecology © 2012 British Ecological Society.
Problematizing the concept of the "borderline" group in performance assessments.
Homer, Matt; Pell, Godfrey; Fuller, Richard
2017-05-01
Many standard setting procedures focus on the performance of the "borderline" group, defined through expert judgments by assessors. In performance assessments such as Objective Structured Clinical Examinations (OSCEs), these judgments usually apply at the station level. Using largely descriptive approaches, we analyze the assessment profile of OSCE candidates at the end of a five year undergraduate medical degree program to investigate the consistency of the borderline group across stations. We look specifically at those candidates who are borderline in individual stations, and in the overall assessment. While the borderline group can be clearly defined at the individual station level, our key finding is that the membership of this group varies considerably across stations. These findings pose challenges for some standard setting methods, particularly the borderline group and objective borderline methods. They also suggest that institutions should ensure appropriate conjunctive rules to limit compensation in performance between stations to maximize "diagnostic accuracy". In addition, this work highlights a key benefit of sequential testing formats in OSCEs. In comparison with a traditional, single-test format, sequential models allow assessment of "borderline" candidates across a wider range of content areas with concomitant improvements in pass/fail decision-making.
[Model of Analysis and Prevention of Accidents - MAPA: tool for operational health surveillance].
de Almeida, Ildeberto Muniz; Vilela, Rodolfo Andrade de Gouveia; da Silva, Alessandro José Nunes; Beltran, Sandra Lorena
2014-12-01
The analysis of work-related accidents is important for accident surveillance and prevention. Current methods of analysis seek to overcome reductionist views that see these occurrences as simple events explained by operator error. The objective of this paper is to analyze the Model of Analysis and Prevention of Accidents (MAPA) and its use in monitoring interventions, duly highlighting aspects experienced in the use of the tool. The descriptive analytical method was used, introducing the steps of the model. To illustrate contributions and or difficulties, cases where the tool was used in the context of service were selected. MAPA integrates theoretical approaches that have already been tried in studies of accidents by providing useful conceptual support from the data collection stage until conclusion and intervention stages. Besides revealing weaknesses of the traditional approach, it helps identify organizational determinants, such as management failings, system design and safety management involved in the accident. The main challenges lie in the grasp of concepts by users, in exploring organizational aspects upstream in the chain of decisions or at higher levels of the hierarchy, as well as the intervention to change the determinants of these events.
Towards Large Eddy Simulation of gas turbine compressors
NASA Astrophysics Data System (ADS)
McMullan, W. A.; Page, G. J.
2012-07-01
With increasing computing power, Large Eddy Simulation could be a useful simulation tool for gas turbine axial compressor design. This paper outlines a series of simulations performed on compressor geometries, ranging from a Controlled Diffusion Cascade stator blade to the periodic sector of a stage in a 3.5 stage axial compressor. The simulation results show that LES may offer advantages over traditional RANS methods when off-design conditions are considered: flow regimes where RANS models often fail to converge. The time-dependent nature of LES permits the resolution of transient flow structures, and can elucidate new mechanisms of vorticity generation on blade surfaces. It is shown that accurate LES is heavily reliant on both the near-wall mesh fidelity and the ability of the imposed inflow condition to recreate the conditions found in the reference experiment. For components embedded in a compressor this requires the generation of turbulence fluctuations at the inlet plane. A recycling method is developed that improves the quality of the flow in a single stage calculation of an axial compressor, and indicates that future developments in both the recycling technique and computing power will bring simulations of axial compressors within reach of industry in the coming years.
NASA Astrophysics Data System (ADS)
E, Jianwei; Bao, Yanling; Ye, Jimin
2017-10-01
As one of the most vital energy resources in the world, crude oil plays a significant role in the international economic market, and the fluctuation of its price has attracted academic and commercial attention. Many methods exist for forecasting the trend of the crude oil price, but traditional models fail to predict it accurately. On this basis, a hybrid method is proposed in this paper which combines variational mode decomposition (VMD), independent component analysis (ICA) and the autoregressive integrated moving average (ARIMA) model, called VMD-ICA-ARIMA. The purpose of this study is to analyze the influencing factors of the crude oil price and to predict its future value. The major steps are as follows: first, applying the VMD model to the original signal (the crude oil price), the mode functions are decomposed adaptively. Second, independent components are separated by ICA, and how they affect the crude oil price is analyzed. Finally, the price is forecast with the ARIMA model; the forecast trend indicates that the crude oil price declines periodically. Compared with the benchmark ARIMA and EEMD-ICA-ARIMA models, VMD-ICA-ARIMA forecasts the crude oil price more accurately.
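The final stage of such a pipeline is autoregressive forecasting. As a minimal, dependency-free stand-in for the full ARIMA model, a least-squares AR(1) fit illustrates the autoregressive core; the decomposition stages (VMD and ICA) require dedicated libraries and are omitted, and the price series below is invented:

```python
def fit_ar1(series):
    """Least-squares estimate of phi in x[t] = phi * x[t-1] + noise."""
    num = sum(series[t - 1] * series[t] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def forecast(series, phi, steps=1):
    """Iterate the fitted recursion forward from the last observation."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = phi * last
        out.append(last)
    return out

prices = [8.0, 4.0, 2.0, 1.0, 0.5]          # toy noiseless AR(1) data
phi = fit_ar1(prices)                        # recovers 0.5 exactly here
next_vals = forecast(prices, phi, steps=2)   # [0.25, 0.125]
```

ARIMA adds differencing and a moving-average term on top of this autoregressive recursion; in the hybrid scheme, each decomposed component would be forecast separately and the forecasts recombined.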
Cross-domain active learning for video concept detection
NASA Astrophysics Data System (ADS)
Li, Huan; Li, Chao; Shi, Yuan; Xiong, Zhang; Hauptmann, Alexander G.
2011-08-01
As video data from a variety of different domains (e.g., news, documentaries, entertainment) have distinctive data distributions, cross-domain video concept detection becomes an important task, in which one can reuse the labeled data of one domain to benefit the learning task in another domain with insufficient labeled data. In this paper, we approach this problem by proposing a cross-domain active learning method which iteratively queries labels of the most informative samples in the target domain. Traditional active learning assumes that the training (source domain) and test data (target domain) are from the same distribution. However, it may fail when the two domains have different distributions, because querying informative samples according to a base learner that initially learned from the source domain may no longer be helpful for the target domain. In our paper, we use the Gaussian random field model as the base learner, which has the advantage of exploring the distributions in both domains, and adopt uncertainty sampling as the query strategy. Additionally, we present an instance weighting trick to accelerate the adaptability of the base learner, and develop an efficient model updating method which can significantly speed up the active learning process. Experimental results on TRECVID collections highlight the effectiveness of the proposed method.
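Uncertainty sampling, the query strategy named above, can be sketched in a few lines: query the unlabeled sample whose predicted positive-class probability is closest to 0.5. The probabilities here are assumed given; the Gaussian random field base learner itself is not shown:

```python
def most_uncertain(probs, labeled):
    """Index of the unlabeled sample whose predicted probability is
    closest to 0.5, i.e. where the base learner is least confident."""
    best, best_gap = None, float("inf")
    for i, p in enumerate(probs):
        if i in labeled:
            continue
        gap = abs(p - 0.5)
        if gap < best_gap:
            best, best_gap = i, gap
    return best

# One round of querying: the model is least sure about sample 1
probs = [0.92, 0.48, 0.07, 0.71]
query = most_uncertain(probs, labeled=set())
```

In the active learning loop, the queried sample's true label is obtained, the model is retrained (or efficiently updated, as the paper proposes), and the selection repeats.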
Report on noninvasive prenatal testing: classical and alternative approaches.
Pantiukh, Kateryna S; Chekanov, Nikolay N; Zaigrin, Igor V; Zotov, Alexei M; Mazur, Alexander M; Prokhortchouk, Egor B
2016-01-01
Concerns about traditional prenatal aneuploidy testing methods, namely the low accuracy of noninvasive procedures and the health risks associated with invasive ones, were overcome with the introduction of novel noninvasive methods based on genetics (NIPT). These were rapidly adopted into clinical practice in many countries after a series of successful trials of various independent submethods. Here we present results of our own NIPT trial carried out in Moscow, Russia. 1012 samples were subjected to a method that measures chromosome coverage by massive parallel sequencing. Two alternative approaches were ascertained: one based on maternal/fetal differential methylation and another based on allelic difference. While the former failed to provide stable results, the latter was found to be promising and worthy of a large-scale trial. One critical point in any NIPT approach is the determination of the fetal cell-free DNA fraction, which dictates the reliability of the results obtained for a given sample. We show that two different chromosome Y representation measures, by real-time PCR and by whole-genome massive parallel sequencing, are practically interchangeable (r=0.94). We also propose a novel method based on maternal/fetal allelic difference which is applicable in pregnancies with fetuses of either sex. Even in its pilot form it correlates well with chromosome Y coverage estimates (r=0.74) and can be further improved by increasing the number of polymorphisms.
A Thermodynamically-consistent FBA-based Approach to Biogeochemical Reaction Modeling
NASA Astrophysics Data System (ADS)
Shapiro, B.; Jin, Q.
2015-12-01
Microbial rates are critical to understanding biogeochemical processes in natural environments. Recently, flux balance analysis (FBA) has been applied to predict microbial rates in aquifers and other settings. FBA is a genome-scale constraint-based modeling approach that computes metabolic rates and other phenotypes of microorganisms. This approach requires prior knowledge of substrate uptake rates, which is not available for most natural microbes. Here we propose to constrain substrate uptake rates on the basis of microbial kinetics. Specifically, we calculate rates of respiration (and fermentation) using a revised Monod equation; this equation accounts for both the kinetics and thermodynamics of microbial catabolism. Substrate uptake rates are then computed from the rates of respiration, and applied to FBA to predict rates of microbial growth. We implemented this method by linking two software tools, PHREEQC and COBRA Toolbox. We applied this method to acetotrophic methanogenesis by Methanosarcina barkeri, and compared the simulation results to previous laboratory observations. The new method constrains acetate uptake by accounting for the kinetics and thermodynamics of methanogenesis, and predicted the observations of previous experiments well. In comparison, traditional methods of dynamic FBA constrain acetate uptake on the basis of enzyme kinetics, and failed to reproduce the experimental results. These results show that microbial rate laws may provide a better constraint than enzyme kinetics for applying FBA to biogeochemical reaction modeling.
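A sketch of a thermodynamically revised Monod rate law of the kind described: the usual kinetic term is multiplied by a thermodynamic potential factor that vanishes when the catabolic energy yield is consumed by ATP synthesis. The functional form follows the general Jin-Bethke style rather than the authors' exact implementation, and all parameter values below are illustrative assumptions:

```python
import math

R = 8.314  # J/(mol*K), gas constant

def respiration_rate(k, biomass, S, Ks, dG_rxn, m_dG_atp, chi, T=298.15):
    """Monod kinetic term times a thermodynamic factor F_T.

    F_T -> 1 far from equilibrium; F_T -> 0 as dG_rxn + m*dG_ATP -> 0,
    where the reaction no longer yields usable energy and stalls.
    chi is an average stoichiometric number (assumed)."""
    kinetic = k * biomass * S / (Ks + S)
    F_T = 1.0 - math.exp((dG_rxn + m_dG_atp) / (chi * R * T))
    return kinetic * max(F_T, 0.0)

# Far from equilibrium: the factor is ~1 and the classic Monod rate is recovered
r_far = respiration_rate(k=1e-5, biomass=1.0, S=1e-3, Ks=1e-4,
                         dG_rxn=-120e3, m_dG_atp=45e3, chi=2)
# At the thermodynamic threshold (dG_rxn + m*dG_ATP = 0) the rate vanishes
r_eq = respiration_rate(k=1e-5, biomass=1.0, S=1e-3, Ks=1e-4,
                        dG_rxn=-45e3, m_dG_atp=45e3, chi=2)
```

In the paper's workflow, a rate computed this way (via PHREEQC-style geochemical state) would set the substrate uptake bound handed to the FBA solver.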
Visualization of postoperative anterior cruciate ligament reconstruction bone tunnels
2011-01-01
Background and purpose Non-anatomic bone tunnel placement is the most common cause of a failed ACL reconstruction. Accurate and reproducible methods to visualize and document bone tunnel placement are therefore important. We evaluated the reliability of standard radiographs, CT scans, and a 3-dimensional (3D) virtual reality (VR) approach in visualizing and measuring ACL reconstruction bone tunnel placement. Methods 50 consecutive patients who underwent single-bundle ACL reconstructions were evaluated postoperatively by standard radiographs, CT scans, and 3D VR images. Tibial and femoral tunnel positions were measured by 2 observers using the traditional methods of Amis, Aglietti, Hoser, Stäubli, and the method of Benereau for the VR approach. Results The tunnel was visualized in 50–82% of the standard radiographs and in 100% of the CT scans and 3D VR images. Using the intraclass correlation coefficient (ICC), the inter- and intraobserver agreement was between 0.39 and 0.83 for the standard femoral and tibial radiographs. CT scans showed an ICC range of 0.49–0.76 for the inter- and intraobserver agreement. The agreement in 3D VR was almost perfect, with an ICC of 0.83 for the femur and 0.95 for the tibia. Interpretation CT scans and 3D VR images are more reliable in assessing postoperative bone tunnel placement following ACL reconstruction than standard radiographs. PMID:21999625
Comparison of Traditional and Reverse Syphilis Screening Algorithms in Medical Health Checkups.
Nah, Eun Hee; Cho, Seon; Kim, Suyoung; Cho, Han Ik; Chai, Jong Yil
2017-11-01
The syphilis diagnostic algorithms applied in different countries vary significantly depending on the local syphilis epidemiology and other considerations, including the expected workload, the need for automation in the laboratory, and budget factors. This study was performed to investigate the efficacy of traditional and reverse syphilis diagnostic algorithms during general health checkups. In total, 1,000 blood specimens were obtained from 908 men and 92 women during their regular health checkups. Traditional screening and reverse screening were applied to the same specimens using automated rapid plasma reagin (RPR) and Treponema pallidum latex agglutination (TPLA) tests, respectively. Specimens that were reactive in the reverse algorithm (TPLA) were subjected to a second treponemal test using the chemiluminescent microparticle immunoassay (CMIA). Of the 1,000 specimens tested, 68 (6.8%) were reactive by reverse screening (TPLA) compared with 11 (1.1%) by traditional screening (RPR). The traditional algorithm failed to detect 48 specimens [TPLA(+)/RPR(-)/CMIA(+)]. The median TPLA cutoff index (COI) was higher in CMIA-reactive cases than in CMIA-nonreactive cases (90.5 vs 12.5 U). The reverse screening algorithm could detect subjects with possible latent syphilis who were not detected by the traditional algorithm. Those individuals could be provided with opportunities for evaluating syphilis during their health checkups. The COI values of the initial TPLA test may be helpful in excluding false-positive TPLA test results in the reverse algorithm. © The Korean Society for Laboratory Medicine
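The two-step logic of the reverse algorithm described above can be sketched as a small decision function. The ordering of tests matches the abstract (treponemal TPLA screen, CMIA confirmation, RPR for activity), but the result labels are illustrative assumptions, not a clinical protocol:

```python
def reverse_screen(tpla_reactive, cmia_reactive=None, rpr_reactive=None):
    """Sketch of the reverse syphilis screening algorithm.

    Treponemal screen (TPLA) first; reactive screens are confirmed with a
    second treponemal test (CMIA); RPR then gauges disease activity.
    Result labels are simplified assumptions for illustration only.
    """
    if not tpla_reactive:
        return "nonreactive screen"
    if not cmia_reactive:
        return "possible false-positive TPLA"
    if rpr_reactive:
        return "active syphilis (evaluate/treat)"
    return "possible latent or treated syphilis"

# The 48 specimens the traditional (RPR-first) algorithm missed were
# TPLA(+)/RPR(-)/CMIA(+), which this flow flags rather than discards:
label = reverse_screen(True, cmia_reactive=True, rpr_reactive=False)
```

Under the traditional algorithm the same specimen would never reach a treponemal test, since the nonreactive RPR screen ends the workup.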
Attempts to utilize and integrate traditional medicine in North Korea.
Lim, Byungmook; Park, Jongbae; Han, Changyon
2009-03-01
To summarize the ways North Korea attempted to modernize its system of traditional medicine and integrate it with Western biomedicine, we reviewed clinical textbooks and periodicals of traditional Korean medicine published in North Korea and research reports on North Korean health and medicine published elsewhere, and interviewed defectors from North Korea who had been students or clinicians of traditional medicine. Key findings of this study are: (1) North Korea has attempted several ways of integrating traditional medicine into education and clinical practice; (2) North Korea's communist government provided the main driving force for an integration policy; (3) school curricula of both Western and traditional Korean medicine incorporated knowledge of both disciplines, yet more weight was placed on traditional Korean medicine; (4) a combination of Western diagnosis and Korean therapeutics was the most frequent example of integration, while the dual-system approach with reciprocal practice was also explored; (5) several forms of integrative therapeutic mixture were practiced, including concurrent medication, injection at acupuncture points, and intramuscular or intravenous injection of extracts from medicinal plants; and (6) limited resources for research and the underdeveloped level of clinical research failed to secure rigorous scientific advancement. Despite the government-driven attempt to create an ideal integrative system of medicine, our findings indicate that the actual introduction of an integrative system into practice fell far short of the North Korean government's anticipated outcome with regard to clinical practice. We hypothesize this was due to famine, economic crisis, and political isolation from the international community. Traditional Korean medicine seems to have served the population, which was in desperate need of treatment amid health difficulties, while North Korea's Western biomedicine-based health delivery system has been badly affected.
Study of fail-safe abort system for an actively cooled hypersonic aircraft, volume 2
NASA Technical Reports Server (NTRS)
Peeples, M. E.; Herring, R. L.
1976-01-01
Conceptual designs of a fail-safe abort system for hydrogen fueled actively cooled high speed aircraft are examined. The fail-safe concept depends on basically three factors: (1) a reliable method of detecting a failure or malfunction in the active cooling system, (2) the optimization of abort trajectories which minimize the descent heat load to the aircraft, and (3) fail-safe thermostructural concepts to minimize both the weight and the maximum temperature the structure will reach during descent. These factors are examined and promising approaches are evaluated based on weight, reliability, ease of manufacture and cost.
Salvage of infected total knee fusion: the last option.
Wiedel, Jerome D
2002-11-01
Currently the most common indication for an arthrodesis of the knee is a failed infected total knee prosthesis. Other causes of a failed total knee replacement that might necessitate a knee fusion include aseptic loosening, a deficient extensor mechanism, poor soft tissues, and Charcot joint. Techniques available for achieving a knee fusion are external fixation and internal fixation methods. External fixation compression devices have been the most widely used for knee fusion and were successful until the indications for fusion shifted mostly to failed prosthetic knee replacement. With failed total knee replacement, severe bone loss became an issue, and external fixation compression devices, even including biplane external fixators, have been the least successful method reported for gaining fusion. The Ilizarov technique has been shown to achieve rigid fixation despite this bone loss, and a review of reports shows high fusion rates using this method. Internal fixation methods, including plate fixation and intramedullary nails, have had the best success in gaining fusion in the face of this bone loss and have replaced external fixation methods as the technique of choice for knee fusion when severe bone loss is present. A review of the literature and a discussion of different fusion techniques are presented, including a discussion of the influence that infection has on the success of fusion.
Circular revisit orbits design for responsive mission over a single target
NASA Astrophysics Data System (ADS)
Li, Taibo; Xiang, Junhua; Wang, Zhaokui; Zhang, Yulin
2016-10-01
The responsive orbits play a key role in addressing the mission of Operationally Responsive Space (ORS) because of their capabilities. These capabilities are usually focused on supporting specific targets, as opposed to providing global coverage. One subtype of responsive orbit is the repeat-coverage orbit, which is nearly circular in most remote sensing applications. This paper deals with a special kind of repeating ground track orbit, referred to as a circular revisit orbit. Different from traditional repeat-coverage orbits, a satellite on a circular revisit orbit can visit a target site at both the ascending and descending stages in one revisit cycle. This type of trajectory halves the traditional revisit time and helps obtain useful information for responsive applications. However, previously reported numerical methods are often computationally expensive or fail to obtain such orbits. To overcome this difficulty, an analytical method to determine the existence conditions of solutions for revisit orbits is presented in this paper. To this end, the mathematical model of the circular revisit orbit is established under the central gravity model and the J2 perturbation. A constraint function of the circular revisit orbit is introduced, and its monotonicity is studied. The existence conditions and the number of such orbits follow naturally. Taking launch cost into consideration, an optimal design model of the circular revisit orbit is established to find the best orbit that visits a target twice a day, in the morning and in the afternoon respectively, for several days. The result shows that it is effective to apply circular revisit orbits in responsive applications such as reconnaissance of natural disasters.
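The repeat condition that underlies such orbit design (N nodal revolutions in D nodal days under J2 secular rates) can be sketched numerically. The constants are standard values; the fixed-point scheme and the circular-orbit (e = 0) simplification are assumptions for illustration, not the paper's analytical method:

```python
import math

MU = 398600.4418        # Earth gravitational parameter, km^3/s^2
RE = 6378.137           # Earth equatorial radius, km
J2 = 1.08262668e-3      # Earth oblateness coefficient
OMEGA_E = 7.2921159e-5  # Earth rotation rate, rad/s

def repeat_groundtrack_sma(N, D, inc_deg, tol=1e-10):
    """Semi-major axis (km) of a near-circular orbit whose ground track
    repeats after N revolutions in D days, including J2 secular rates.
    Fixed-point iteration; a simplified sketch assuming e = 0."""
    i = math.radians(inc_deg)
    # Keplerian first guess from the unperturbed period
    a = (MU * (D * 86164.0 / (2 * math.pi * N)) ** 2) ** (1.0 / 3.0)
    for _ in range(100):
        n = math.sqrt(MU / a ** 3)
        fac = 1.5 * J2 * (RE / a) ** 2 * n          # common J2 factor (e = 0)
        raan_dot = -fac * math.cos(i)                # nodal regression
        argp_dot = fac * (2.0 - 2.5 * math.sin(i) ** 2)
        m_dot = fac * (1.0 - 1.5 * math.sin(i) ** 2)
        # Repeat condition: N nodal periods span D nodal days
        n_req = (N / D) * (OMEGA_E - raan_dot) - (argp_dot + m_dot)
        a_new = (MU / n_req ** 2) ** (1.0 / 3.0)
        if abs(a_new - a) < tol:
            return a_new
        a = a_new
    return a

# A 15 rev/day repeat orbit at 97 deg inclination sits near 560 km altitude:
a = repeat_groundtrack_sma(15, 1, 97.0)
```

The circular revisit orbits of the paper add the further constraint that ascending and descending passes both cross the target, which selects specific inclination/altitude pairs from this family.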
Modeling instructor preferences for CPR and AED competence estimation.
Birnbaum, Alice; McBurnie, Mary Ann; Powell, Judy; Ottingham, Lois Van; Riegel, Barbara; Potts, Jerry; Hedges, Jerris R
2005-03-01
Cardiopulmonary resuscitation (CPR) and automated external defibrillator (AED) skills competency can be tested using a checklist of component skills, individually graded "pass" or "fail." Scores are typically calculated as the percentage of skills passed, but may differ from an instructor's overall subjective assessment of simulated CPR or AED adequacy. Our aim was to identify and evaluate composite measures (methods for scoring checklists) that best reflect instructors' subjective assessments of CPR or AED skills performance. Associations between instructor assessment and lay-volunteer skill performance were examined using 6380 CPR and 3313 AED skill retention tests collected in the Public Access Defibrillation Trial. Checklists included CPR skills (e.g., calling 911, administering compressions) and AED skills (e.g., positioning electrodes, shocking within 90 s of AED arrival). The instructor's subjective overall assessment (adequate/inadequate) of CPR performance (perfusion) or AED competence (effective shock) was compared to composite measures. We evaluated the traditional composite measure (assigning equal weights to individual skills) and several nontraditional composite measures (assigning variable weights). Skills performed out of sequence were further weighted from 0% (no credit) to 100% (full credit). Composite measures providing full credit for skills performed out of sequence and down-weighting process skills (e.g., calling 911, clearing oneself from the AED) had the strongest association with the instructor's subjective assessment; the traditional CPR composite measure had the weakest association. Our findings suggest that instructors in public CPR and AED classes may tend to down-weight process skills and to excuse step sequencing errors when evaluating CPR and AED skills subjectively for overall proficiency. Testing methods that relate classroom performance to actual performance in the field and to clinical outcomes require further research.
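The difference between the traditional equal-weight composite and a variably weighted one can be sketched as follows. The skill names, weights, and sequence-credit mechanism are illustrative assumptions, not the study's fitted values:

```python
def composite_score(results, weights=None, sequence_credit=1.0, out_of_sequence=()):
    """Composite measure over a pass/fail skills checklist.

    results: dict skill -> bool (passed).  weights: dict skill -> weight;
    equal weights recover the traditional percentage-passed measure.
    Skills listed in out_of_sequence earn sequence_credit times their
    weight (0.0 = no credit, 1.0 = full credit).  Names are illustrative.
    """
    if weights is None:
        weights = {skill: 1.0 for skill in results}   # traditional measure
    total = sum(weights.values())
    earned = 0.0
    for skill, passed in results.items():
        if passed:
            credit = sequence_credit if skill in out_of_sequence else 1.0
            earned += weights[skill] * credit
    return earned / total

cpr = {"call_911": True, "open_airway": True, "compressions": True, "ventilations": False}
traditional = composite_score(cpr)          # equal weights: 3 of 4 skills passed
down_weighted = composite_score(            # process skill call_911 down-weighted
    cpr, weights={"call_911": 0.5, "open_airway": 1.0,
                  "compressions": 2.0, "ventilations": 2.0})
```

Down-weighting a passed process skill lowers the composite relative to the traditional measure, moving it toward an instructor who cares mostly about compressions and ventilations.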
Lin, J P; Hirsch, R; Jacobsson, L T; Scott, W W; Ma, L D; Pillemer, S R; Knowler, W C; Kastner, D L; Bale, S J
1999-01-01
Due to the characteristics of complex traits, many traits may not be amenable to traditional epidemiologic methods. We illustrate an approach that defines an isolated population as the "unit" for carrying out studies of complex disease. We provide an example using the Pima Indians, a relatively isolated population, in which the incidence and prevalence of Type 2 diabetes, gallbladder disease, and rheumatoid arthritis (RA) are significantly increased compared with the general U.S. population. A previous study of RA in the Pima utilizing traditional methods failed to detect a genetic effect on the occurrence of the disease. Our approach involved constructing a genealogy for this population and using a genealogic index to investigate familial aggregation. We developed an algorithm to identify biological relationships among 88 RA cases versus 4,000 subsamples of age-matched individuals from the same population. Kinship coefficients were calculated for all possible pairs of RA cases, and similarly for the subsamples. The sum of the kinship coefficients among all combinations of RA pairs, 5.92, was significantly higher than the average of the 4,000 subsamples, 1.99 (p < 0.001), and was elevated over that of the subsamples to the level of second cousin, supporting a genetic effect in the familial aggregation. The mean inbreeding coefficient for the Pima was 0.00009, similar to that reported for other populations; none of the RA cases were inbred. The Pima genealogy can be anticipated to provide valuable information for the genetic study of diseases other than RA. Defining an isolated population as the "unit" in which to assess familial aggregation may be advantageous, especially if there are a limited number of cases in the study population.
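The kinship coefficients summed in such a study can be computed from a genealogy by the standard recursive (Malecot) definition. The toy pedigree below is an illustrative assumption, not the Pima genealogy:

```python
from functools import lru_cache

# Pedigree: individual -> (father, mother); founders map to (None, None).
# A toy pedigree for illustration, not the study's genealogy.
PED = {
    "gp1": (None, None), "gm1": (None, None),
    "dad": ("gp1", "gm1"), "uncle": ("gp1", "gm1"),
    "mom": (None, None), "aunt": (None, None),
    "child": ("dad", "mom"), "cousin": ("uncle", "aunt"),
}

DEPTH = {}
def depth(x):
    """Generation depth (founders = 0); used to pick the later individual."""
    if x is None:
        return -1
    if x not in DEPTH:
        f, m = PED[x]
        DEPTH[x] = 1 + max(depth(f), depth(m))
    return DEPTH[x]

@lru_cache(maxsize=None)
def kinship(a, b):
    """Malecot kinship: probability that random alleles drawn from a and b
    are identical by descent."""
    if a is None or b is None:
        return 0.0
    if a == b:
        f, m = PED[a]
        return 0.5 * (1.0 + kinship(f, m))
    if depth(a) < depth(b):          # recurse on the later-generation member
        a, b = b, a
    f, m = PED[a]
    return 0.5 * (kinship(f, b) + kinship(m, b))

# Parent-offspring pairs have kinship 1/4; first cousins have 1/16.
```

Summing `kinship` over all case pairs and comparing against matched subsamples gives exactly the genealogic-index contrast the abstract reports (5.92 vs 1.99).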
Leadership of Cyber Warriors: Enduring Principles and New Directions
2011-07-11
cyber warfare threat against the United States, the creation of United States Cyber Command and the designation of cyberspace as a warfighting domain now necessitate study of the attributes of successful cyber warfare leaders and the leadership techniques required to successfully lead cyber warriors. In particular, we must develop an understanding of where traditional kinetic leadership paradigms succeed, where they fail, and where new techniques must be adopted. Leadership is not a one-size-fits-all endeavor. The capabilities and characteristics of
2011-04-29
religion, and Napoleon and his troops had wounded the Spaniards in their deepest beliefs and national pride. The Spanish had contempt for the invasion and ... organizations with force, as the French attempted to do in Spain, must first fully ground themselves in knowledge of the culture, traditions, religion ... de la Independencia. Madrid: Leynfor Siglo XXI, 2007. The book is focused on one of the guerrilla leaders who became a senior officer in the Spanish
pH sensor based on boron nitride nanotubes.
Huang, Q; Bando, Y; Zhao, L; Zhi, C Y; Golberg, D
2009-10-14
A submicrometer-sized pH sensor based on biotin-fluorescein-functionalized multiwalled BN nanotubes with anchored Ag nanoparticles is designed. Intrinsic pH-dependent photoluminescence and Raman signals in attached fluorescein molecules enhanced by Ag nanoparticles allow this novel nanohybrid to perform as a practical pH sensor. It is able to work in a submicrometer-sized space. For example, the sensor may determine the environmental pH of sub-units in living cells where a traditional optical fiber sensor fails because of spatial limitations.
pH sensor based on boron nitride nanotubes
NASA Astrophysics Data System (ADS)
Huang, Q.; Bando, Y.; Zhao, L.; Zhi, C. Y.; Golberg, D.
2009-10-01
A submicrometer-sized pH sensor based on biotin-fluorescein-functionalized multiwalled BN nanotubes with anchored Ag nanoparticles is designed. Intrinsic pH-dependent photoluminescence and Raman signals in attached fluorescein molecules enhanced by Ag nanoparticles allow this novel nanohybrid to perform as a practical pH sensor. It is able to work in a submicrometer-sized space. For example, the sensor may determine the environmental pH of sub-units in living cells where a traditional optical fiber sensor fails because of spatial limitations.
The admissible portfolio selection problem with transaction costs and an improved PSO algorithm
NASA Astrophysics Data System (ADS)
Chen, Wei; Zhang, Wei-Guo
2010-05-01
In this paper, we discuss the portfolio selection problem with transaction costs under the assumption that there exist admissible errors on expected returns and risks of assets. We propose a new admissible efficient portfolio selection model and design an improved particle swarm optimization (PSO) algorithm because traditional optimization algorithms fail to work efficiently for our proposed problem. Finally, we offer a numerical example to illustrate the proposed effective approaches and compare the admissible portfolio efficient frontiers under different constraints.
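A minimal particle swarm optimization sketch for a mean-variance portfolio with linear transaction costs. The objective, parameter values, and normalization scheme are illustrative assumptions, not the paper's admissible-error model or its improved PSO:

```python
import random

def pso_portfolio(mu, cov, cost, lam=2.0, n_particles=30, iters=200, seed=1):
    """Minimize lam * risk - (return - transaction cost) over long-only
    weights summing to 1, using a plain global-best PSO. Illustrative only."""
    rng = random.Random(seed)
    n = len(mu)

    def normalize(w):                       # project onto the simplex (crudely)
        w = [max(x, 0.0) for x in w]
        s = sum(w) or 1.0
        return [x / s for x in w]

    def objective(w):
        risk = sum(w[i] * w[j] * cov[i][j] for i in range(n) for j in range(n))
        ret = sum(w[i] * mu[i] for i in range(n))
        tc = sum(cost[i] * w[i] for i in range(n))
        return lam * risk - (ret - tc)

    X = [normalize([rng.random() for _ in range(n)]) for _ in range(n_particles)]
    V = [[0.0] * n for _ in range(n_particles)]
    P = [list(x) for x in X]                # personal bests
    pbest = [objective(x) for x in X]
    g = min(range(n_particles), key=lambda k: pbest[k])
    G, gbest = list(P[g]), pbest[g]         # global best

    for _ in range(iters):
        for k in range(n_particles):
            for d in range(n):
                r1, r2 = rng.random(), rng.random()
                V[k][d] = (0.7 * V[k][d] + 1.5 * r1 * (P[k][d] - X[k][d])
                           + 1.5 * r2 * (G[d] - X[k][d]))
            X[k] = normalize([X[k][d] + V[k][d] for d in range(n)])
            f = objective(X[k])
            if f < pbest[k]:
                pbest[k], P[k] = f, list(X[k])
                if f < gbest:
                    gbest, G = f, list(X[k])
    return G, gbest

mu = [0.10, 0.12, 0.07]
cov = [[0.09, 0.01, 0.00], [0.01, 0.12, 0.02], [0.00, 0.02, 0.05]]
cost = [0.002, 0.004, 0.001]
weights, value = pso_portfolio(mu, cov, cost)
```

The admissible-error constraints of the paper would enter as intervals around `mu` and `cov`; the clip-and-renormalize step is the kind of constraint handling for which PSO is preferred over gradient methods here.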
Data Compression With Application to Geo-Location
2010-08-01
wireless sensor network requires the estimation of time-difference-of-arrival (TDOA) parameters using data collected by a set of spatially separated sensors. Compressing the data that is shared among the sensors can provide tremendous savings in terms of the energy and transmission latency. Traditional MSE and perceptual based data compression schemes fail to accurately capture the effects of compression on the TDOA estimation task; therefore, it is necessary to investigate compression algorithms suitable for TDOA parameter estimation. This thesis explores the
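The TDOA estimation task that the compression scheme must preserve can be illustrated with a brute-force cross-correlation over candidate lags (a toy sketch; practical systems use FFT-based generalized cross-correlation):

```python
def tdoa_estimate(x, y):
    """Estimate the delay of y relative to x, in samples, as the lag that
    maximizes the cross-correlation sum. Brute force, for illustration."""
    n = len(x)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-n + 1, n):
        acc = 0.0
        for i in range(n):
            j = i - lag
            if 0 <= j < n:
                acc += y[i] * x[j]          # correlate y against shifted x
        if acc > best_val:
            best_val, best_lag = acc, lag
    return best_lag

# A pulse arriving 3 samples later at the second sensor gives TDOA = 3:
x = [0, 0, 1.0, 0.5, 0, 0, 0, 0, 0, 0]
y = [0, 0, 0, 0, 0, 1.0, 0.5, 0, 0, 0]
```

A compression scheme tuned to mean-squared error can smear exactly the sharp correlation peak this estimator depends on, which is the mismatch the thesis investigates.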
Symmetry Beyond Perturbation Theory: Floppy Molecules and Rotation-Vibration States
NASA Astrophysics Data System (ADS)
Schmiedt, Hanno; Schlemmer, Stephan; Jensen, Per
2015-06-01
In the customary approach to the theoretical description of the nuclear motion in molecules, the molecule is seen as a near-static structure rotating in space. Vibrational motion causing small structural deformations induces a perturbative treatment of the rotation-vibration interaction, which fails in fluxional molecules, where all vibrational motions are large compared to the linear extension of the molecule. An example is protonated methane (CH_5^+). For this molecule, customary theory fails to simulate reliably even the low-energy spectrum. Within the traditional view of rotation and vibration being near-separable, rotational and vibrational wavefunctions can be symmetry classified separately in the molecular symmetry (MS) group. In the present contribution we discuss a fundamental group theoretical approach to the problem of determining the symmetries of molecular rotation-vibration states. We will show that all MS groups discussed so far are subgroups of the special orthogonal group in three dimensions, SO(3). This leads to a group theoretical foundation of the technique of equivalent rotations. The MS group of protonated methane (G240) represents, to the best of our knowledge, the first example of an MS group which is not a subgroup of SO(3) (nor of O(3) nor of SU(2)). Because of this, a separate symmetry classification of vibrational and rotational wavefunctions becomes impossible in this MS group, consistent with the fact that a decoupling of vibrational and rotational motion is impossible. We want to discuss the consequences of this. In conclusion, we show that the prototypical floppy molecule CH_5^+ represents a new class of molecules, where usual group theoretical methods for determining selection rules and spectral assignments fail, so that new methods have to be developed.
References: P. Kumar and D. Marx, Physical Chemistry Chemical Physics 8, 573 (2006); Z. Jin, B. J. Braams, and J. M. Bowman, The Journal of Physical Chemistry A 110, 1569 (2006); A. S. Petit, J. E. Ford, and A. B. McCoy, The Journal of Physical Chemistry A 118, 7206 (2014); P. R. Bunker and P. Jensen, Molecular Symmetry and Spectroscopy (NRC Research Press, Ottawa, Canada, 1998). Being precise, we must include O(3) and SU(2), but our theory can be easily extended to these two groups. H. C. Longuet-Higgins, Molecular Physics 6, 445 (1963).
NASA Astrophysics Data System (ADS)
Herrera, D.; Bennadji, A.
2013-07-01
In order to achieve the CO2 reduction targets set by the Scottish government, it will be necessary to improve the energy efficiency of existing buildings. Within the total Scottish building stock, historic and traditionally constructed buildings are an important proportion, in the order of 19% (Curtis, 2010), and represent cultural, emotional and identity values that should be protected. However, retrofit interventions can be complex operations because of the several aspects involved in the hygrothermal performance of traditional buildings. Moreover, all these factors interact with each other and therefore need to be analysed as a whole. Upgrading the envelope of traditional buildings may severely change moisture migration, leading to superficial or interstitial condensation and thus fabric decay and mould growth. Retrofit projects carried out in the past have failed because of misunderstanding, or the lack of expert prediction, of the potential consequences associated with the envelope's alteration. The evaluation of potential risks, prior to any alteration of the building's physics in order to improve its energy efficiency, is critical to avoid future damage to the wall's performance or to occupants' health and well-being. The aim of this PhD research project is to point out the most critical aspects of improving the energy efficiency of traditional buildings and to develop a risk-based methodology that helps owners and practitioners during the decision-making process.
Thomas, Lisa R.; Donovan, Dennis M.; Sigo, Robin LW.; Austin, Lisette; Marlatt, G. Alan
2010-01-01
Alcohol and drug abuse are major areas of concern for many American Indian/Alaska Native communities. Research on these problems has often been less than successful, in part because many researchers are not sensitive to the culture and traditions of the tribes and communities with which they are working. They also often fail to incorporate tribal customs, traditions, and values into the interventions developed to deal with substance abuse. We describe the use of Community-Based Participatory Research (CBPR) and Tribal Participatory Research (TPR) approaches to develop a culturally sensitive substance abuse prevention program for Native youth. This project, The Community Pulling Together: Healing of the Canoe, is a collaboration between the Suquamish Tribe and the Alcohol and Drug Abuse Institute at the University of Washington. PMID:20157631
Strapdown cost trend study and forecast
NASA Technical Reports Server (NTRS)
Eberlein, A. J.; Savage, P. G.
1975-01-01
The potential cost advantages offered by advanced strapdown inertial technology in future commercial short-haul aircraft are summarized. The initial procurement cost and six year cost-of-ownership, which includes spares and direct maintenance cost were calculated for kinematic and inertial navigation systems such that traditional and strapdown mechanization costs could be compared. Cost results for the inertial navigation systems showed that initial costs and the cost of ownership for traditional triple redundant gimbaled inertial navigators are three times the cost of the equivalent skewed redundant strapdown inertial navigator. The net cost advantage for the strapdown kinematic system is directly attributable to the reduction in sensor count for strapdown. The strapdown kinematic system has the added advantage of providing a fail-operational inertial navigation capability for no additional cost due to the use of inertial grade sensors and attitude reference computers.
On explicit algebraic stress models for complex turbulent flows
NASA Technical Reports Server (NTRS)
Gatski, T. B.; Speziale, C. G.
1992-01-01
Explicit algebraic stress models that are valid for three-dimensional turbulent flows in noninertial frames are systematically derived from a hierarchy of second-order closure models. This represents a generalization of the model derived by Pope, who based his analysis on the Launder, Reece, and Rodi model restricted to two-dimensional turbulent flows in an inertial frame. The relationship between the new models and traditional algebraic stress models -- as well as anisotropic eddy viscosity models -- is theoretically established. The need for regularization is demonstrated in an effort to explain why traditional algebraic stress models have failed in complex flows. It is also shown that these explicit algebraic stress models can shed new light on what second-order closure models predict for the equilibrium states of homogeneous turbulent flows and can serve as a useful alternative in practical computations.
Aragão, José Aderval; Freire, Marianna Ribeiro de Menezes; Nolasco Farias, Lucas Guimarães; Diniz, Sarah Santana; Sant'anna Aragão, Felipe Matheus; Sant'anna Aragão, Iapunira Catarina; Lima, Tarcisio Brandão; Reis, Francisco Prado
2018-06-01
To compare depressive symptoms among medical students taught using problem-based learning (PBL) and the traditional method, Beck's Depression Inventory was applied to 215 medical students. The prevalence of depression was calculated as the number of individuals with depression divided by the total number in the sample from each course, with 95% confidence intervals. The statistical significance level used was 5% (p ≤ .05). Among the 215 students, 52.1% were male and 47.9% were female; 51.6% were being taught using the PBL methodology and 48.4% using traditional methods. The prevalence of depression was 29.73% with PBL and 22.12% with traditional methods. There was a higher prevalence among females: 32.8% with PBL and 23.1% with traditional methods. The prevalence of depression with PBL among students up to 21 years of age was 29.4%, and among those over 21 years, 32.1%. With traditional methods, among students up to 21 years of age it was 16.7%, and among those over 21 years, 30.1%. The prevalence of depression with PBL was highest among students in the second semester and, with traditional methods, in the eighth. Depressive symptoms were highly prevalent among students taught both with PBL and with traditional methods.
Neuhauser, Linda; Kreps, Gary L
2014-12-01
Traditional communication theory and research methods provide valuable guidance about designing and evaluating health communication programs. However, efforts to use health communication programs to educate, motivate, and support people to adopt healthy behaviors often fail to meet the desired goals. One reason for this failure is that health promotion issues are complex, changeable, and highly related to the specific needs and contexts of the intended audiences. It is a daunting challenge to effectively influence health behaviors, particularly culturally learned and reinforced behaviors concerning lifestyle factors related to diet, exercise, and substance (such as alcohol and tobacco) use. Too often, program development and evaluation are not adequately linked to provide rapid feedback to health communication program developers so that important revisions can be made to design the most relevant and personally motivating health communication programs for specific audiences. Design science theory and methods commonly used in engineering, computer science, and other fields can address such program and evaluation weaknesses. Design science researchers study human-created programs using tightly connected build-and-evaluate loops in which they use intensive participatory methods to understand problems and develop solutions concurrently and throughout the duration of the program. Such thinking and strategies are especially relevant to address complex health communication issues. In this article, the authors explore the history, scientific foundation, methods, and applications of design science and its potential to enhance health communication programs and their evaluation.
Introduction to benchmark dose methods and U.S. EPA's benchmark dose software (BMDS) version 2.1.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, J. Allen, E-mail: davis.allen@epa.gov; Gift, Jeffrey S.; Zhao, Q. Jay
2011-07-15
Traditionally, the No-Observed-Adverse-Effect-Level (NOAEL) approach has been used to determine the point of departure (POD) from animal toxicology data for use in human health risk assessments. However, this approach is subject to substantial limitations that have been well defined, such as strict dependence on the dose selection, dose spacing, and sample size of the study from which the critical effect has been identified. Also, the NOAEL approach fails to take into consideration the shape of the dose-response curve and other related information. The benchmark dose (BMD) method, originally proposed as an alternative to the NOAEL methodology in the 1980s, addresses many of the limitations of the NOAEL method. It is less dependent on dose selection and spacing, and it takes into account the shape of the dose-response curve. In addition, the estimation of a BMD 95% lower bound confidence limit (BMDL) results in a POD that appropriately accounts for study quality (i.e., sample size). With the recent advent of user-friendly BMD software programs, including the U.S. Environmental Protection Agency's (U.S. EPA) Benchmark Dose Software (BMDS), BMD has become the method of choice for many health organizations world-wide. This paper discusses the BMD methods and corresponding software (i.e., BMDS version 2.1.1) that have been developed by the U.S. EPA, and includes a comparison with recently released European Food Safety Authority (EFSA) BMD guidance.
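Given a fitted dichotomous dose-response model, the BMD for a chosen benchmark response (BMR) is the dose at which extra risk over background reaches the BMR. A sketch with an assumed log-logistic fit (the parameters are illustrative, not BMDS output, and the BMDL confidence bound is not computed here):

```python
import math

def extra_risk(p, background):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    return (p - background) / (1.0 - background)

def bmd_from_model(prob, bmr=0.10, lo=1e-9, hi=1e6, tol=1e-10):
    """Benchmark dose for a dichotomous dose-response model prob(d):
    the dose at which extra risk reaches bmr. Bisection in log-dose,
    assuming prob is increasing in dose."""
    g = prob(0.0)                         # background response
    for _ in range(200):
        mid = math.sqrt(lo * hi)          # geometric midpoint
        if extra_risk(prob(mid), g) < bmr:
            lo = mid
        else:
            hi = mid
        if hi / lo < 1.0 + tol:
            break
    return math.sqrt(lo * hi)

# Illustrative log-logistic dose-response (parameters assumed, not fitted):
def loglogistic(d, g=0.05, a=-4.0, b=1.2):
    if d <= 0:
        return g
    return g + (1.0 - g) / (1.0 + math.exp(-a - b * math.log(d)))

bmd10 = bmd_from_model(loglogistic, bmr=0.10)   # BMD at 10% extra risk
```

Unlike a NOAEL, this dose depends on the whole fitted curve, not on which discrete doses the study happened to test; BMDS additionally profiles the likelihood to report the BMDL.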
NASA Astrophysics Data System (ADS)
Jende, Phillipp; Nex, Francesco; Gerke, Markus; Vosselman, George
2018-07-01
Mobile Mapping (MM) solutions have become a significant extension to traditional data acquisition methods over the last years. Independently of the sensor carried by a platform, be it laser scanners or cameras, high-resolution data postings are offset by poor absolute localisation accuracy in urban areas due to GNSS occlusions and multipath effects. Potentially inaccurate position estimates are propagated by IMUs, which are furthermore prone to drift effects. Thus, reliable and accurate absolute positioning on a par with MM's high-quality data remains an open issue. Multiple and diverse approaches have shown promising potential to mitigate GNSS errors in urban areas, but cannot achieve decimetre accuracy, require manual effort, or have limitations with respect to costs and availability. This paper presents a fully automatic approach to support the correction of MM imaging data based on correspondences with airborne nadir images. These correspondences can be employed to correct the MM platform's orientation by an adjustment solution. Unlike MM as such, aerial images do not suffer from GNSS occlusions, and their accuracy is usually verified by employing well-established methods using ground control points. However, registration between MM and aerial images is a non-standard matching scenario and requires several strategies to yield reliable and accurate correspondences. Scale, perspective and content vary strongly between the two image sources, so traditional feature matching methods may fail. To this end, the registration process is designed to focus on common and clearly distinguishable elements, such as road markings, manholes, or kerbstones. With a registration accuracy of about 98%, reliable tie information between MM and aerial data can be derived. Although the adjustment strategy is not covered in its entirety in this paper, accuracy results after adjustment will be presented.
It will be shown that a decimetre accuracy is well achievable in a real data test scenario.
Is traditional contraceptive use in Moldova associated with poverty and isolation?
Lyons-Amos, Mark J; Durrant, Gabriele B; Padmadas, Sabu S
2011-05-01
This study investigates the correlates of traditional contraceptive use in Moldova, a poor country in Europe with one of the highest proportions of traditional contraceptive method users. The high reliance on traditional methods, particularly in the context of sub-replacement level fertility rate, has not been systematically evaluated in demographic research. Using cross-sectional data on a sub-sample of 6039 sexually experienced women from the 2005 Moldovan Demographic and Health Survey, this study hypothesizes that (a) economic and spatial disadvantages increase the likelihood of traditional method use, and (b) high exposure to family planning/reproductive health (FP/RH) programmes increases the propensity to modern method use. Multilevel multinomial models are used to examine the correlates of traditional method use controlling for exposure to sexual activity, socioeconomic and demographic characteristics and data structure. The results show that economic disadvantage increases the probability of traditional method use, but the overall effect is small. Although higher family planning media exposure decreases the reliance on traditional methods among younger women, it has only a marginal effect in increasing modern method use among older women. Family planning programmes designed to encourage women to switch from traditional to modern methods have some success--although the effect is considerably reduced in regions outside of the capital Chisinau. The study concludes that FP/RH efforts directed towards the poorest may have limited impact, but interventions targeted at older women could reduce the burden of unwanted pregnancies and abortions. Addressing differentials in accessing modern methods could improve uptake in rural areas.
Cheng, Gong; Huang, Lu-qi; Xue, Da-yuan; Zhang, Xiao-bo
2014-12-01
The survey of traditional knowledge related to Chinese materia medica resources is an important component and one of the innovative aspects of the fourth national survey of Chinese materia medica resources. China has a rich body of traditional knowledge of traditional Chinese medicine (TCM), and the comprehensive investigation of TCM traditional knowledge aims to promote the conservation and sustainable use of Chinese materia medica resources. Building upon the field work of pilot investigations, this paper introduces the essential procedures and key methods for conducting the survey of traditional knowledge related to Chinese materia medica resources. The essential procedures are as follows. First is the preparation phase. It is important to review all relevant literature and provide training to the survey teams so that they have a clear understanding of the concept of traditional knowledge and master the key survey methods. Second is the field investigation phase. When conducting field investigations, survey teams should identify traditional knowledge holders by using the "snowball method" and record the traditional knowledge after obtaining prior informed consent from the holders. Researchers should fill out the survey forms provided by the Technical Specification of the Fourth National Survey of Chinese Materia Medica Resources, paying particular attention to the scope of the traditional knowledge and the method by which it is inherited, which are the key information traditional knowledge holders and potential users need in order to reach mutually agreed terms and achieve benefit sharing. Third is the data compilation and analysis phase. Researchers should compile and edit the TCM traditional knowledge in accordance with intellectual property rights requirements so that the information collected through the national survey can serve as the basic data for a TCM traditional knowledge database.
The key methods of the survey include regional division of Chinese materia medica resources, interviews with key information holders and standardization of information. In particular, the "snowball method" can effectively identify traditional knowledge holders in the targeted regions, and ensuring that holders give prior informed consent before sharing information with researchers protects the holders' rights. Employing the right survey methods is not only the key to documenting traditional knowledge related to Chinese materia medica resources, but also the pathway to fulfilling the access and benefit-sharing objectives stipulated in the Convention on Biological Diversity. It will promote the legal protection of TCM traditional knowledge and the conservation of TCM intangible cultural heritage.
Modified sugar adulteration test applied to New Zealand honey.
Frew, Russell; McComb, Kiri; Croudis, Linda; Clark, Dianne; Van Hale, Robert
2013-12-15
The carbon isotope method (AOAC 998.12) compares the bulk honey carbon isotope value with that of the extracted protein, with a difference greater than 1‰ suggesting that the protein and the bulk carbohydrate have different origins. New Zealand Manuka honey is a high value product and often fails this test. It has been suggested that such failures are due to the pollen in the Manuka honey, and an adaptation of the method to remove pollen prior to testing has been proposed. Here we test 64 authentic honey samples collected directly from the hives and find that a large proportion (37%) of Manuka honeys fail the test. Of these, 60% still fail the adapted method. These honey samples were collected and processed under stringent conditions and have not been adulterated post-harvest. More work is required to ascertain the cause of these test failures. Copyright © 2013 Elsevier Ltd. All rights reserved.
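The screening logic described in this abstract reduces to a one-line comparison. The sketch below is illustrative only: the full AOAC 998.12 procedure involves more than this threshold, and the function name and δ13C values are hypothetical.

```python
# Illustrative sketch (not the official AOAC 998.12 procedure): flag a honey
# sample when the bulk-honey and extracted-protein delta13C values differ by
# more than 1 per mil, the screening criterion described in the abstract.

def isotope_screen(d13c_honey: float, d13c_protein: float, limit: float = 1.0) -> bool:
    """Return True if the sample passes (difference within `limit` per mil)."""
    return abs(d13c_protein - d13c_honey) <= limit

# Hypothetical values: authentic Manuka honeys can fail this screen even
# without adulteration, as the study reports.
print(isotope_screen(-25.1, -25.6))  # difference 0.5 per mil, passes
print(isotope_screen(-23.8, -25.3))  # difference 1.5 per mil, fails
```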
NASA Technical Reports Server (NTRS)
Pirello, C. J.; Herring, R. L.
1976-01-01
Conceptual designs of a fail-safe abort system for hydrogen-fueled, actively cooled high-speed aircraft are examined. The fail-safe concept depends on three main factors: (1) a reliable method of detecting a failure or malfunction in the active cooling system, (2) the optimization of abort trajectories which minimize the descent heat load to the aircraft, and (3) fail-safe thermostructural concepts to minimize both the weight and the maximum temperature the structure will reach during descent. These factors are examined and promising approaches are evaluated based on weight, reliability, ease of manufacture and cost.
Non-traditional stable isotope behaviors in immiscible silica-melts in a mafic magma chamber.
Zhu, Dan; Bao, Huiming; Liu, Yun
2015-12-01
Non-traditional stable isotopes have increasingly been applied to studies of igneous processes including planetary differentiation. Equilibrium isotope fractionation of these elements in silicates is expected to be negligible at magmatic temperatures (δ(57)Fe difference often less than 0.2 per mil). However, a growing body of data has revealed a puzzling observation: for example, the δ(57)Fe of silicic magmas ranges from 0‰ up to 0.6‰, with the most positive δ(57)Fe almost exclusively found in A-type granitoids. Several interpretations have been proposed by different research groups, but these have so far failed to explain some aspects of the observations. Here we propose a dynamic, diffusion-induced isotope fractionation model that assumes Si-melts are growing and ascending immiscibly in a Fe-rich bulk magma chamber. Our model offers predictions on the behavior of non-traditional stable isotopes such as Fe, Mg, Si, and Li that are consistent with observations from many A-type granitoids, especially those associated with layered intrusions. Diffusion-induced isotope fractionation may be more commonly preserved in magmatic rocks than was originally predicted.
Algorithms Bridging Quantum Computation and Chemistry
NASA Astrophysics Data System (ADS)
McClean, Jarrod Ryan
The design of new materials and chemicals derived entirely from computation has long been a goal of computational chemistry, and the governing equation whose solution would permit this dream is known. Unfortunately, the exact solution to this equation has been far too expensive and clever approximations fail in critical situations. Quantum computers offer a novel solution to this problem. In this work, we not only develop new algorithms that use quantum computers to study hard problems in chemistry, but also explore how such algorithms can help us to better understand and improve our traditional approaches. In particular, we first introduce a new method, the variational quantum eigensolver, which is designed to maximally utilize the quantum resources available in a device to solve chemical problems. We apply this method in a real quantum photonic device in the lab to study the dissociation of the helium hydride (HeH+) molecule. We also enhance this methodology with architecture-specific optimizations on ion trap computers and show how linear-scaling techniques from traditional quantum chemistry can be used to improve the outlook of similar algorithms on quantum computers. We then show how studying quantum algorithms such as these can be used to understand and enhance the development of classical algorithms. In particular we use a tool from adiabatic quantum computation, Feynman's Clock, to develop a new discrete-time variational principle and further establish a connection between real-time quantum dynamics and ground state eigenvalue problems. We use these tools to develop two novel parallel-in-time quantum algorithms that outperform competing algorithms as well as offer new insights into the connection between the fermion sign problem of ground states and the dynamical sign problem of quantum dynamics. Finally we use insights gained in the study of quantum circuits to explore a general notion of sparsity in many-body quantum systems.
In particular we use developments from the field of compressed sensing to find compact representations of ground states. As an application we study electronic systems and find solutions dramatically more compact than traditional configuration interaction expansions, offering hope to extend this methodology to challenging systems in chemical and material design.
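The variational principle underlying the variational quantum eigensolver can be illustrated with a classical toy: minimizing the energy expectation of a parametrized trial state upper-bounds, and here recovers, the ground-state energy. The 2x2 Hamiltonian and trial ansatz below are arbitrary illustrations, not the HeH+ problem studied in this thesis.

```python
import numpy as np

# Classical toy of the variational principle behind the variational quantum
# eigensolver: scan a parametrized trial state and take the lowest energy
# expectation <psi(theta)|H|psi(theta)>, which upper-bounds the true
# ground-state energy. The 2x2 Hamiltonian here is arbitrary, not HeH+.

H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def energy(theta: float) -> float:
    psi = np.array([np.cos(theta), np.sin(theta)])  # real unit trial state
    return float(psi @ H @ psi)

thetas = np.linspace(0.0, np.pi, 2001)
e_min = min(energy(t) for t in thetas)
exact = np.linalg.eigvalsh(H)[0]
print(abs(e_min - exact) < 1e-5)  # variational minimum matches ground energy
```

On a real device the same loop runs with the energy estimated from measurements of a quantum state rather than from a vector in memory.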
Setting and validating the pass/fail score for the NBDHE.
Tsai, Tsung-Hsun; Dixon, Barbara Leatherman
2013-04-01
This report describes the overall process used for setting the pass/fail score for the National Board Dental Hygiene Examination (NBDHE), which was set using the Objective Standard Setting (OSS) method. The OSS method requires a panel of experts to determine the criterion items, the proportion of these items that minimally competent candidates would answer correctly (the percentage of mastery) and the confidence level of the error band. A panel of 11 experts was selected by the Joint Commission on National Dental Examinations (Joint Commission). Panel members represented geographic distribution across the U.S. and had the following characteristics: full-time dental hygiene practitioners with experience in areas of preventive, periodontal, geriatric and special needs care, and full-time dental hygiene educators with experience in areas of scientific basis for dental hygiene practice, provision of clinical dental hygiene services and community health/research principles. Utilizing the expert panel's judgments, the pass/fail score was set and then the score scale was established using the Rasch measurement model. Statistical and psychometric analysis shows that the actual failure rate and the OSS failure rate are reasonably consistent (2.4% vs. 2.8%). The analysis also showed that the lowest error of measurement (an index of precision) and the highest reliability (0.97) are achieved at the pass/fail score point. The pass/fail score is a valid guide for making decisions about candidates for dental hygiene licensure. This new standard was reviewed and approved by the Joint Commission and was implemented beginning in 2011.
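The OSS arithmetic described in this abstract (criterion items, mastery percentage, error band) can be sketched as follows. All numbers and the function name are hypothetical illustrations; the real NBDHE standard was additionally scaled with the Rasch measurement model.

```python
# Simplified sketch of the Objective Standard Setting arithmetic described
# above. All numbers are hypothetical; the real NBDHE standard was set by an
# 11-member panel and placed on a Rasch-scaled score scale.

def oss_cut_score(n_criterion_items: int, mastery: float,
                  sem: float = 0.0, z: float = 0.0) -> float:
    """Raw cut score: expected criterion-item score for a minimally
    competent candidate, optionally lowered by z * SEM (the error band)."""
    return n_criterion_items * mastery - z * sem

# Hypothetical panel judgment: 100 criterion items, 65% mastery, and one
# standard error of measurement (3 points) subtracted as the error band.
print(oss_cut_score(100, 0.65, sem=3.0, z=1.0))  # 62.0
```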
Cognitive balanced model: a conceptual scheme of diagnostic decision making.
Lucchiari, Claudio; Pravettoni, Gabriella
2012-02-01
Diagnostic reasoning is a critical aspect of clinical performance, having a high impact on quality and safety of care. Although diagnosis is fundamental in medicine, we still have a poor understanding of the factors that determine its course. According to traditional understanding, all information used in diagnostic reasoning is objective and logically driven. However, these conditions are not always met. Although we would be less likely to make an inaccurate diagnosis when following rational decision making, as described by normative models, the real diagnostic process works in a different way. Recent work has described the major cognitive biases in medicine as well as a number of strategies for reducing them, collectively called debiasing techniques. However, advances have encountered obstacles in achieving implementation into clinical practice. While the traditional understanding of clinical reasoning has failed to consider contextual factors, most debiasing techniques have so far failed to produce sounder and safer medical practice. Technological solutions, being data driven, are fundamental in increasing care safety, but they need to consider human factors. Thus, balanced models, cognitive driven and technology based, are needed in day-to-day applications to actually improve the diagnostic process. The purpose of this article, then, is to provide insight into cognitive influences that have resulted in wrong, delayed or missed diagnoses. Using a cognitive approach, we describe the basis of medical error, with particular emphasis on diagnostic error. We then propose a conceptual scheme of the diagnostic process by the use of fuzzy cognitive maps. © 2011 Blackwell Publishing Ltd.
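A fuzzy cognitive map of the kind the authors propose can be sketched minimally as a weighted concept graph iterated to a fixed point. The concepts, weights, and update rule below are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Minimal fuzzy-cognitive-map sketch (hypothetical concepts and weights, not
# the authors' diagnostic model): concept activations are pushed through a
# weighted adjacency matrix plus self-memory and a sigmoid until they settle.

def fcm_run(W, x0, steps=50):
    """Iterate x <- sigmoid(x + W @ x) toward a fixed point."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-(x + W @ x)))
    return x

# Hypothetical concepts: 0 = symptom evidence, 1 = cognitive bias,
# 2 = diagnostic confidence (driven by both evidence and bias).
W = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.8, 0.6, 0.0]])
x = fcm_run(W, [1.0, 1.0, 0.0])
print(x.round(2))  # confidence settles higher than either driving concept
```

The point of such a map is qualitative: it makes explicit that confidence can be driven by bias as much as by evidence.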
Myers, T J; Kytömaa, H K; Smith, T R
2007-04-11
Fiberglass reinforced plastic (FRP) composite materials are often used to construct tanks, piping, scrubbers, beams, grating, and other components for use in corrosive environments. While FRP typically offers superior and cost effective corrosion resistance relative to other construction materials, the glass fibers traditionally used to provide the structural strength of the FRP can be susceptible to attack by the corrosive environment. The structural integrity of traditional FRP components in corrosive environments is usually dependent on the integrity of a corrosion-resistant barrier, such as a resin-rich layer containing corrosion resistant glass fibers. Without adequate protection, FRP components can fail at loads well below their design limits through an environmental stress-corrosion cracking (ESCC) mechanism when simultaneously exposed to mechanical stress and a corrosive chemical environment. Failure of these components can result in significant releases of hazardous substances into plants and the environment. In this paper, we present two case studies where fiberglass components failed due to ESCC at small chemical manufacturing facilities. As is often typical, the small chemical manufacturing facilities relied largely on FRP component suppliers to determine materials appropriate for the specific process environment and to repair damaged in-service components. We discuss the lessons learned from these incidents and precautions companies should take when interfacing with suppliers and other parties during the specification, design, construction, and repair of FRP components in order to prevent similar failures and chemical releases from occurring in the future.
A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands
NASA Astrophysics Data System (ADS)
Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.
2009-12-01
Ecosystem processes are influenced by climatic trends at multiple temporal scales including diel patterns and other mid-term climatic modes, such as interannual and seasonal variability. Because interactions between biophysical components of ecosystem processes are complex, it is important to test how models perform in frequency (e.g. hours, days, weeks, months, years) and time (i.e. day of the year) domains in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation using wavelet time series analysis of model performance in seven Mediterranean Oak Woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) on measured variables of gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months), likely because these models do not represent the water pulse dynamics that influence GPP and ET at these Mediterranean systems. To improve a model's performance it is critical first to identify where and when it fails. Only then can we improve the model, use it as a prognostic tool, and generate further hypotheses that can be tested by new experiments and measurements.
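The idea of evaluating a model band by band rather than via annual sums can be sketched with a plain FFT standing in for the wavelet analysis used in the study; the synthetic series, period band, and error metric below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of spectral model evaluation: compare modeled and observed
# series within a period band instead of via annual sums. A plain FFT stands
# in for the study's wavelet analysis; all data here are synthetic.

def band_error(obs, mod, dt_days=1.0, band=(7.0, 90.0)):
    """Mean squared spectral error of (obs - mod) within a period band (days)."""
    freqs = np.fft.rfftfreq(len(obs), d=dt_days)
    with np.errstate(divide="ignore"):
        periods = np.where(freqs > 0, 1.0 / freqs, np.inf)
    sel = (periods >= band[0]) & (periods <= band[1])
    diff = np.fft.rfft(np.asarray(obs) - np.asarray(mod))
    return float(np.mean(np.abs(diff[sel]) ** 2))

t = np.arange(365.0)
obs = np.sin(2 * np.pi * t / 365) + 0.5 * np.sin(2 * np.pi * t / 30)
mod_good = obs.copy()
mod_no_pulse = np.sin(2 * np.pi * t / 365)  # captures the seasonal cycle only
print(band_error(obs, mod_no_pulse) > band_error(obs, mod_good))  # True
```

A model that reproduces annual totals can still score badly in the weekly-to-monthly band, which is exactly the failure mode the study reports.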
Basu, Sanjay
2002-01-01
Although malaria is a growing problem affecting several hundred million people each year, many malarial countries lack successful disease control programs. Worldwide malaria incidence rates are dramatically increasing, generating fear among many people who are witnessing malaria control initiatives fail. In this paper, we explore two options for malaria control in poor countries: (1) the production and distribution of a malaria vaccine and (2) the control of mosquitoes that harbor the malaria parasite. We first demonstrate that the development of a malaria vaccine is indeed likely, although it will take several years to produce because of both biological obstacles and insufficient research support. The distribution of such a vaccine, as suggested by some economists, will require that wealthy states promise a market to pharmaceutical companies who have traditionally failed to investigate diseases affecting the poorest of nations. But prior to the development of a malaria vaccine, we recommend the implementation of vector control programs, such as those using Bti toxin, in regions with low vector capacity. Our analysis indicates that both endogenous programs in malarial regions and molecular approaches to parasite control will provide pragmatic solutions to the malaria problem. But the successful control of malaria will require sustained support from wealthy nations, without whom vaccine development and vector control programs will likely fail.
The theory of multiple stupidities: education, technology and organisation in Arabia.
Al Lily, Abdulrahman Essa; Alhazmi, Ahmed Ali; Alzahrani, Saleh
2017-11-01
Traditional perspectives have envisaged intelligence as one entity dominated by a single set of abilities (i.e. cognitive abilities), whereas modern perspectives have defined intelligence in various shapes (e.g. linguistic, musical and interpersonal intelligences). By the same token, traditional perspectives have examined stupidity as one set of inabilities (i.e. cognitive inabilities). However, it is not clear whether modern perspectives have discussed whether stupidity exists in various forms, in the same way as they have envisaged intelligence. To address this limitation, 257 university members were asked to share what they perceived as being stupid educational and technological practices in their institutions. Analysis of the data suggested three concepts were important to the members: moral, spatial and administrative stupidities. That is, stupidity is perceived to come in the form of failing to meet certain moral, spatial and administrative values. This implies that modern perspectives may conceptualise stupidity differently from traditional perspectives, seeing it as going beyond cognitive inabilities and viewing it as existing in various forms (e.g. moral, spatial and administrative stupidities). Thus, there are multiple stupidities, just as there are multiple forms of intelligence. A strength of this research is that it views stupidity through an organisational and qualitative lens, although some may traditionally expect such a topic to be examined quantitatively through psychometric and biological approaches.
The reliability of the pass/fail decision for assessments comprised of multiple components.
Möltner, Andreas; Tımbıl, Sevgi; Jünger, Jana
2015-01-01
The decision having the most serious consequences for a student taking an assessment is the one to pass or fail that student. For this reason, the reliability of the pass/fail decision must be determined for high quality assessments, just as the measurement reliability of the point values. Assessments in a particular subject (graded course credit) are often composed of multiple components that must be passed independently of each other. When "conjunctively" combining separate pass/fail decisions, as with other complex decision rules for passing, adequate methods of analysis are necessary for estimating the accuracy and consistency of these classifications. To date, very few papers have addressed this issue; a generally applicable procedure was published by Douglas and Mislevy in 2010. Using the example of an assessment comprised of several parts that must be passed separately, this study analyzes the reliability underlying the decision to pass or fail students and discusses the impact of an improved method for identifying those who do not fulfill the minimum requirements. The accuracy and consistency of the decision to pass or fail an examinee in the subject cluster Internal Medicine/General Medicine/Clinical Chemistry at the University of Heidelberg's Faculty of Medicine was investigated. This cluster requires students to separately pass three components (two written exams and an OSCE), whereby students may reattempt to pass each component twice. Our analysis was carried out using the method described by Douglas and Mislevy. Frequently, when complex logical connections exist between the individual pass/fail decisions in the case of low failure rates, only a very low reliability for the overall decision to grant graded course credit can be achieved, even if high reliabilities exist for the various components. 
For the example analyzed here, the classification accuracy and consistency when conjunctively combining the three individual parts are relatively low (κ=0.49 and κ=0.47, respectively), despite the good reliability of over 0.75 for each of the three components. The option to repeat each component twice leads to a situation in which only about half of the candidates who do not satisfy the minimum requirements would fail the overall assessment, while the other half is able to continue their studies despite having deficient knowledge and skills. The method put forth by Douglas and Mislevy allows the analysis of the decision accuracy and consistency for complex combinations of scores from different components. Even in the case of highly reliable components, a reliable pass/fail decision is not guaranteed, for instance in the case of low failure rates. Assessments must be administered with the explicit goal of identifying examinees who do not fulfill the minimum requirements.
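The effect of retakes on the conjunctive pass/fail decision can be illustrated with back-of-envelope arithmetic; the per-attempt false-pass probability below is a hypothetical figure chosen to reproduce the 'about half' result, not a number from Douglas and Mislevy's procedure.

```python
# Back-of-envelope sketch (hypothetical numbers, not the Douglas and Mislevy
# model): if a non-master has per-attempt probability p of falsely passing
# one component, two allowed retakes raise that to 1 - (1 - p)^3, and the
# conjunctive rule requires passing every component.

def false_pass_prob(p_single: float, components: int = 3, attempts: int = 3) -> float:
    per_component = 1.0 - (1.0 - p_single) ** attempts
    return per_component ** components

# With a hypothetical 41% per-attempt false-pass rate, roughly half of
# non-masters clear all three components once two retakes are allowed,
# but only about 7% would on a single attempt per component.
print(round(false_pass_prob(0.41), 3))               # with two retakes
print(round(false_pass_prob(0.41, attempts=1), 3))   # single attempt
```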
Fernández-Santander, Ana
2008-01-01
The informal activities of cooperative learning and short periods of lecturing have been combined and used in the university teaching of biochemistry as part of the first year course of Optics and Optometry in the academic years 2004-2005 and 2005-2006. The lessons were previously elaborated by the teacher and included all that is necessary to understand the topic (text, figures, graphics, diagrams, pictures, etc.). Additionally, a questionnaire was prepared for every chapter. All lessons contained three parts: objectives, approach and development, and the assessment of the topic. Team work, responsibility, and communication skills were some of the abilities developed with this new methodology. Students worked collaboratively in small groups of two or three following the teacher's instructions, with short periods of lecturing that clarified misunderstood concepts. Homework was minimized. On comparing this combined methodology with the traditional one (lecture only), students were found to exhibit a higher satisfaction with the new method. They were more involved in the learning process and had a better attitude toward the subject. The use of this new methodology showed a significant increase in the mean score of the students' academic results. The rate of students who failed the subject was significantly lower than in previous years, when only lecturing was applied. This combined methodology helped the teacher to better observe students' learning process and to act as a facilitator in the process of building students' knowledge. Copyright © 2008 International Union of Biochemistry and Molecular Biology, Inc.
Two-and-a-half-year-olds succeed at a traditional false-belief task with reduced processing demands
Scott, Rose M.; Baillargeon, Renée
2016-01-01
When tested with traditional false-belief tasks, which require answering a standard question about the likely behavior of an agent with a false belief, children perform below chance until age 4 y or later. When tested without such questions, however, children give evidence of false-belief understanding much earlier. Are traditional tasks difficult because they tap a more advanced form of false-belief understanding (fundamental-change view) or because they impose greater processing demands (processing-demands view)? Evidence that young children succeed at traditional false-belief tasks when processing demands are reduced would support the latter view. In prior research, reductions in inhibitory-control demands led to improvements in young children’s performance, but often only to chance (instead of below-chance) levels. Here we examined whether further reductions in processing demands might lead to success. We speculated that: (i) young children could respond randomly in a traditional low-inhibition task because their limited information-processing resources are overwhelmed by the total concurrent processing demands in the task; and (ii) these demands include those from the response-generation process activated by the standard question. This analysis suggested that 2.5-y-old toddlers might succeed at a traditional low-inhibition task if response-generation demands were also reduced via practice trials. As predicted, toddlers performed above chance following two response-generation practice trials; toddlers failed when these trials either were rendered less effective or were used in a high-inhibition task. These results support the processing-demands view: Even toddlers succeed at a traditional false-belief task when overall processing demands are reduced. PMID:27821728
Morgenstern, Hai; Rafaely, Boaz
2018-02-01
Spatial analysis of room acoustics is an ongoing research topic. Microphone arrays have been employed for spatial analyses with an important objective being the estimation of the direction-of-arrival (DOA) of direct sound and early room reflections using room impulse responses (RIRs). An optimal method for DOA estimation is the multiple signal classification algorithm. When RIRs are considered, this method typically fails due to the correlation of room reflections, which leads to rank deficiency of the cross-spectrum matrix. Preprocessing methods for rank restoration, which may involve averaging over frequency, for example, have been proposed exclusively for spherical arrays. However, these methods fail in the case of reflections with equal time delays, which may arise in practice and could be of interest. In this paper, a method is proposed for systems that combine a spherical microphone array and a spherical loudspeaker array, referred to as multiple-input multiple-output systems. This method, referred to as modal smoothing, exploits the additional spatial diversity for rank restoration and succeeds where previous methods fail, as demonstrated in a simulation study. Finally, combining modal smoothing with a preprocessing method is proposed in order to increase the number of DOAs that can be estimated using low-order spherical loudspeaker arrays.
Jang, Soobin; Park, Sunju; Jang, Bo-Hyoung; Park, Yu Lee; Lee, Ju Ah; Cho, Chung-Sik; Go, Ho-Yeon; Shin, Yong Cheol; Ko, Seong-Gyu
2017-01-01
Introduction Nicotine dependence is a disease, and tobacco use is related to 6 million deaths annually worldwide. Recently, in many countries, there has been growing interest in the use of traditional and complementary medicine (T&CM) methods, especially acupuncture, as therapeutic interventions for smoking cessation. The aim of this pilot study is to investigate the effectiveness of T&CM interventions on smoking cessation. Methods and analysis The STOP (Stop Tobacco Programme using traditional Korean medicine) study is designed to be a pragmatic, open-label, randomised pilot trial. This trial will evaluate whether adding T&CM methods (ie, ear and body acupuncture, aromatherapy) to conventional cessation methods (ie, nicotine replacement therapy (NRT), counselling) increases smoking cessation rates. Forty participants over 19 years old who are capable of communicating in Korean will be recruited. They will be current smokers who meet one of the following criteria: (1) smoke more than 10 cigarettes a day, (2) smoke fewer than 10 cigarettes a day and have previously failed to quit smoking, or (3) smoke fewer than 10 cigarettes a day and have a nicotine dependence score (Fagerstrom Test for Nicotine Dependence) of 4 points or more. The trial will consist of 4 weeks of treatment and a 20 week follow-up period. A statistician will perform the statistical analyses for both the intention-to-treat (all randomly assigned participants) and per-protocol (participants who completed the trial without any protocol deviations) data using SAS 9.1.3. Ethics and dissemination This study has been approved by the Institutional Review Board (IRB) of the Dunsan Korean Medicine Hospital of Daejeon University (IRB reference no: DJDSKH-15-BM-11–1, Protocol No. version 4.1.). The protocol will be reapproved by the IRB if it requires amendment. The trial will be conducted according to the Declaration of Helsinki, 7th version (2013).
This study is designed to minimise the risk to participants, and the investigators will explain the study to the participants in detail. As an ethical clinical trial, the control group will also be given conventional cessation treatments, including NRT and counselling. Participants will be screened and provided with a registration number to protect their personal information. Informed consent will be obtained from the participants prior to enrolling them in the trial. Participants will be allowed to withdraw at any time without penalty. Trial registration number ClinicalTrials.gov (NCT02768025); pre-results. PMID:28576892
Wijerathne, Buddhika; Rathnayake, Geetha
2013-01-01
Background Most universities currently practice traditional practical spot tests to evaluate students. However, traditional methods have several disadvantages. Computer-based examination techniques are becoming more popular among medical educators worldwide. Therefore incorporating the computer interface in practical spot testing is a novel concept that may minimize the shortcomings of traditional methods. Assessing students’ attitudes and perspectives is vital in understanding how students perceive the novel method. Methods One hundred and sixty medical students were randomly allocated to either a computer-based spot test (n=80) or a traditional spot test (n=80). The students rated their attitudes and perspectives regarding the spot test method soon after the test. The results were described comparatively. Results Students had higher positive attitudes towards the computer-based practical spot test compared to the traditional spot test. Their recommendations to introduce the novel practical spot test method for future exams and to other universities were statistically significantly higher. Conclusions The computer-based practical spot test is viewed as more acceptable to students than the traditional spot test. PMID:26451213
Tests of measurement invariance failed to support the application of the "then-test".
Nolte, Sandra; Elsworth, Gerald R; Sinclair, Andrew J; Osborne, Richard H
2009-11-01
The use of then-test (retrospective pre-test) scores has frequently been proposed as a solution to potential confounding of change scores because of response shift, as it is assumed that then-test and post-test responses are provided from the same perspective. However, this assumption has not been formally tested using robust quantitative methods. The aim of this study was to compare the psychometric performance of then-test/post-test with traditional pre-test/post-test data and to assess whether the resulting data structures support the application of the then-test for evaluations of chronic disease self-management interventions. Pre-test, post-test, and then-test data were collected from 314 participants of self-management courses using the Health Education Impact Questionnaire (heiQ). The derived change scores (pre-test/post-test; then-test/post-test) were examined for their psychometric performance using tests of measurement invariance. Few questionnaire items were noninvariant across pre-test/post-test, with four items identified and requiring removal to enable an unbiased comparison of factor means. In contrast, 12 items were identified and required removal in then-test/post-test data to avoid biased change score estimates. Traditional pre-test/post-test data appear to be robust with little indication of response shift. In contrast, the weaker psychometric performance of then-test/post-test data suggests psychometric flaws that may be the result of implicit theory of change, social desirability, and recall bias.
Pope, Zachary; Lee, Jung Eun; Gao, Zan
2018-01-01
Objective: Although current evidence supports the use of virtual reality (VR) in the treatment of mental disorders, it is unknown whether VR exercise would be beneficial to mental health. This review synthesized literature concerning the effect of VR exercise on anxiety and depression among various populations. Methods: Ten electronic databases were searched for studies on this topic from January 2000 through October 2017. Studies were eligible if they: (1) were peer-reviewed; (2) were published in English; and (3) used quantitative measures in assessing anxiety- and depression-related outcomes. Results: A total of five empirical studies met the eligibility criteria. These studies included two randomized clinical trials, one controlled trial, and two cross-sectional studies. Four studies reported significant improvements in anxiety- and depression-related measures following VR exercise, including reduced tiredness and tension, in addition to increased energy and enjoyment. Nonetheless, one study failed to support the effectiveness of VR exercise over traditional exercise alone on depressive symptoms. Conclusions: Findings favor VR exercise in alleviating anxiety and depression symptomatology. However, existing evidence is insufficient to support the advantages of VR exercise as a standalone treatment over traditional therapy in the alleviation of anxiety and depression, given the paucity of studies, small sample sizes, and lack of high-quality research designs. Future studies may build upon these limitations to discern the optimal manner by which to employ VR exercise in clinical settings. PMID:29510528
Noguchi, Yoshinori; Matsui, Kunihiko; Imura, Hiroshi; Kiyota, Masatomo; Fukui, Tsuguya
2004-05-01
Medical students and novice residents often have difficulty ruling out diseases even when those diseases are quite unlikely, and because of this difficulty they unnecessarily repeat laboratory or imaging tests. The aim of this study was to explore whether a carefully designed short training course teaching Bayesian probabilistic thinking improves the diagnostic ability of medical students. Ninety students at 2 medical schools were presented with clinical scenarios of coronary artery disease corresponding to high, low, and intermediate pretest probabilities. The students' estimates of the test characteristics of the exercise stress test, and of the pretest and posttest probability for each scenario, were evaluated before and after the short course. The students' pretest probability estimates, as well as their proficiency in applying Bayes's theorem, improved in the high pretest probability scenario after the short course. However, estimates of pretest probability in the low pretest probability scenario, and proficiency in applying Bayes's theorem in the intermediate and low pretest probability scenarios, showed essentially no improvement. A carefully designed, but traditionally administered, short course could not improve the students' ability to estimate pretest probability in a low pretest probability setting, and consequently students remained unable to rule out disease. We need to develop educational methods that cultivate a well-balanced clinical sense, enabling students to choose a suitable diagnostic strategy as needed in a clinical setting without being biased toward the "rule-in conscious paradigm."
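The Bayesian updating this abstract describes can be sketched in a few lines. This is an illustrative implementation of the standard odds-likelihood form of Bayes's theorem, not code from the study; the sensitivity and specificity values in the usage example are hypothetical placeholders, not the study's estimates for the exercise stress test.

```python
def posttest_probability(pretest_p, sensitivity, specificity, positive=True):
    """Update a pretest probability with a test result via Bayes's theorem.

    Uses the odds-likelihood form: posttest odds = pretest odds * likelihood ratio.
    """
    if positive:
        lr = sensitivity / (1.0 - specificity)   # positive likelihood ratio (LR+)
    else:
        lr = (1.0 - sensitivity) / specificity   # negative likelihood ratio (LR-)
    pre_odds = pretest_p / (1.0 - pretest_p)     # convert probability to odds
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)         # convert odds back to probability

# Hypothetical example: pretest probability 0.90, assumed sensitivity 0.68
# and specificity 0.77 for the test; a negative result lowers the probability
# but may still leave the disease far from ruled out.
p_after_negative = posttest_probability(0.90, 0.68, 0.77, positive=False)
```

The low-pretest-probability case the study highlights works the same way: with a small pretest probability, even a positive result often yields only a modest posttest probability, which is exactly the intuition the course tried to instill.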
48 CFR 2415.304 - Evaluation factors.
Code of Federal Regulations, 2010 CFR
2010-10-01
... DEVELOPMENT CONTRACTING METHODS AND CONTRACTING TYPES CONTRACTING BY NEGOTIATION Source Selection 2415.304... assigned a numerical weight (except for pass-fail factors) which shall appear in the RFP. When using LPTA, each evaluation factor is applied on a “pass-fail” basis; numerical scores are not assigned. “Pass-fail...
Productive Failure in STEM Education
ERIC Educational Resources Information Center
Trueman, Rebecca J.
2014-01-01
Science education is criticized because it often fails to support problem-solving skills in students. Instead, the instructional methods primarily emphasize didactic models that fail to engage students and reveal how the material can be applied to solve real problems. To overcome these limitations, this study asked participants in a general…
Detection, Isolation, and Identification of Vibrio cholerae from the Environment
Huq, Anwar; Haley, Bradd J.; Taviani, Elisa; Chen, Arlene; Hasan, Nur A.; Colwell, Rita R.
2012-01-01
Recent molecular advances in microbiology have greatly improved the detection of bacterial pathogens in the environment. Improvement and a downward trend in the cost of molecular detection methods have contributed to increased frequency of detection of pathogenic microorganisms where traditional culture-based detection methods have failed. Culture methods also have been greatly improved, and the confluence of the two suites of methods provides a powerful tool for detection, isolation, and characterization of pathogens. While molecular detection provides data on the presence and type of pathogens, culturing methods allow a researcher to preserve the organism of interest for "–omics" studies, such as genomic, metabolomic, secretomic, and transcriptomic analysis, which are rapidly becoming more affordable. This has yielded a clearer understanding of the ecology and epidemiology of microorganisms that cause disease. Specifically, important advances have been made over the past several years in the isolation, detection, and identification of Vibrio cholerae, the causative agent of cholera in humans. In this unit, we present commonly accepted methods for isolation, detection, and characterization of V. cholerae, providing more extensive knowledge of the ecology and epidemiology of this organism. This unit has been fully revised and updated from the earlier unit (Huq, Grim et al. 2006) with the latest knowledge and additional information not previously included. We have also taken into account the cost of reagents and equipment, which may be prohibitive for many researchers, and have therefore included protocols for all laboratories, including those with limited resources, likely to be located in regions of cholera endemicity. PMID:22875567
NASA Astrophysics Data System (ADS)
Nithiananthan, S.; Uneri, A.; Schafer, S.; Mirota, D.; Otake, Y.; Stayman, J. W.; Zbijewski, W.; Khanna, A. J.; Reh, D. D.; Gallia, G. L.; Siewerdsen, J. H.
2013-03-01
Fast, accurate, deformable image registration is an important aspect of image-guided interventions. Among the factors that can confound registration is the presence of additional material in the intraoperative image - e.g., a contrast bolus or a surgical implant - that was not present in the prior image. Existing deformable registration methods generally fail to account for tissue excised between image acquisitions: they typically simply "move" voxels within the images, with no ability to account for tissue that is removed or introduced between scans. We present a variant of the Demons algorithm to accommodate such content mismatch. The approach combines segmentation of mismatched content with deformable registration featuring an extra pseudo-spatial dimension representing a reservoir from which material can be drawn into the registered image. Previous work tested the registration method in the presence of tissue excision ("missing tissue"). The current paper tests the method in the presence of additional material in the target image and presents a general method by which either missing or additional material can be accommodated. The method was tested in phantom studies, simulations, and cadaver models in the context of intraoperative cone-beam CT with three examples of content mismatch: a variable-diameter bolus (contrast injection), a surgical device (rod), and additional material (bone cement). Registration accuracy was assessed in terms of difference images and normalized cross correlation (NCC). We identify the difficulties that traditional registration algorithms encounter when faced with content mismatch and evaluate the ability of the proposed method to overcome these challenges.
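The normalized cross correlation (NCC) metric used above for registration accuracy can be computed directly. This is a minimal sketch of the standard zero-mean NCC between two equally shaped images, not the authors' evaluation code; it assumes NumPy is available.

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross correlation between two images of equal shape.

    Returns a value in [-1, 1]; 1 indicates perfect linear agreement,
    which is why NCC serves as a registration-accuracy surrogate.
    """
    a = np.asarray(a, dtype=float).ravel()
    b = np.asarray(b, dtype=float).ravel()
    a = a - a.mean()                 # remove mean intensity (illumination offset)
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom)
```

A registered image pair with high NCC near 1 indicates good intensity agreement, while content mismatch (e.g., bone cement present in only one image) depresses the score, which is the failure mode the proposed Demons variant targets.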
1983-10-01
12 FEB 83 2942* NR 4 SW PUMP DRAWS EXCESSIVE CURRENT
26 FEB 83 2973 NR 3 SW PUMP INOP
08 MAR 83 3021* NR 1 SW PUMP PRESSURE SWITCH FAILED
29 MAR 83 ... VLV FAILED
16 JUN 83 PHM 2 0392 NR 2 SW PUMP LOW PRESSURE SWITCH INOP
2/18/83 0452 ALL SW PUMPS HAVE INACCESSIBLE ZINCS
27 MAR 83 0462 NR 2 SW PUMP LOW PRESSURE SWITCH INOP
29 MAR 83 0512 NR 3 SW PUMP CHECK VLV FAILED
28 APR 83 06 NR 2 SW PUMP SEAL FAILED
30 JUN 83 675* NR 1 SW PUMP
A Complex-Valued Firing-Rate Model That Approximates the Dynamics of Spiking Networks
Schaffer, Evan S.; Ostojic, Srdjan; Abbott, L. F.
2013-01-01
Firing-rate models provide an attractive approach for studying large neural networks because they can be simulated rapidly and are amenable to mathematical analysis. Traditional firing-rate models assume a simple form in which the dynamics are governed by a single time constant. These models fail to replicate certain dynamic features of populations of spiking neurons, especially those involving synchronization. We present a complex-valued firing-rate model derived from an eigenfunction expansion of the Fokker-Planck equation and apply it to the linear, quadratic and exponential integrate-and-fire models. Despite being almost as simple as a traditional firing-rate description, this model can reproduce firing-rate dynamics due to partial synchronization of the action potentials in a spiking model, and it successfully predicts the transition to spike synchronization in networks of coupled excitatory and inhibitory neurons. PMID:24204236
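The "traditional" single-time-constant firing-rate model that the abstract contrasts against can be sketched as a one-line ODE, tau dr/dt = -r + f(I). This is an illustrative Euler integration of that textbook model, not the complex-valued model the paper derives; the tanh transfer function and parameter values are arbitrary choices for the sketch.

```python
import numpy as np

def simulate_rate(I, tau=0.02, dt=0.001, f=np.tanh):
    """Euler-integrate the traditional rate model tau * dr/dt = -r + f(I(t)).

    I  : array of input values, one per time step
    tau: single time constant (s) governing all dynamics, the key limitation
         the complex-valued model is designed to overcome
    """
    r = np.zeros(len(I))
    for t in range(1, len(I)):
        r[t] = r[t - 1] + (dt / tau) * (-r[t - 1] + f(I[t - 1]))
    return r

# With constant input, the rate relaxes exponentially to f(I) with time
# constant tau -- it cannot express the oscillatory, partially synchronized
# transients that populations of spiking neurons exhibit.
rates = simulate_rate(np.full(2000, 0.5))
```

Because every trajectory of this model is a simple exponential approach to its fixed point, phenomena such as spike synchronization fall outside its reach, motivating the eigenfunction-expansion approach described above.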
Access to health care and equal protection of the law: the need for a new heightened scrutiny.
Mariner, W K
1986-01-01
Proposals to reduce national expenditures for health care under Medicare and other programs raise questions about the limits on legislative power to distribute health care benefits. The constitutional guarantee of equal protection has been a weak source of protection for the sick, largely because they fail to qualify for special scrutiny under traditional equal protection analysis. Recent decisions of the United States Supreme Court suggest that the Justices seek a newer, more flexible approach to reviewing claims of unequal protection. This Article examines the application of the equal protection guarantee to health-related claims. It argues that traditional equal protection analysis is too rigid and newer rationality review too imprecise to provide just eligibility determinations. The Article concludes that courts should subject claims of unequal protection in the health care context to heightened scrutiny, as health care plays a special role in assuring equality of opportunity.
The behavioral economics of health and health care.
Rice, Thomas
2013-01-01
People often make decisions in health care that are not in their best interest, ranging from failing to enroll in health insurance to which they are entitled, to engaging in extremely harmful behaviors. Traditional economic theory provides a limited tool kit for improving behavior because it assumes that people make decisions in a rational way, have the mental capacity to deal with huge amounts of information and choice, and have tastes endemic to them and not open to manipulation. Melding economics with psychology, behavioral economics acknowledges that people often do not act rationally in the economic sense. It therefore offers a potentially richer set of tools than provided by traditional economic theory to understand and influence behaviors. Only recently, however, has it been applied to health care. This article provides an overview of behavioral economics, reviews some of its contributions, and shows how it can be used in health care to improve people's decisions and health.