Switching between simple cognitive tasks: the interaction of top-down and bottom-up factors
NASA Technical Reports Server (NTRS)
Ruthruff, E.; Remington, R. W.; Johnston, J. C.
2001-01-01
How do top-down factors (e.g., task expectancy) and bottom-up factors (e.g., task recency) interact to produce an overall level of task readiness? This question was addressed by factorially manipulating task expectancy and task repetition in a task-switching paradigm. The effects of expectancy and repetition on response time tended to interact underadditively, but only because the traditional binary task-repetition variable lumps together all switch trials, ignoring variation in task lag. When the task-recency variable was scaled continuously, all 4 experiments instead showed additivity between expectancy and recency. The results indicated that expectancy and recency influence different stages of mental processing. One specific possibility (the configuration-execution model) is that task expectancy affects the time required to configure upcoming central operations, whereas task recency affects the time required to actually execute those central operations.
18 CFR 1301.5 - Timing of responses to requests.
Code of Federal Regulations, 2010 CFR
2010-04-01
... expect that the decision on disclosure will be as time consuming as for requests in Track 3. (3) Track 3. Requests which require a decision or input from another office or agency, extensive submitter notifications... expected to pose an imminent threat to the life or physical safety of an individual; (ii) An urgency to...
Ten Tips for Using Co-Planning Time More Efficiently
ERIC Educational Resources Information Center
Murawski, Wendy W.
2012-01-01
In this era of collaboration, educators are frequently expected to co-plan with one another on a regular basis. Unfortunately, the expectation of co-planning is not often accompanied by the time required or by the strategies necessary to plan effectively and efficiently for the inclusive classroom. This article provides 10 concrete tips for…
ERIC Educational Resources Information Center
Johnson, Gerald
2016-01-01
With the increase in technology in all facets of our lives and work, there is an ever increasing set of expectations that people have regarding information availability, response time, and dependability. While expectations are affected by gender, age, experience, industry, and other factors, people have expectations of technology, and from…
M ≥ 7.0 earthquake recurrence on the San Andreas fault from a stress renewal model
Parsons, Thomas E.
2006-01-01
Forecasting M ≥ 7.0 San Andreas fault earthquakes requires an assessment of their expected frequency. I used a three-dimensional finite element model of California to calculate volumetric static stress drops from scenario M ≥ 7.0 earthquakes on three San Andreas fault sections. The ratio of stress drop to tectonic stressing rate derived from geodetic displacements yielded recovery times at points throughout the model volume. Under a renewal model, stress recovery times on ruptured fault planes can be a proxy for earthquake recurrence. I show curves of magnitude versus stress recovery time for three San Andreas fault sections. When stress recovery times were converted to expected M ≥ 7.0 earthquake frequencies, they fit Gutenberg-Richter relationships well matched to observed regional rates of M ≤ 6.0 earthquakes. Thus a stress-balanced model permits large earthquake Gutenberg-Richter behavior on an individual fault segment, though it does not require it. Modeled slip magnitudes and their expected frequencies were consistent with those observed at the Wrightwood paleoseismic site if strict time predictability does not apply to the San Andreas fault.
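A compact way to express the two relationships summarized above (the notation is generic, not the paper's):

```latex
% Stress recovery time as a renewal-model proxy for recurrence, and the
% Gutenberg-Richter form used to compare expected and observed rates.
% \Delta\sigma, \dot{\sigma}, a, and b are generic placeholders.
\[
  T_r(M) \;\approx\; \frac{\Delta\sigma(M)}{\dot{\sigma}}, \qquad
  \log_{10} N(\geq M) \;=\; a - b\,M .
\]
```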
21 CFR 820.180 - General requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... DEVICES QUALITY SYSTEM REGULATION Records § 820.180 General requirements. All records required by this... retained for a period of time equivalent to the design and expected life of the device, but in no case less.... This section does not apply to the reports required by § 820.20(c) Management review, § 820.22 Quality...
21 CFR 820.180 - General requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... DEVICES QUALITY SYSTEM REGULATION Records § 820.180 General requirements. All records required by this... retained for a period of time equivalent to the design and expected life of the device, but in no case less.... This section does not apply to the reports required by § 820.20(c) Management review, § 820.22 Quality...
21 CFR 820.180 - General requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... DEVICES QUALITY SYSTEM REGULATION Records § 820.180 General requirements. All records required by this... retained for a period of time equivalent to the design and expected life of the device, but in no case less.... This section does not apply to the reports required by § 820.20(c) Management review, § 820.22 Quality...
21 CFR 820.180 - General requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... DEVICES QUALITY SYSTEM REGULATION Records § 820.180 General requirements. All records required by this... retained for a period of time equivalent to the design and expected life of the device, but in no case less.... This section does not apply to the reports required by § 820.20(c) Management review, § 820.22 Quality...
21 CFR 820.180 - General requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... DEVICES QUALITY SYSTEM REGULATION Records § 820.180 General requirements. All records required by this... retained for a period of time equivalent to the design and expected life of the device, but in no case less.... This section does not apply to the reports required by § 820.20(c) Management review, § 820.22 Quality...
Solving Rational Expectations Models Using Excel
ERIC Educational Resources Information Center
Strulik, Holger
2004-01-01
Simple problems of discrete-time optimal control can be solved using standard spreadsheet software. The solution method employed, backward iteration, is intuitively understandable, does not require any programming skills, and is easy to implement, so that it is suitable for classroom exercises with rational-expectations models. The author…
Sharma, Jitendra; Sugihara, Hiroki; Katz, Yarden; Schummers, James; Tenenbaum, Joshua; Sur, Mriganka
2015-01-01
The brain uses attention and expectation as flexible devices for optimizing behavioral responses associated with expected but unpredictably timed events. The neural bases of attention and expectation are thought to engage higher cognitive loci; however, their influence at the level of primary visual cortex (V1) remains unknown. Here, we asked whether single-neuron responses in monkey V1 were influenced by an attention task of unpredictable duration. Monkeys covertly attended to a spot that remained unchanged for a fixed period and then abruptly disappeared at variable times, prompting a lever release for reward. We show that monkeys responded progressively faster and performed better as the trial duration increased. Neural responses also followed the monkeys' task engagement—there was an early, but short-duration, response facilitation, followed by a late but sustained increase during the time the monkeys expected the attention spot to disappear. This late attentional modulation was significantly and negatively correlated with the reaction time and was well explained by a modified hazard function. Such bimodal, time-dependent changes were, however, absent in a task that did not require explicit attentional engagement. Thus, V1 neurons carry reliable signals of attention and temporal expectation that correlate with predictable influences on monkeys' behavioral responses. PMID:24836689
A requirement for memory retrieval during and after long-term extinction learning
Ouyang, Ming; Thomas, Steven A.
2005-01-01
Current learning theories are based on the idea that learning is driven by the difference between expectations and experience (the delta rule). In extinction, one learns that certain expectations no longer apply. Here, we test the potential validity of the delta rule by manipulating memory retrieval (and thus expectations) during extinction learning. Adrenergic signaling is critical for the time-limited retrieval (but not acquisition or consolidation) of contextual fear. Using genetic and pharmacologic approaches to manipulate adrenergic signaling, we find that long-term extinction requires memory retrieval but not conditioned responding. Identical manipulations of the adrenergic system that do not affect memory retrieval do not alter extinction. The results provide substantial support for the delta rule of learning theory. In addition, the timing over which extinction is sensitive to adrenergic manipulation suggests a model whereby memory retrieval occurs during, and several hours after, extinction learning to consolidate long-term extinction memory. PMID:15947076
Tactical Miniature Crystal Oscillator.
1980-08-01
manufactured by this process are expected to require 30 days to achieve minimum aging rates. (4) FUNDAMENTAL CRYSTAL RETRACE MEASUREMENT. An important crystal...considerable measurement time to detect differences and characterize components. Before investing considerable time in a candidate reactive element, a
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-21
...The EPA is proposing to amend specific provisions of the Greenhouse Gas Reporting Rule to provide greater clarity and flexibility to facilities subject to reporting emissions from certain source categories. These source categories will report greenhouse gas (GHG) data for the first time in September of 2012. The proposed changes are not expected to significantly change the overall calculation and monitoring requirements of the Greenhouse Gas Reporting Rule or add additional requirements for reporters, but are expected to correct errors and clarify existing requirements in order to facilitate accurate and timely reporting. The EPA is also proposing confidentiality determinations for four new data elements for the fluorinated gas production source category of the Greenhouse Gas Reporting Rule. Lastly, we are proposing an amendment to Table A-7 of the general provisions to add a data element used as an input to an emission equation in the fluorinated gas production source category.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Returns Or Statements § 1.6011-4 Requirement of statement disclosing participation in certain transactions... who is required to file a tax return must file within the time prescribed in paragraph (e) of this... transaction includes all of the factual elements relevant to the expected tax treatment of any investment...
ERIC Educational Resources Information Center
O'Brien-Moran, Michael; Soiferman, L. Karen
2010-01-01
This study involved a one-time survey of first-year undergraduate students at a Canadian University to determine their expectations when beginning a writing intensive course (i.e., the so-called "W" course, which is required of all first-year undergraduates at the University of Manitoba.) In this study, we focused on the University's…
Endometrial ablation: normal appearance and complications.
Drylewicz, Monica R; Robinson, Kathryn; Siegel, Cary Lynn
2018-03-14
Global endometrial ablation is a commonly performed, minimally invasive technique aimed at improving/resolving abnormal uterine bleeding and menorrhagia in women. As non-resectoscopic techniques have come into existence, endometrial ablation performance continues to increase due to accessibility and decreased requirements for operating room time and advanced technical training. The increased utilization of this method translates into increased imaging of patients who have undergone the procedure. An understanding of the expected imaging appearances of endometrial ablation using different modalities is important for the abdominal radiologist. In addition, the frequent usage of the technique naturally comes with complications requiring appropriate imaging work-up. We review the expected appearance of the post-endometrial ablated uterus on multiple imaging modalities and demonstrate the more common and rare complications seen in the immediate post-procedural time period and remotely.
ERIC Educational Resources Information Center
Skilbeck, Malcolm; Connell, Helen
2004-01-01
During the next decade, the teaching profession in Australia will be transformed. Due mainly to age related retirements there will be a massive turnover and a huge influx of new entrants. At the same time, it can be expected that there will be more exacting requirements and expectations of teachers as new professional standards are set to meet the…
Technology Directions for the 21st Century. Volume 4
NASA Technical Reports Server (NTRS)
Crimi, Giles; Verheggen, Henry; Botta, Robert; Paul, Heywood; Vuong, Xuyen
1998-01-01
Data compression is an important tool for reducing the bandwidth of communications systems, and thus for reducing the size, weight, and power of spacecraft systems. For data requiring lossless transmissions, including most science data from spacecraft sensors, small compression factors of two to three may be expected. Little improvement can be expected over time. For data that is suitable for lossy compression, such as video data streams, much higher compression factors can be expected, such as 100 or more. More progress can be expected in this branch of the field, since there is more hidden redundancy and many more ways to exploit that redundancy.
Code of Federal Regulations, 2010 CFR
2010-01-01
... narrative shall address the overall approach, time periods, and expected internal and external uses of the forecast. Examples of internal uses include providing information for developing or monitoring demand side... suppliers. Examples of external uses include meeting state and Federal regulatory requirements, obtaining...
Goyette, Kimberly A
2008-06-01
The educational expectations of 10th-graders have dramatically increased from 1980 to 2002. Their rise is attributable in part to the changing educational composition of students' parents and related to the educational profiles of their expected occupations. Students whose parents have gone to college are more likely to attend college themselves, and students expect occupations that are more prestigious in 2002 than in 1980. The educational requirements of particular occupation categories have risen only slightly. These analyses also reveal that educational expectations in recent cohorts are more loosely linked to social background and occupational plans than they were in 1980. The declining importance of parents' background and the decoupling of educational and occupational plans, in addition to a strong and significant effect of cohort on educational expectations, suggest that the expectation of four-year college attainment is indeed becoming the norm.
Yang, Brian W; Iorio, Matthew L; Day, Charles S
2017-03-15
The 2 main routes of medical device approval through the U.S. Food and Drug Administration are the premarket approval (PMA) process, which requires clinical trials, and the 510(k) premarket notification, which exempts devices from clinical trials if they are substantially equivalent to an existing device. Recently, there has been growing concern regarding the safety of devices approved through the 510(k) premarket notification. The PMA process decreases the potential for device recall; however, it is substantially more costly and time-consuming. Investors and medical device companies are only willing to invest in devices if they can expect to recoup their investment within a timeline of roughly 7 years. Our study utilizes financial modeling to assess the financial feasibility of approving various orthopaedic medical devices through the 510(k) and PMA processes. The expected time to recoup investment through the 510(k) process ranged from 0.585 years to 7.715 years, with an average time of 2.4 years; the expected time to recoup investment through the PMA route ranged from 2.9 years to 24.5 years, with an average time of 8.5 years. Six of the 13 orthopaedic device systems that we analyzed would require longer than our 7-year benchmark to recoup the investment costs of the PMA process. With the 510(k) premarket notification, only 1 device system would take longer than 7 years to recoup its investment costs. Although the 510(k) premarket notification has demonstrated safety concerns, broad requirements for PMA authorization may limit device innovation for less-prevalent orthopaedic conditions. As a result, new approval frameworks may be beneficial. Our report demonstrates how current regulatory policies can potentially influence orthopaedic device innovation.
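A minimal payback-period sketch of the feasibility comparison described above; the cost and revenue figures are hypothetical placeholders, not values from the study:

```python
# Minimal payback-period sketch of the feasibility comparison described above.
# The cost and revenue figures are hypothetical placeholders, not values from
# the study, and the model ignores taxes, ramp-up, and market risk.

def years_to_recoup(upfront_cost, annual_net_revenue, discount_rate=0.05):
    """Years until cumulative discounted net revenue covers the upfront cost."""
    cumulative, year = 0.0, 0
    while cumulative < upfront_cost and year < 100:
        year += 1
        cumulative += annual_net_revenue / (1.0 + discount_rate) ** year
    return year

# A costlier PMA-style pathway takes longer to recoup than a 510(k)-style one,
# all else equal (illustrative numbers only).
print(years_to_recoup(upfront_cost=60e6, annual_net_revenue=12e6))   # ~6 years
print(years_to_recoup(upfront_cost=15e6, annual_net_revenue=12e6))   # ~2 years
```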
Faculty approaches to combating professional burnout.
Neidle, E A
1984-02-01
The peculiar stresses of the dental educator make him or her a prime candidate for burnout and at the same time offer rather special protection against this phenomenon. The dental teacher, especially the clinical teacher, is required to spend virtually all of his time in intimate contact with students, whom he instructs, and with patients in the clinic, for whom he has responsibility. In addition, this same dental educator will probably have some kind of private practice. He will also be required, if he expects to advance in academic rank, to do research, to be cognizant of the latest developments in his field, to publish, to give presentations to his peers and to the community. This adds up to a lot that is expected of one person. Many people have expectations of him, many people crowd in on him with their demands. The situation sounds ideal for burnout. Yet, I believe that if the dental educator does what is expected, if he laces this diet of teaching and patient contact with research and library work, if he sets aside time (you may ask where he is to find it) for contemplation, for good works in the community, for hobbies, for reading, for cultural activities, then in fact the chance of burnout seems lower. And finally, if the dental educator pursues the possibilities that exist for leaves, for time away, for refreshment of his career by new contacts, new ideas, new ways of doing things, and new commitments, he will push away and hold at bay the dangers of burnout.
Extended space expectation values in quantum dynamical system evolutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demiralp, Metin
2014-10-06
The time-variant power series expansion for the expectation value of a given quantum dynamical operator is a well-known and well-investigated issue in quantum dynamics. However, depending on the operator and Hamiltonian singularities, this expansion either may not exist or may not converge for all time instances except the beginning of the evolution. This work focuses on this issue and seeks certain cures for the negativities. We work in the extended space obtained by adding all images of the initial wave function under the system Hamiltonian’s positive integer powers. This requires the introduction of certain appropriately defined weight operators. The resulting better convergence in the temporal power series urges us to call the newly defined entities “extended space expectation values” even though they are constructed over certain weight operators and are somehow pseudo expectation values.
Measuring fecundity with standardised estimates of expected pregnancies.
Mikolajczyk, Rafael T; Stanford, Joseph B
2006-11-01
Approaches to measuring fecundity include the assessment of time to pregnancy and day-specific probabilities of conception (daily fecundities) indexed to a day of ovulation. In this paper, we develop an additional approach of calculating expected pregnancies based on daily fecundities indexed to the last day of the menstrual cycle. Expected pregnancies can thus be calculated while controlling for frequency and timing of coitus. Comparing observed pregnancies with expected pregnancies allows for a standardised comparison of fecundity between studies or groups within studies, and can be used to assess the effects of categorical covariates on the woman or couple level, and also on the cycle level. This can be accomplished in a minimal data set that does not necessarily require hormonal measurement or the explicit identification of ovulation. We demonstrate this approach by examining the effects of age and parity on fecundity in a data set from women monitoring their fertility cycles with the Creighton Model FertilityCare System.
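A minimal sketch of the expected-pregnancies calculation described above, assuming independence of day-specific conception probabilities; the probabilities and coitus records below are made-up placeholders, not estimates from the paper:

```python
# Minimal sketch: expected pregnancies from day-specific fecundabilities and
# recorded coitus, with days indexed back from the last day of the cycle.
# All numbers below are hypothetical placeholders.

def expected_pregnancy_probability(day_probs, coitus_days):
    """P(conception in one cycle), assuming independence across days with coitus:
    1 - prod(1 - p_d) over recorded intercourse days d."""
    prob_no_conception = 1.0
    for d in coitus_days:
        prob_no_conception *= 1.0 - day_probs.get(d, 0.0)
    return 1.0 - prob_no_conception

# key = days before the last day of the cycle (hypothetical fecundabilities)
day_probs = {17: 0.10, 16: 0.15, 15: 0.20, 14: 0.15, 13: 0.05}

cycles = [[17, 15], [20, 14, 13], [16]]          # coitus days recorded per cycle
expected = sum(expected_pregnancy_probability(day_probs, c) for c in cycles)
print(f"Expected pregnancies over {len(cycles)} cycles: {expected:.2f}")
# Comparing observed pregnancies with this expectation gives a standardized
# fecundity ratio for a group, as described in the abstract.
```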
Designing a multi-petabyte database for LSST
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becla, J; Hanushevsky, A
2005-12-21
The 3.2 giga-pixel LSST camera will produce over half a petabyte of raw images every month. This data needs to be reduced in under a minute to produce real-time transient alerts, and then cataloged and indexed to allow efficient access and simplify further analysis. The indexed catalogs alone are expected to grow at a speed of about 600 terabytes per year. The sheer volume of data, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require cutting-edge techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they will scale and perform at these data volumes in anticipated LSST access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, and the database architecture that is expected to be adopted in order to meet the data challenges.
Light-weight Parallel Python Tools for Earth System Modeling Workflows
NASA Astrophysics Data System (ADS)
Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.
2015-12-01
With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x to an expected 25PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the required download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) perform the transformation of the data from time-slice to time-series format and to (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these two challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second of these challenges, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be light-weight, be easy to install, have very few dependencies, and can be easily inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two light-weight parallel Python tools, as well as our plans for future research and development.
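A minimal serial sketch of the two operations the abstract describes (slice-to-series reorganization and time-averaging); real tools would parallelize over variables and read netCDF files, and the variable names here are hypothetical:

```python
# Minimal serial sketch of (1) reorganizing time-slice output (one record per
# time step, all variables) into time-series arrays (one array per variable,
# all time steps), and (2) a simple monthly climatology. In-memory dicts stand
# in for model output files.
import numpy as np

def slices_to_series(time_slices):
    """time_slices: list of {var_name: 2-D field} dicts, one per time step.
    Returns {var_name: 3-D array with a leading time axis}."""
    variables = time_slices[0].keys()
    return {v: np.stack([ts[v] for ts in time_slices]) for v in variables}

def climatology(series, months_per_year=12):
    """Mean over years for each month of the year (monthly climatology)."""
    nt = series.shape[0] - series.shape[0] % months_per_year
    return series[:nt].reshape(-1, months_per_year, *series.shape[1:]).mean(axis=0)

# Example with fake monthly output: 24 time steps of two 3x3 fields.
slices = [{"TS": np.random.rand(3, 3), "PRECT": np.random.rand(3, 3)} for _ in range(24)]
series = slices_to_series(slices)
print(climatology(series["TS"]).shape)   # -> (12, 3, 3)
```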
47 CFR 74.1263 - Time of operation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... FM Broadcast Booster Stations § 74.1263 Time of operation. (a) The licensee of an FM translator or booster station is not required to adhere to any regular schedule of operation. However, the licensee of an FM translator or booster station is expected to provide a dependable service to the extent that...
47 CFR 74.1263 - Time of operation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... FM Broadcast Booster Stations § 74.1263 Time of operation. (a) The licensee of an FM translator or booster station is not required to adhere to any regular schedule of operation. However, the licensee of an FM translator or booster station is expected to provide a dependable service to the extent that...
47 CFR 74.1263 - Time of operation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... FM Broadcast Booster Stations § 74.1263 Time of operation. (a) The licensee of an FM translator or booster station is not required to adhere to any regular schedule of operation. However, the licensee of an FM translator or booster station is expected to provide a dependable service to the extent that...
47 CFR 74.1263 - Time of operation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... FM Broadcast Booster Stations § 74.1263 Time of operation. (a) The licensee of an FM translator or booster station is not required to adhere to any regular schedule of operation. However, the licensee of an FM translator or booster station is expected to provide a dependable service to the extent that...
34 CFR 350.5 - What definitions apply?
Code of Federal Regulations, 2011 CFR
2011-07-01
... expected to require multiple vocational rehabilitation services over an extended period of time; and (iii..., hemiplegia, hemophilia, respiratory or pulmonary dysfunction, mental retardation, mental illness, multiple sclerosis, muscular dystrophy, musculoskeletal disorders, neurological disorders (including stroke and...
34 CFR 350.5 - What definitions apply?
Code of Federal Regulations, 2012 CFR
2012-07-01
... expected to require multiple vocational rehabilitation services over an extended period of time; and (iii..., hemiplegia, hemophilia, respiratory or pulmonary dysfunction, mental retardation, mental illness, multiple sclerosis, muscular dystrophy, musculoskeletal disorders, neurological disorders (including stroke and...
Positioning navigation and timing service applications in cyber physical systems
NASA Astrophysics Data System (ADS)
Qu, Yi; Wu, Xiaojing; Zeng, Lingchuan
2017-10-01
The positioning navigation and timing (PNT) architecture was discussed in detail: its history, evolution, current status, and future plans were presented; main technologies were listed; advantages and limitations of most technologies were compared; novel approaches were introduced; and future capabilities were sketched. The concept of the cyber-physical system (CPS) was described and its primary features were interpreted. Then the three-layer architecture of CPS was illustrated. Next, CPS requirements on PNT services were analyzed, including requirements on position reference and time reference, requirements on temporal-spatial error monitoring, and requirements on dynamic services, real-time services, autonomous services, security services, and standard services. Finally, challenges faced by PNT applications in CPS were summarized. The conclusions are expected to facilitate PNT applications in CPS and, furthermore, to provide references for the design and implementation of both architectures.
Loss Control and Collimation for the LHC
NASA Astrophysics Data System (ADS)
Burkhardt, H.
2005-06-01
The total energy stored in the LHC is expected to reach 360 megajoules, which is about two orders of magnitude higher than in HERA or the Tevatron. Damage and quench protection in the LHC require a highly efficient and at the same time very robust collimation system. The currently planned system, the status of the project, and the expected performance of the collimation system from injection up to operation with colliding beams will be presented.
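A back-of-the-envelope check of the 360 MJ figure using commonly quoted nominal LHC beam parameters (stated here as assumptions, not taken from this paper):

```python
# Back-of-the-envelope check of the stored-beam-energy figure quoted above,
# using nominal design parameters (assumptions, not values from this paper).
protons_per_bunch = 1.15e11
bunches = 2808
energy_per_proton_J = 7e12 * 1.602e-19   # 7 TeV expressed in joules
stored_energy_MJ = protons_per_bunch * bunches * energy_per_proton_J / 1e6
print(f"{stored_energy_MJ:.0f} MJ per beam")   # ~360 MJ
```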
Patient expectations from an emergency medical service.
Qidwai, Waris; Ali, Syed Sohail; Baqir, Muhammad; Ayub, Semi
2005-01-01
A patient expectation survey at an Emergency Medical Services department can improve patient satisfaction. A need was established to conduct such a survey in order to recommend its use as a quality improvement tool. The study was conducted on patients visiting the Emergency Medical Services, Aga Khan University, Karachi. A questionnaire was used to collect information on the demographic profile and expectations of patients. The ethical requirements for conducting the study were met. A hundred patients were surveyed. The majority were relatively young, married men and women who were well educated and of higher socio-economic status. The majority of the patients expected a waiting time and a consultation time of less than 30 minutes and 20 minutes, respectively. The majority of respondents expected and agreed to be examined by a trainee but were reluctant to be examined by students. There was an expectation that the consultant would examine patients rather than advise the attending team over the phone. The majority of the patients expected intravenous fluid therapy. There was a desire to have a patient attendant present during the consultation process. The majority of the patients expected to pay less than three thousand rupees for the visit. An expectation exists for investigations and hospitalization. Involvement of patients in decisions concerning their treatment and written feedback on their visit were expected. We have documented the need for and value of a patient expectation survey at the Emergency Medical Services department. The use of such a tool is recommended in order to improve the satisfaction levels of patients visiting such facilities.
Code of Federal Regulations, 2010 CFR
2010-10-01
... COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A ANADROMOUS FISHERIES CONSERVATION... in this section. (a) Secretary. The Secretary of Commerce, the Secretary of the Interior, or their..., expected results and benefits, approach, cost, location and time required for completion. (i) Project...
Pater, Mackenzie L; Rosenblatt, Noah J; Grabiner, Mark D
2015-01-01
Tripping during locomotion, the leading cause of falls in older adults, generally occurs without prior warning and often while performing a secondary task. Prior warning can alter the state of physiological preparedness and beneficially influence the response to the perturbation. Previous studies have examined how altering the initial "preparedness" for an upcoming perturbation can affect kinematic responses following small disturbances that did not require a stepping response to restore dynamic stability. The purpose of this study was to examine how expectation affected fall outcome and recovery response kinematics following a large, treadmill-delivered perturbation simulating a trip and requiring at least one recovery step to avoid a fall. Following the perturbation, 47% of subjects fell when they were not expecting the perturbation, whereas 12% fell when they were aware that the perturbation would occur "sometime in the next minute". The between-group differences were accompanied by slower reaction times in the non-expecting group (p < 0.01). Slower reaction times were associated with kinematics that have previously been shown to increase the likelihood of falling following a laboratory-induced trip. The results demonstrate the importance of considering the context under which recovery responses are assessed and, further, give insight into the context in which task-specific perturbation training is administered.
Embracing E-Books: Increasing Students' Motivation to Read and Write
ERIC Educational Resources Information Center
Siegle, Del
2012-01-01
In his keynote address at the "New York Times" Schools for Tomorrow 2011 Fall Conference, Dr. Larry Summers (2011) suggested that technology implementations have an unusual growth pattern. New technology innovations usually require more time to "catch on" than one might expect, but once they catch on, their use spreads more quickly than anyone can…
Creative self-efficacy development and creative performance over time.
Tierney, Pamela; Farmer, Steven M
2011-03-01
Building from an established framework of self-efficacy development, this study provides a longitudinal examination of the development of creative self-efficacy in an ongoing work context. Results show that increases in employee creative role identity and perceived creative expectation from supervisors over a 6-month time period were associated with enhanced sense of employee capacity for creative work. Contrary to what was expected, employees who experienced increased requirements for creativity in their jobs actually reported a decreased sense of efficaciousness for creative work. Results show that increases in creative self-efficacy corresponded with increases in creative performance as well.
Overprotection and lowered expectations of persons with disabilities: the unforeseen consequences.
Sanders, Karen Y
2006-01-01
Lowered expectations and overprotection of the individual with a disability can cause lowered self-esteem, which can result in a lifetime of underachievement and failure to reach their full potential. Both lowered expectations and overprotection are forms of discrimination. Internalization of discrimination causes the person with a disability to believe that they are less capable than a person without a disability. Parents and care providers of children with disabilities may overprotect the child to shield them from harm; however, this can actually cause more damage. Successful parenting skills are required to help children and adolescents develop a positive self-concept and high self-esteem. Guidelines have been developed to assist parents, educators, and other professionals regarding the effects of overprotection and lowered expectations.
Enabling fast charging - Vehicle considerations
NASA Astrophysics Data System (ADS)
Meintz, Andrew; Zhang, Jiucai; Vijayagopal, Ram; Kreutzer, Cory; Ahmed, Shabbir; Bloom, Ira; Burnham, Andrew; Carlson, Richard B.; Dias, Fernando; Dufek, Eric J.; Francfort, James; Hardy, Keith; Jansen, Andrew N.; Keyser, Matthew; Markel, Anthony; Michelbacher, Christopher; Mohanpurkar, Manish; Pesaran, Ahmad; Scoffield, Don; Shirk, Matthew; Stephens, Thomas; Tanim, Tanvir
2017-11-01
To achieve a successful increase in the plug-in battery electric vehicle (BEV) market, it is anticipated that a significant improvement in battery performance is required to increase the range that BEVs can travel and the rate at which they can be recharged. While the range that BEVs can travel on a single recharge is improving, the recharge rate is still much slower than the refueling rate of conventional internal combustion engine vehicles. To achieve comparable recharge times, we explore the vehicle considerations of charge rates of at least 400 kW. Faster recharge is expected to significantly mitigate the perceived deficiencies for long-distance transportation, to provide alternative charging in densely populated areas where overnight charging at home may not be possible, and to reduce range anxiety for travel within a city when unplanned charging may be required. This substantial increase in charging rate is expected to create technical issues in the design of the battery system and the vehicle's electrical architecture that must be resolved. This work focuses on vehicle system design and total recharge time to meet the goals of implementing improved charge rates and the impacts of these expected increases on system voltage and vehicle components.
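A back-of-the-envelope recharge-time calculation for the charge rates discussed above; the pack size and charging efficiency are illustrative assumptions, and real charge curves taper rather than holding constant power:

```python
# Back-of-the-envelope recharge-time arithmetic for the charge rates discussed
# above. The pack size and efficiency are illustrative assumptions, and a real
# charge curve tapers rather than holding constant power.
pack_kwh = 90.0        # hypothetical battery capacity
charge_kw = 400.0      # target charge rate from the abstract
efficiency = 0.95      # assumed charger-to-pack efficiency

hours = pack_kwh / (charge_kw * efficiency)
print(f"Idealized full recharge: {hours * 60:.0f} minutes")   # ~14 minutes
```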
Enabling fast charging – Vehicle considerations
Meintz, Andrew; Zhang, Jiucai; Vijayagopal, Ram; ...
2017-11-01
To achieve a successful increase in the plug-in battery electric vehicle (BEV) market, it is anticipated that a significant improvement in battery performance is required to improve the range that BEVs can travel. While the range that BEVs can travel on a single recharge is improving, the rate at which these vehicles can be recharged is still much slower than that of conventional internal combustion engine vehicles. To achieve comparable recharge times, we explore the vehicle considerations of charge rates up to 350 kW. This faster recharge is expected to significantly mitigate the perceived deficiencies for long-distance transportation, to provide alternative charging in densely populated areas where overnight charging at home may not be possible, and to reduce range anxiety for travel within a city when unplanned charging may be required. This substantial increase in the charging rate is expected to create technical issues in the design of the battery system and the vehicle electrical architecture that must be resolved. This work will focus on the battery system thermal design and total recharge time to meet the goals of implementing higher charge rates as well as the impacts of the expected increase in system voltage on the components of the vehicle.
DOT National Transportation Integrated Search
2017-06-01
This project developed a methodology to simulate and analyze roadway traffic patterns and expected penetration and timing of electric vehicles (EVs), with application directed toward the requirements for electric vehicle supply equipment (EVSE) si...
Using Sorties vs. Flying Hours to Predict Aircraft Spares Demand
1997-04-01
the war plans, the demand for aircraft spares was substantially less than expected. This expected demand was based on the standard U.S. Air Force...by some combination of them. The Air Force’s new war plans for tactical aircraft in the 1993 USAF War and Mobilization Plan, Volume 5 (WMP-5) have...to continue to use flying hours as the basis for predicting wartime demand from peacetime experience, the cost of the wartime spares requirement
Measuring the success of electronic medical record implementation using electronic and survey data.
Keshavjee, K.; Troyan, S.; Holbrook, A. M.; VanderMolen, D.
2001-01-01
Computerization of physician practices is increasing. Stakeholders are demanding demonstrated value for their Electronic Medical Record (EMR) implementations. We developed survey tools to measure medical office processes, including administrative and physician tasks pre- and post-EMR implementation. We included variables that were expected to improve with EMR implementation and those that were not expected to improve, as controls. We measured the same processes pre-EMR, at six months and 18 months post-EMR. Time required for most administrative tasks decreased within six months of EMR implementation. Staff time spent on charting increased with time, in keeping with our anecdotal observations that nurses were given more responsibility for charting in many offices. Physician time to chart increased initially by 50%, but went down to original levels by 18 months. However, this may be due to the drop-out of those physicians who had a difficult time charting electronically. PMID:11825201
Medical Grade Water Generation for Intravenous Fluid Production on Exploration Missions
NASA Technical Reports Server (NTRS)
Niederhaus, Charles E.; Barlow, Karen L.; Griffin, DeVon W.; Miller, Fletcher J.
2008-01-01
This document describes the intravenous (IV) fluid requirements for medical care during NASA's future Exploration-class missions. It further discusses potential methods for generating such fluids and the challenges associated with different fluid generation technologies. The current Exploration baseline mission profiles are introduced, potential medical conditions described and evaluated for fluidic needs, and operational issues assessed. Conclusions on the fluid volume requirements are presented, and the feasibility of various fluid generation options is discussed. A separate report will document a more complete trade study on the options to provide the required fluids. At the time this document was developed, NASA had not yet determined requirements for medical care during Exploration missions. As a result, this study was based on the current requirements for care onboard the International Space Station (ISS). While we expect that medical requirements will be different for Exploration missions, this document will provide a useful baseline not only for developing hardware to generate medical water for injection (WFI), but also as a foundation for meeting future requirements. As a final note, we expect WFI requirements for Exploration will be higher than for ISS care, and system capacity may well need to be higher than currently specified.
Zhang, Hang; Wu, Shih-Wei; Maloney, Laurence T.
2010-01-01
S.-W. Wu, M. F. Dal Martello, and L. T. Maloney (2009) evaluated subjects' performance in a visuo-motor task where subjects were asked to hit two targets in sequence within a fixed time limit. Hitting targets earned rewards and Wu et al. varied rewards associated with targets. They found that subjects failed to maximize expected gain; they failed to invest more time in the movement to the more valuable target. What could explain this lack of response to reward? We first considered the possibility that subjects require training in allocating time between two movements. In Experiment 1, we found that, after extensive training, subjects still failed: They did not vary time allocation with changes in payoff. However, their actual gains equaled or exceeded the expected gain of an ideal time allocator, indicating that constraining time itself has a cost for motor accuracy. In a second experiment, we found that movements made under externally imposed time limits were less accurate than movements made with the same timing freely selected by the mover. Constrained time allocation cost about 17% in expected gain. These results suggest that there is no single speed–accuracy tradeoff for movement in our task and that subjects pursued different motor strategies with distinct speed–accuracy tradeoffs in different conditions. PMID:20884550
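A minimal sketch of the ideal-time-allocator comparison implied above: split a fixed movement-time budget between two targets to maximize expected gain. The logistic speed-accuracy function and the rewards are hypothetical placeholders, not the fitted functions from the experiments:

```python
# Minimal sketch: choose the time split between two sequential movements that
# maximizes expected gain (reward times hit probability). The speed-accuracy
# function and rewards below are hypothetical placeholders.
import numpy as np

def hit_probability(t, t50=0.25, slope=20.0):
    """Assumed speed-accuracy tradeoff: more movement time -> higher hit rate."""
    return 1.0 / (1.0 + np.exp(-slope * (t - t50)))

def expected_gain(t1, total=0.7, rewards=(1.0, 4.0)):
    t2 = total - t1
    return rewards[0] * hit_probability(t1) + rewards[1] * hit_probability(t2)

t1_grid = np.linspace(0.05, 0.65, 601)
best = t1_grid[np.argmax([expected_gain(t) for t in t1_grid])]
print(f"Optimal time on target 1: {best:.3f} s of a 0.7 s budget")
# An ideal allocator gives more of the budget to the higher-valued second target.
```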
Essays on Causal Inference for Public Policy
ERIC Educational Resources Information Center
Zajonc, Tristan
2012-01-01
Effective policymaking requires understanding the causal effects of competing proposals. Relevant causal quantities include proposals' expected effect on different groups of recipients, the impact of policies over time, the potential trade-offs between competing objectives, and, ultimately, the optimal policy. This dissertation studies causal…
ERIC Educational Resources Information Center
Turk, Laraine D.
"Ancient Egypt," an upper-division, non-required history course covering Egypt from pre-dynastic time through the Roman domination is described. General descriptive information is presented first, including the method of grading, expectation of student success rate, long-range course objectives, procedures for revising the course, major…
32 CFR 651.49 - Preliminary phase.
Code of Federal Regulations, 2010 CFR
2010-07-01
... data, including required studies. (3) Preparation of draft and final EISs (DEISs and FEISs), and... relationship between the timing of the preparation of environmental analyses and the tentative planning and..., preparation of a general expected schedule for future specific implementing (tiered) actions that will involve...
32 CFR 651.49 - Preliminary phase.
Code of Federal Regulations, 2011 CFR
2011-07-01
... data, including required studies. (3) Preparation of draft and final EISs (DEISs and FEISs), and... relationship between the timing of the preparation of environmental analyses and the tentative planning and..., preparation of a general expected schedule for future specific implementing (tiered) actions that will involve...
... for the decision-making process required when it’s time to leave the hospital. What to Expect in Stroke Rehab: Following a stroke, about two-thirds of survivors receive some type of rehabilitation. In this second part of our two-part series, we want to alleviate some of the mystery, ...
PEM-PCA: a parallel expectation-maximization PCA face recognition architecture.
Rujirakul, Kanokmon; So-In, Chakchai; Arnonkijpanich, Banchar
2014-01-01
Principal component analysis (PCA) has traditionally been used as one of the feature extraction techniques in face recognition systems, yielding high accuracy while requiring only a small number of features. However, the covariance matrix and eigenvalue decomposition stages cause high computational complexity, especially for a large database. Thus, this research presents an alternative approach utilizing an Expectation-Maximization algorithm to reduce the determinant matrix manipulation, thereby reducing those stages' complexity. To improve the computation time, a novel parallel architecture was employed to exploit parallelization of matrix computation during the feature extraction and classification stages, including parallel preprocessing and their combinations, in a so-called Parallel Expectation-Maximization PCA (PEM-PCA) architecture. Compared to traditional PCA and its derivatives, the results indicate lower complexity with an insignificant difference in recognition precision, leading to high-speed face recognition systems; that is, speed-ups of over nine and three times relative to PCA and Parallel PCA, respectively.
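A minimal sketch of an expectation-maximization approach to PCA (a Roweis-style EM for the leading principal subspace), which avoids forming and decomposing the full covariance matrix; this is a generic serial illustration, not the paper's parallel PEM-PCA architecture:

```python
# Minimal EM-for-PCA sketch: iterate between projecting the data onto the
# current subspace estimate (E-step) and re-estimating the subspace from those
# projections (M-step). Generic illustration only.
import numpy as np

def em_pca(X, n_components, n_iters=100):
    """X: (n_samples, n_features). Returns an orthonormal basis of the
    leading principal subspace, shape (n_features, n_components)."""
    Xc = X - X.mean(axis=0)                      # center the data
    rng = np.random.default_rng(0)
    W = rng.standard_normal((X.shape[1], n_components))
    for _ in range(n_iters):
        # E-step: coordinates of the data in the current subspace
        Z = np.linalg.solve(W.T @ W, W.T @ Xc.T)          # (k, n)
        # M-step: re-estimate the subspace from the coordinates
        W = Xc.T @ Z.T @ np.linalg.inv(Z @ Z.T)           # (d, k)
    Q, _ = np.linalg.qr(W)                       # orthonormalize the basis
    return Q

X = np.random.default_rng(1).standard_normal((200, 50))
components = em_pca(X, n_components=5)
print(components.shape)    # -> (50, 5)
```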
Expected net present value of sample information: from burden to investment.
Hall, Peter S; Edlin, Richard; Kharroubi, Samer; Gregory, Walter; McCabe, Christopher
2012-01-01
The Expected Value of Information Framework has been proposed as a method for identifying when health care technologies should be immediately reimbursed and when any reimbursement should be withheld while awaiting more evidence. This framework assesses the value of obtaining additional evidence to inform a current reimbursement decision. This represents the burden of not having the additional evidence at the time of the decision. However, when deciding whether to reimburse now or await more evidence, decision makers need to know the value of investing in more research to inform a future decision. Assessing this value requires consideration of research costs, research time, and what happens to patients while the research is undertaken and after completion. The investigators describe a development of the calculation of the expected value of sample information that assesses the value of investing in further research, including an only-in-research strategy and an only-with-research strategy.
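One common textbook way to write the quantities involved (the symbols are generic and the net-benefit framing is an assumption, not the authors' notation):

```latex
% EVSI: value of a study with design D for a future decision among actions a,
% with uncertain parameters \theta, prospective data X, and net benefit NB(a,\theta).
\[
  \mathrm{EVSI}(D) \;=\;
  \mathbb{E}_{X \mid D}\!\Bigl[\max_{a}\ \mathbb{E}_{\theta \mid X}\,\mathrm{NB}(a,\theta)\Bigr]
  \;-\; \max_{a}\ \mathbb{E}_{\theta}\,\mathrm{NB}(a,\theta).
\]
% The "investment" view nets research cost against the population-scaled,
% discounted value realized only after the research reports (I_t: incident
% patients in year t, r: discount rate, T: decision horizon):
\[
  \mathrm{ENPVSI}(D) \;\approx\;
  \Bigl(\sum_{t=t_{\mathrm{report}}}^{T} \frac{I_t}{(1+r)^{t}}\Bigr)\,\mathrm{EVSI}(D)
  \;-\; C_{\mathrm{research}}(D).
\]
```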
Fairchild, Paige C; Nathan, Aviva G; Quinn, Michael; Huang, Elbert S; Laiteerapong, Neda
2017-01-01
Diabetes and hypertension are chronic conditions for which over 90 % of patients require medication regimens that must be intensified over time. However, delays in intensification are common, and may be partially due to unrealistic patient expectations. To explore whether patient expectations regarding their diabetes and hypertension are congruent with the natural history of these conditions. Qualitative analysis of semi-structured interviews. Sixty adults from an urban academic primary care clinic taking oral medications for both diabetes (duration <10 years) and hypertension (any duration). MAIN MEASURES: (1) Expectations for their (a) current diabetes and hypertension medications, (b) need for additional medications, and (c) likelihood of cure (not requiring medications); (2) preferences for receiving information on expected duration of treatments. KEY RESULTS: The average patient age was 60 years, and 65 % were women. Nearly half (48 %) of participants expected to discontinue current diabetes medications in 6 years or less, whereas only one-fifth (22 %) expected to take medications for life. For blood pressure medications, one-third (37 %) expected to stop medicines in 6 years or less, and one-third expected to take medicines for life. The vast majority did not expect that they would need additional medications in the future (oral diabetes medications: 85 %; insulin: 87 %; hypertension medications: 93 %). A majority expected that their diabetes (65 %) and hypertension (58 %) would be cured. Most participants believed that intensifying lifestyle changes would allow them to discontinue medications, avoid additional medications, or cure their diabetes and hypertension. Nearly all participants (97 %) wanted to hear information on the expected duration of their diabetes and hypertension treatments from their healthcare provider. Providers should educate patients on the natural history of diabetes and hypertension in order to manage patient expectations for current and future medications. Future research should assess whether education can increase the adoption of and adherence to medications, without diminishing enthusiasm for lifestyle changes.
Dreier, Adina; Rogalski, Hagen; Homeyer, Sabine; Oppermann, Roman Frank; Hingst, Peter; Hoffmann, Wolfgang
2015-10-01
The aging population causes a sustained increase in demand for medical and nursing care services. At the same time, health care professionals are aging, too. This leads to a growing number of health care gaps. Therefore, the health care system needs to be reformed. This includes a reallocation of tasks between some of the health care professions. This article addresses developments, potentials, and limitations in the context of the future allocation of tasks between the nursing and the medical profession. The aim is to specify the future task sharing between nurses and physicians regarding expectations, requirements, and limitations. We conducted questionnaire-based Delphi interviews (aggregation of ideas) with an interdisciplinary group of experts. In the experts' view, nurses will in the future take over routine tasks in medical and nursing health care. Task sharing by substitution is regarded with skepticism by experts. It requires a long-term perspective and an early involvement of all stakeholders. Germany is at the beginning of the process of future task sharing between nurses and physicians. Its realization requires comprehensive political support and further development of concepts, including scientific implementation and evaluation.
Evaluation and application of a fast module in a PLC based interlock and control system
NASA Astrophysics Data System (ADS)
Zaera-Sanz, M.
2009-08-01
The LHC Beam Interlock system requires a controller performing a simple matrix function to collect the different beam dump requests. To satisfy the expected safety level of the Interlock, the system should be robust and reliable. The PLC is a promising candidate to fulfil both aspects but is too slow to meet the expected response time, which is on the order of microseconds. Siemens has introduced a so-called fast module (FM352-5 Boolean Processor). It provides independent and extremely fast control of a process within a larger control system, using an onboard processor, a Field Programmable Gate Array (FPGA), to execute code in parallel, which results in extremely fast scan times. It is interesting to investigate its features and to evaluate it as a possible candidate for the beam interlock system. This paper publishes the results of this study. This paper could also be useful for other applications requiring fast processing using a PLC.
Eddy, Sean R.
2008-01-01
Sequence database searches require accurate estimation of the statistical significance of scores. Optimal local sequence alignment scores follow Gumbel distributions, but determining an important parameter of the distribution (λ) requires time-consuming computational simulation. Moreover, optimal alignment scores are less powerful than probabilistic scores that integrate over alignment uncertainty (“Forward” scores), but the expected distribution of Forward scores remains unknown. Here, I conjecture that both expected score distributions have simple, predictable forms when full probabilistic modeling methods are used. For a probabilistic model of local sequence alignment, optimal alignment bit scores (“Viterbi” scores) are Gumbel-distributed with constant λ = log 2, and the high scoring tail of Forward scores is exponential with the same constant λ. Simulation studies support these conjectures over a wide range of profile/sequence comparisons, using 9,318 profile-hidden Markov models from the Pfam database. This enables efficient and accurate determination of expectation values (E-values) for both Viterbi and Forward scores for probabilistic local alignments. PMID:18516236
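A minimal sketch of how those conjectured distributions turn scores into E-values; the location parameters mu and tau would be fitted per profile, and the values below are placeholders:

```python
# Minimal sketch: E-values under the conjectured distributions, i.e. a Gumbel
# for optimal (Viterbi) bit scores and an exponential right tail for Forward
# bit scores, both with lambda = ln 2. The location parameters mu and tau are
# per-profile placeholders here, not fitted values.
import math

LAMBDA = math.log(2.0)

def gumbel_pvalue(score, mu):
    """P(S > score) for a Gumbel-distributed Viterbi bit score."""
    return 1.0 - math.exp(-math.exp(-LAMBDA * (score - mu)))

def exp_tail_pvalue(score, tau):
    """P(S > score) for the exponential tail of Forward bit scores (score >= tau)."""
    return math.exp(-LAMBDA * (score - tau))

def e_value(pvalue, db_size):
    """Expected number of hits at least this good in a database of db_size sequences."""
    return db_size * pvalue

print(e_value(gumbel_pvalue(25.0, mu=-3.0), db_size=1e7))
print(e_value(exp_tail_pvalue(25.0, tau=-5.0), db_size=1e7))
```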
ERIC Educational Resources Information Center
Khowaja, Hina Amin
2017-01-01
This paper is an attempt to highlight the ways through which expectant parents can support their children's development from the time of conception until birth. This paper shares ideas with parents about the stimulation children require for their development. This paper is divided into three categories: first trimester, second…
Listing triangles in expected linear time on a class of power law graphs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordman, Daniel J.; Wilson, Alyson G.; Phillips, Cynthia Ann
Enumerating triangles (3-cycles) in graphs is a kernel operation for social network analysis. For example, many community detection methods depend upon finding common neighbors of two related entities. We consider Cohen's simple and elegant solution for listing triangles: give each node a 'bucket.' Place each edge into the bucket of its endpoint of lowest degree, breaking ties consistently. Each node then checks each pair of edges in its bucket, testing for the adjacency that would complete that triangle. Cohen presents an informal argument that his algorithm should run well on real graphs. We formalize this argument by providing an analysis for the expected running time on a class of random graphs, including power law graphs. We consider a rigorously defined method for generating a random simple graph, the erased configuration model (ECM). In the ECM each node draws a degree independently from a marginal degree distribution, endpoints pair randomly, and we erase self loops and multiedges. If the marginal degree distribution has a finite second moment, it follows immediately that Cohen's algorithm runs in expected linear time. Furthermore, it can still run in expected linear time even when the degree distribution has such a heavy tail that the second moment is not finite. We prove that Cohen's algorithm runs in expected linear time when the marginal degree distribution has finite 4/3 moment and no vertex has degree larger than √n. In fact we give the precise asymptotic value of the expected number of edge pairs per bucket. A finite 4/3 moment is required; if it is unbounded, then so is the number of pairs. The marginal degree distribution of a power law graph has bounded 4/3 moment when its exponent α is more than 7/3. Thus for this class of power law graphs, with degree at most √n, Cohen's algorithm runs in expected linear time. This is precisely the value of α for which the clustering coefficient tends to zero asymptotically, and it is in the range that is relevant for the degree distribution of the World-Wide Web.
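A minimal sketch of the bucketed triangle-listing algorithm described above (Cohen's method), assuming an in-memory adjacency-set representation:

```python
# Minimal sketch of the bucketed triangle-listing algorithm described above:
# each edge goes into the bucket of its lower-degree endpoint (ties broken by
# node id), and each node tests pairs of edges in its bucket for the closing edge.
from itertools import combinations

def list_triangles(adjacency):
    """adjacency: dict mapping node -> set of neighbors (undirected, simple graph).
    Returns the set of triangles as sorted tuples."""
    degree = {v: len(nbrs) for v, nbrs in adjacency.items()}

    def rank(v):
        return (degree[v], v)                    # consistent tie-breaking

    buckets = {v: [] for v in adjacency}
    for u in adjacency:
        for w in adjacency[u]:
            if rank(u) < rank(w):                # assign each edge once,
                buckets[u].append(w)             # to its lower-ranked endpoint

    triangles = set()
    for v, edges in buckets.items():
        for a, b in combinations(edges, 2):      # test pairs within the bucket
            if b in adjacency[a]:                # adjacency check closes the triangle
                triangles.add(tuple(sorted((v, a, b))))
    return triangles

graph = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
print(list_triangles(graph))   # -> {(0, 1, 2), (0, 2, 3)}
```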
NASA Technical Reports Server (NTRS)
Harp, J. L., Jr.; Oatway, T. P.
1975-01-01
A research effort was conducted with the goal of reducing the computer time required by a Navier-Stokes computer code for predicting viscous flow fields about lifting bodies. A two-dimensional, time-dependent, laminar, transonic computer code (STOKES) was modified to incorporate a non-uniform time-step procedure. The non-uniform time step requires updating a zone only as often as required by its own stability criteria or those of its immediate neighbors. In the uniform time-step scheme, each zone is updated as often as required by the least stable zone of the finite difference mesh. Because program variables are updated less frequently, the non-uniform time step was expected to reduce execution time by a factor of five to ten. Available funding was exhausted prior to successful demonstration of the benefits to be derived from the non-uniform time-step method.
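The expected saving comes purely from bookkeeping: zones with large stable time steps are simply touched less often. The sketch below, using a hypothetical mesh and ignoring the neighbor-coupling constraint mentioned above, compares the number of zone updates under uniform and non-uniform stepping; it is not the STOKES code.

```python
def count_zone_updates(stable_dts, t_end):
    """Compare update counts: uniform stepping (every zone advances at the globally
    smallest stable dt) versus non-uniform stepping (each zone advances at its own
    stable dt). Neighbor coupling is ignored, so this only illustrates the bookkeeping,
    not a full stability treatment."""
    dt_min = min(stable_dts)
    uniform = len(stable_dts) * int(t_end / dt_min)
    nonuniform = sum(int(t_end / dt) for dt in stable_dts)
    return uniform, nonuniform

# Hypothetical mesh: a few zones near the body need tiny steps, the rest do not
stable_dts = [1e-4] * 5 + [1e-3] * 45 + [5e-3] * 150
uniform, nonuniform = count_zone_updates(stable_dts, t_end=1.0)
print(uniform, nonuniform, round(uniform / nonuniform, 1))  # ratio ~ expected speedup
```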
Ruscio, Daniele; Ciceri, Maria Rita; Biassoni, Federica
2015-04-01
Brake reaction time (BRT) is an important parameter for road safety. Previous research has shown that drivers' expectations can affect reaction time when facing hazardous situations, but driving with advanced driver assistance systems can change the way BRTs are considered. Interaction with a collision warning system can support faster, more efficient responses, but at the same time it can require a monitoring task and evaluation process that may lead to automation complacency. The aims of the present study are to test in a real-life setting whether automation complacency can be generated by a collision warning system and which components of expectancy impact the different tasks involved in an assisted BRT process. More specifically, four components of expectancy were investigated: presence/absence of anticipatory information, previous direct experience, reliability of the device, and predictability of the hazard determined by repeated use of the warning system. Results provide indications on perception time and mental elaboration of the collision warning system alerts. In particular, reliable warnings quickened the decision-making process, misleading warnings generated automation complacency that slowed visual search for hazard detection, lack of direct experience slowed the overall response, and unexpected failure of the device led to inattentional blindness and potential pseudo-accidents with surprise obstacle intrusion. Copyright © 2015 Elsevier Ltd. All rights reserved.
Seed crop frequency in northeastern Wisconsin
Richard M. Godman; Gilbert A. Mattson
1992-01-01
Knowing the frequency of good seed crops is important in regenerating northern hardwood species, particularly those that require site preparation and special cutting methods. It is also desirable to know the maximum time that might be expected between poor crops to help schedule silvicultural treatment or supplemental seeding.
Strategic Planning Tools for Large-Scale Technology-Based Assessments
ERIC Educational Resources Information Center
Koomen, Marten; Zoanetti, Nathan
2018-01-01
Education systems are increasingly being called upon to implement new technology-based assessment systems that generate efficiencies, better meet changing stakeholder expectations, or fulfil new assessment purposes. These assessment systems require coordinated organisational effort to implement and can be expensive in time, skill and other…
Using Common Planning Time to Foster Professional Learning
ERIC Educational Resources Information Center
Dever, Robin; Lash, Martha J.
2013-01-01
Increased emphasis on meeting state standards, more stringent requirements for designation as highly qualified, and intensified accountability for student performance have foisted new expectations upon teachers and stimulated changes in professional development models in which the greater urgency is clearly to attend to the teacher's role as…
48 CFR 216.601 - Time-and-materials contracts.
Code of Federal Regulations, 2010 CFR
2010-10-01
...-Hour Proposal Requirements—Non-Commercial Item Acquisition with Adequate Price Competition, with 252... contract type for non-commercial items if the price is expected to be based on adequate competition. [71 FR... of the contract or order; establishing fixed prices for portions of the requirement); and (D...
2008-06-01
key assumption in the calculation of the primary MIW MOEs of the estimated risk to a transitor and the expected time required to clear all of the mines...primary MOE of Risk, or Probability of Damage to a Ship Transitor, is calculated by using information in the highlighted circle on the left, to include...percent clearance achieved. E(R) = Σ_{r=0}^∞ r ∗ Pr(R = r | m, p) (0.2) Risk can be calculated for each transitor given the expected number of
Baor, Liora; Soskolne, Varda
2010-06-01
This study explores the differences in prenatal maternal expectations, coping resources and maternal stress between first time mothers of IVF twins and first time mothers of spontaneously conceived twins. The role of prenatal maternal expectations in the prediction of maternal stress was examined, as well as the mediating and moderating effect of coping resources on the association between pregnancy-type group and maternal stress. Mothers of twins from various regions in Israel were included in this prospective and cross-sectional study in which 88 mothers of IVF-conceived twins and 98 mothers of spontaneously conceived twins were interviewed twice. First, at 33-36 weeks of their pregnancy they completed a socio-demographic questionnaire and the maternal expectations questionnaire; then at 6 months after birth they completed a questionnaire regarding the delivery and medical condition of the infants, and their coping resources and maternal stress. Compared with mothers who conceived spontaneously, IVF mothers had more positive prenatal maternal expectations, but poorer coping resources and higher levels of maternal stress 6 months after birth. Maternal expectations had no predictive power regarding maternal stress, although the mother's coping resources were significantly related to maternal stress and mediated the association between pregnancy type and maternal stress. IVF-pregnant women bearing twins should be considered a high-risk group. Early identification of these mothers is essential for timely psychosocial interventions in order to enhance their resources and decrease maternal stress. Further longitudinal studies are required to determine causality in more ethnically-diverse mothers of twins.
Coping with a Breast Cancer Diagnosis
... the final decisions should be made together. Knowing what to expect is another way to feel in control. It may also help to keep as normal a routine as possible. Be patient. Coping with breast cancer requires time, acceptance, a fighting spirit and support. Many people also find strength in ...
46 CFR 151.50-13 - Propylene oxide.
Code of Federal Regulations, 2012 CFR
2012-10-01
...) Pressure vessel cargo tanks shall meet the requirements of Class II pressure vessels. (2) Cargo tanks shall be designed for the maximum pressure expected to be encountered during loading, storing and... cargo piping shall be subjected to a hydrostatic test of 1 1/2 times the maximum pressure to which they...
46 CFR 151.50-13 - Propylene oxide.
Code of Federal Regulations, 2014 CFR
2014-10-01
...) Pressure vessel cargo tanks shall meet the requirements of Class II pressure vessels. (2) Cargo tanks shall be designed for the maximum pressure expected to be encountered during loading, storing and... cargo piping shall be subjected to a hydrostatic test of 1 1/2 times the maximum pressure to which they...
46 CFR 151.50-13 - Propylene oxide.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) Pressure vessel cargo tanks shall meet the requirements of Class II pressure vessels. (2) Cargo tanks shall be designed for the maximum pressure expected to be encountered during loading, storing and... cargo piping shall be subjected to a hydrostatic test of 1 1/2 times the maximum pressure to which they...
46 CFR 151.50-13 - Propylene oxide.
Code of Federal Regulations, 2013 CFR
2013-10-01
...) Pressure vessel cargo tanks shall meet the requirements of Class II pressure vessels. (2) Cargo tanks shall be designed for the maximum pressure expected to be encountered during loading, storing and... cargo piping shall be subjected to a hydrostatic test of 1 1/2 times the maximum pressure to which they...
46 CFR 151.50-13 - Propylene oxide.
Code of Federal Regulations, 2011 CFR
2011-10-01
...) Pressure vessel cargo tanks shall meet the requirements of Class II pressure vessels. (2) Cargo tanks shall be designed for the maximum pressure expected to be encountered during loading, storing and... cargo piping shall be subjected to a hydrostatic test of 1 1/2 times the maximum pressure to which they...
Assessing and quantifying changes in precipitation patterns using event-driven analysis
USDA-ARS?s Scientific Manuscript database
Studies have claimed that climate change may adversely affect precipitation patterns by increasing the occurrence of extreme events. The effects of climate change on precipitation are expected to take place over a long period of time and will require long-term data to demonstrate. Frequency analysis ...
40 CFR 55.6 - Permit requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Administrator through the EPA Regional Office. (iii) The delegated agency shall send a copy of any preliminary... through the EPA Regional Office at the time of the determination and shall make available to the... or is expected in the future to cause or contribute to a violation of any applicable State or Federal...
40 CFR 55.6 - Permit requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Administrator through the EPA Regional Office. (iii) The delegated agency shall send a copy of any preliminary... through the EPA Regional Office at the time of the determination and shall make available to the... or is expected in the future to cause or contribute to a violation of any applicable State or Federal...
40 CFR 55.6 - Permit requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Administrator through the EPA Regional Office. (iii) The delegated agency shall send a copy of any preliminary... through the EPA Regional Office at the time of the determination and shall make available to the... or is expected in the future to cause or contribute to a violation of any applicable State or Federal...
40 CFR 55.6 - Permit requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Administrator through the EPA Regional Office. (iii) The delegated agency shall send a copy of any preliminary... through the EPA Regional Office at the time of the determination and shall make available to the... or is expected in the future to cause or contribute to a violation of any applicable State or Federal...
40 CFR 55.6 - Permit requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Administrator through the EPA Regional Office. (iii) The delegated agency shall send a copy of any preliminary... through the EPA Regional Office at the time of the determination and shall make available to the... or is expected in the future to cause or contribute to a violation of any applicable State or Federal...
Alcohol and Staff Leisure Time.
ERIC Educational Resources Information Center
Camping Magazine, 1992
1992-01-01
Discusses the problem of alcohol use and abuse by camp staff. Describes alcohol policies of two different camps. Camp Highlands allows responsible drinking but not intoxication. Camp Olympia requires total abstinence from alcohol. A policy that clearly expresses the camp's philosophy toward alcohol and spells out all expectations and results is…
36 CFR 1600.3 - Requests for records.
Code of Federal Regulations, 2011 CFR
2011-07-01
... FOIA does not require the Foundation to: (1) Compile or create records solely for the purpose of satisfying a request for records; (2) Provide records not yet in existence, even if such records may be expected to come into existence at some future time; or (3) Restore records destroyed or otherwise disposed...
A promising new thermoelectric material - Ruthenium silicide
NASA Technical Reports Server (NTRS)
Vining, Cronin B.; Mccormack, Joseph A.; Zoltan, Andrew; Zoltan, Leslie D.
1991-01-01
Experimental and theoretical efforts directed toward increasing thermoelectric figure of merit values by a factor of 2 or 3 have been encouraging in several respects. An accurate and detailed theoretical model developed for n-type silicon-germanium (SiGe) indicates that ZT values several times higher than currently available are expected under certain conditions. These new, high ZT materials are expected to be significantly different from SiGe, but not unreasonably so. Several promising candidate materials have been identified which may meet the conditions required by theory. One such candidate, ruthenium silicide, currently under development at JPL, has been estimated to have the potential to exhibit figure of merit values 4 times higher than conventional SiGe materials. Recent results are summarized.
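For reference, the dimensionless figure of merit quoted here is ZT = S²σT/κ; the sketch below evaluates it for illustrative, order-of-magnitude inputs rather than measured ruthenium silicide or SiGe data.

```python
def figure_of_merit_zt(seebeck_v_per_k, elec_conductivity_s_per_m,
                       thermal_conductivity_w_per_mk, temperature_k):
    """Dimensionless thermoelectric figure of merit ZT = S^2 * sigma * T / kappa."""
    power_factor = seebeck_v_per_k ** 2 * elec_conductivity_s_per_m
    return power_factor * temperature_k / thermal_conductivity_w_per_mk

# Illustrative numbers only (roughly SiGe-like in order of magnitude, not measured values)
print(round(figure_of_merit_zt(200e-6, 1.0e5, 4.0, 1000.0), 2))
```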
Updating the Inductee Delivery Schedule.
1987-03-01
deployed forces at risk with the anticipated opposing forces for the expected level of combat intensity. An estimate of the number of individuals who...identification of shortfalls in critical skills. It prescribes the anticipation of requirements and return of personnel resources to military control as...with the Time Phased Force Deployment Data lists the forces that will be deployed over time. Each unit is then assigned to a risk group (forces
Simulation Modeling of Software Development Processes
NASA Technical Reports Server (NTRS)
Calavaro, G. F.; Basili, V. R.; Iazeolla, G.
1996-01-01
A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
Perspective: Hope for Health Equity
Rust, George
2017-01-01
Times like these test the soul. We are now working for health equity in a time of overt, aggressive opposition. Yet, hope in the face of overwhelming obstacles is the force that has driven most of the world’s progress toward equity and justice. Operationalizing real-world hope requires an affirmative vision, an expectation of success, broad coalitions taking action cohesively, and frequent measures of collective impact to drive rapid-cycle improvement. PMID:28439181
Perspective: Hope for Health Equity.
Rust, George
2017-01-01
Times like these test the soul. We are now working for health equity in a time of overt, aggressive opposition. Yet, hope in the face of overwhelming obstacles is the force that has driven most of the world's progress toward equity and justice. Operationalizing real-world hope requires an affirmative vision, an expectation of success, broad coalitions taking action cohesively, and frequent measures of collective impact to drive rapid-cycle improvement.
Physics objectives of PI3 spherical tokamak program
NASA Astrophysics Data System (ADS)
Howard, Stephen; Laberge, Michel; Reynolds, Meritt; O'Shea, Peter; Ivanov, Russ; Young, William; Carle, Patrick; Froese, Aaron; Epp, Kelly
2017-10-01
Achieving net energy gain with a Magnetized Target Fusion (MTF) system requires the initial plasma state to satisfy a set of performance goals, such as particle inventory (10^21 ions), sufficient magnetic flux (0.3 Wb) to confine the plasma without MHD instability, and initial energy confinement time several times longer than the compression time. General Fusion (GF) is now constructing Plasma Injector 3 (PI3) to explore the physics of reactor-scale plasmas. Energy considerations lead us to design around an initial state of R_vessel = 1 m. PI3 will use fast coaxial helicity injection via a Marshall gun to create a spherical tokamak plasma, with no additional heating. MTF requires solenoid-free startup with no vertical field coils, and will rely on flux conservation by a metal wall. PI3 is 5x larger than SPECTOR so is expected to yield a magnetic lifetime increase of 25x, while the peak temperature of PI3 is expected to be similar (400-500 eV). Physics investigations will study MHD activity and the resistive and convective evolution of current, temperature and density profiles. We seek to understand the confinement physics, radiative loss, thermal and particle transport, recycling and edge physics of PI3.
Surgery clerkship orientation: evaluating temporal changes in student orientation needs.
O'Neill, Conor; Moore, Jesse; Callas, Peter
2016-08-01
Surgery clerkship students at our institution receive a standardized orientation covering objectives, requirements, grading, and expectations. Limited data exist regarding student perceptions of this approach. Surveys were provided to students to rate the importance of orientation topics and their satisfaction with topic conclusion. Scores between student groupings over the clerkship year were analyzed with Student t tests and analysis of variance with Scheffe adjustments. Significant differences exist in the mean importance rating between topics (P < .0001) as well as among satisfaction scores for topics (P < .0005). Early clerkship students value course expectations higher than later students (P = .03). Early clerkship students want more time devoted to hospital tours and expectations compared with later students (31% vs 8%). Orientation needs for students change over the clerkship year. Beginning students prefer basic direction for time spent on the ward. Later students prefer information regarding shelf preparation. Surgery course directors can adapt the orientation based on the experience of clerkship students. Copyright © 2016 Elsevier Inc. All rights reserved.
Buljovčić, Z
2011-07-01
On 30 December 2008, Regulation (EC) 1394/2007 on advanced therapy medicinal products (ATMPs) entered into force, establishing the first EU-wide regulatory framework for ATMPs. It requires a central marketing authorisation application to the EMA (European Medicines Agency). This new framework especially changes the code of regulatory practice for tissue engineered products (TEPs), as no registration procedure had previously been required for autologous TEPs; this also meant that no clinical proof of efficacy from a pivotal clinical trial was necessary. Difficulties and their background, as well as the vast requirements for product development that have to be addressed by small companies within a very short time frame, are presented. It is apparent that the regulatory experience required to identify and implement the resulting implications was not yet in place and still had to be established. The lack of regulatory experience also resulted in difficulties with scientific advice preparation, expectations toward regulatory agencies, consultants, and the transformation of regulatory requirements. Addressing the regulatory requirements within the transition period is even more difficult for entrepreneurs whose products are assigned to indications that pose complex challenges to trial design. Due to the enormous time pressure to generate data and the implied financial pressure, different adaptation strategies are evolving. In Germany the "hospital exemption" according to §4b AMG (German Medicinal Products Law) is of major importance. A reorientation toward acellular products and a slowdown in the development of new ATMP products is expected.
β-hCG resolution times during expectant management of tubal ectopic pregnancies.
Mavrelos, D; Memtsa, M; Helmy, S; Derdelis, G; Jauniaux, E; Jurkovic, D
2015-05-21
A subset of women with a tubal ectopic pregnancy can be safely managed expectantly. Expectant management involves a degree of disruption with hospital visits to determine serum β-hCG (β-human chorionic gonadotrophin) concentration until the pregnancy test becomes negative and expectant management is considered complete. The length of time required for the pregnancy test to become negative and the parameters that influence this interval have not been described. Information on the likely length of follow up would be useful for women considering expectant management of their tubal ectopic pregnancy. This was a retrospective study at a tertiary referral center in an inner city London Hospital. We included women who were diagnosed with a tubal ectopic pregnancy by transvaginal ultrasound between March 2009 and March 2014. During the study period 474 women were diagnosed with a tubal ectopic pregnancy and 256 (54 %) of them fulfilled our management criteria for expectant management. A total of 158 (33 %) women had successful expectant management and in those cases we recorded the diameter of the ectopic pregnancy (mm), the maximum serum β-hCG (IU/L) and levels during follow up until resolution as well as the interval to resolution (days). The median interval from maximum serum β-hCG concentration to resolution was 18.0 days (IQR 11.0-28.0). The maximum serum β-hCG concentration and the rate of decline of β-hCG were independently associated with the length of follow up. Women's age and size of ectopic pregnancy did not have significant effects on the length of follow up. Women undergoing expectant management of ectopic pregnancy can be informed that the likely length of follow up is under 3 weeks and that it positively correlates with initial β-hCG level at the time of diagnosis.
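To illustrate why both the maximum β-hCG and its rate of decline drive the length of follow-up, the toy calculation below assumes a constant exponential decline to a fixed pregnancy-test threshold; the model, the threshold and the numbers are illustrative and were not fitted in the study.

```python
import math

def days_to_negative(hcg_max, daily_decline_fraction, negative_threshold=5.0):
    """Toy estimate of follow-up length assuming a constant exponential decline:
    hCG(t) = hcg_max * (1 - daily_decline_fraction) ** t, solved for the day the level
    drops below the pregnancy-test threshold (threshold value is illustrative)."""
    k = -math.log(1.0 - daily_decline_fraction)  # continuous decay rate per day
    return math.log(hcg_max / negative_threshold) / k

# e.g. a peak of 1,500 IU/L declining roughly 25% per day (illustrative numbers only)
print(round(days_to_negative(1500, 0.25), 1))
```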
A user friendly database for use in ALARA job dose assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zodiates, A.M.; Willcock, A.
1995-03-01
The pressurized water reactor (PWR) design chosen for adoption by Nuclear Electric plc was based on the Westinghouse Standard Nuclear Unit Power Plant (SNUPPS). This design was developed to meet the United Kingdom requirements and these improvements are embodied in the Sizewell B plant which will start commercial operation in 1994. A user-friendly database was developed to assist the station in the dose and ALARP assessments of the work expected to be carried out during station operation and outage. The database stores the information in an easily accessible form and enables updating, editing, retrieval, and searches of the information. The database contains job-related information such as job locations, number of workers required, job times, and the expected plant doserates. It also contains the means to flag job requirements such as requirements for temporary shielding, flushing, scaffolding, etc. Typical uses of the database are envisaged to be in the prediction of occupational doses, the identification of high collective and individual dose jobs, use in ALARP assessments, setting of dose targets, monitoring of dose control performance, and others.
First Contact: Expectations of Beginning Astronomy Students
NASA Astrophysics Data System (ADS)
Lacey, T. L.; Slater, T. F.
1999-05-01
Three hundred seven undergraduate students enrolled in Introductory Astronomy were surveyed at the beginning of class to determine their expectations for course content. The course serves as a survey of astronomy for non-science majors and is a distribution course for general education core requirements. The course has no prerequisites, meets three times each week for 50 minutes, and represents three semester credit hours. The university catalog describes the course with the title "PHYSICS 101 - Mysteries of the Sky" and the official course description is: a survey of the struggle to understand the Universe and our place therein. The structure, growth, methods, and limitations of science will be illustrated using the development of astronomy as a vehicle. Present day views of the Universe are presented. Two questions were asked as open response items: What made you decide to take this course? and What do you expect to learn in this course? The reasons students cited for taking the course, in order of frequency, were: interest in astronomy, an interesting or fun-sounding course, required general education fulfillment, and recommendation by a peer. Secondary reasons cited were required for major or minor, general interest in science, and availability in the schedule. Tertiary reasons listed were recommendation by an advisor or orientation leader, inflating grade point average, and having heard good things about the teacher. The students' expectations about what they would learn in the course were numerous. The most common objects listed, in order of frequency, were: stars, constellations, planets, galaxies, black holes, the solar system, comets, asteroids, the moon, and the Sun. More interesting were the aspects not specifically related to astronomy. These were weather, the atmosphere, UFOs and the unexplained, and generally things in the sky. A mid-course survey suggests that students expected to learn more constellations and that the topics would be less in-depth.
Rabideau, Dustin J; Pei, Pamela P; Walensky, Rochelle P; Zheng, Amy; Parker, Robert A
2018-02-01
The expected value of sample information (EVSI) can help prioritize research but its application is hampered by computational infeasibility, especially for complex models. We investigated an approach by Strong and colleagues to estimate EVSI by applying generalized additive models (GAM) to results generated from a probabilistic sensitivity analysis (PSA). For 3 potential HIV prevention and treatment strategies, we estimated life expectancy and lifetime costs using the Cost-effectiveness of Preventing AIDS Complications (CEPAC) model, a complex patient-level microsimulation model of HIV progression. We fitted a GAM, a flexible regression model that estimates the functional form as part of the model fitting process, to the incremental net monetary benefits obtained from the CEPAC PSA. For each case study, we calculated the expected value of partial perfect information (EVPPI) using both the conventional nested Monte Carlo approach and the GAM approach. EVSI was calculated using the GAM approach. For all 3 case studies, the GAM approach consistently gave similar estimates of EVPPI compared with the conventional approach. The EVSI behaved as expected: it increased and converged to EVPPI for larger sample sizes. For each case study, generating the PSA results for the GAM approach required 3 to 4 days on a shared cluster, after which EVPPI and EVSI across a range of sample sizes were evaluated in minutes. The conventional approach required approximately 5 weeks for the EVPPI calculation alone. Estimating EVSI using the GAM approach with results from a PSA dramatically reduced the time required to conduct a computationally intense project, which would otherwise have been impractical. Using the GAM approach, we can efficiently provide policy makers with EVSI estimates, even for complex patient-level microsimulation models.
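A minimal sketch of the regression-based EVPPI estimator described above, using simulated PSA output and a polynomial regression as a stand-in for the GAM smoother; the data, strategies and parameter are invented for illustration and are unrelated to the CEPAC model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated PSA output: one uncertain parameter theta and net monetary benefit
# for two strategies (purely illustrative, not CEPAC output)
n = 5000
theta = rng.normal(0.0, 1.0, n)
nmb_a = 100 + 40 * theta + rng.normal(0, 30, n)        # strategy A
nmb_b = 110 + 10 * theta**2 + rng.normal(0, 30, n)     # strategy B
nmb = np.column_stack([nmb_a, nmb_b])

def fitted_conditional_mean(x, y, degree=4):
    """Stand-in smoother: polynomial regression approximating E[NMB | theta].
    Strong and colleagues use a GAM; any flexible regression plays the same role."""
    coeffs = np.polyfit(x, y, degree)
    return np.polyval(coeffs, x)

# Fitted conditional means for each strategy given theta
ghat = np.column_stack([fitted_conditional_mean(theta, nmb[:, k]) for k in range(2)])

# EVPPI = E_theta[ max_k E[NMB_k | theta] ] - max_k E[NMB_k]
evppi = ghat.max(axis=1).mean() - nmb.mean(axis=0).max()
print(round(evppi, 2))
```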
A bootstrap based space-time surveillance model with an application to crime occurrences
NASA Astrophysics Data System (ADS)
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, using population at risk data to generate expected values, have resulting hotspots bounded by administrative area units and are of limited use for near-real time applications because of the population data needed. However, this study generates expected values for local hotspots from past occurrences rather than population at risk. Also, bootstrap permutations of previous occurrences are used for significant tests. Consequently, the bootstrap-based model, without the requirement of population at risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry database, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, year 2000.
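A per-location sketch of the resampling idea, assuming the baseline for a cell is its own past occurrence counts; the full model works on space-time cylinders and handles multiple testing, which this toy version omits.

```python
import numpy as np

rng = np.random.default_rng(1)

def emerging_hotspot_pvalue(past_counts, current_count, n_boot=999):
    """Sketch of a bootstrap surveillance test for one location: the expected value
    comes from past occurrences (not population at risk), and significance comes
    from resampling those past counts with replacement."""
    past = np.asarray(past_counts, dtype=float)
    boot_means = np.array([rng.choice(past, size=past.size, replace=True).mean()
                           for _ in range(n_boot)])
    # one-sided p-value: how often a resampled baseline is at least as extreme
    return (1 + np.sum(boot_means >= current_count)) / (n_boot + 1)

# Hypothetical weekly crime counts for one grid cell, followed by a suspect spike
history = [3, 2, 4, 3, 5, 2, 3, 4, 3, 2]
print(emerging_hotspot_pvalue(history, current_count=9))
```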
Nilsson, Niclas; Minssen, Timo
2018-04-01
A common understanding of expectations and requirements is critical for boosting research-driven business opportunities in open innovation (OI) settings. Transparent communication requires common definitions and standards for OI to align the expectations of both parties. Here, we suggest a five-level classification system for OI models, reflecting the degree of openness. The aim of this classification system is to reduce contract negotiation complexity and times between two parties looking to engage in OI. Systematizing definitions and contractual terms for OI in the life sciences helps to reduce entry barriers and boosts collaborative value generation. By providing a contractual framework with predefined rules, science will be allowed to move more freely, thus maximizing the potential of OI. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.
26 CFR 1.460-2 - Long-term manufacturing contracts.
Code of Federal Regulations, 2010 CFR
2010-04-01
... time normally required to design and manufacture the first unit of an item for which the taxpayer... manufacture a new type of industrial equipment. C reasonably expects the normal production period for this... 26 Internal Revenue 6 2010-04-01 2010-04-01 false Long-term manufacturing contracts. 1.460-2...
Learning Languages: Any Place, Any Time
ERIC Educational Resources Information Center
Day, Allyson
2014-01-01
The job description for a homeschool Spanish teacher has many of the requirements one would expect. The candidate must be fluent in Spanish, love working with children, be willing to work with all kinds of learners, communicate well with parents and students, and have great enthusiasm for teaching. The Home School Assistance Program (HSAP) classes…
ERIC Educational Resources Information Center
Thelen, Anja
2015-01-01
Enterprise Resource Planning (ERP) implementations are expensive, time-consuming, and often do not lead to the expected outcome of integrated IT systems. Many German universities are implementing ERP systems as Campus Management Systems (CMS) and a solution to any problem, need, or requirement the organization has. This exploratory case study…
Stocking the Toolbox: Ideas for Successful Facility Management
ERIC Educational Resources Information Center
Gadzikowski, Ann
2005-01-01
From snow removal to dishwasher repair, from pest control to playground renovations, there are countless demands on a child care director's time and attention. A child care director is required to juggle a wide variety of roles and expectations related to facility management, often with very little training or expertise in this area. Some child…
Analysis of Alternatives (AoA) Process Improvement Study
2016-12-01
stakeholders, and mapped the process activities and durations. We tasked the SAG members with providing the information required on case studies and...are the expected time savings/cost/risk of any changes? (3) Utilization of case studies for both “good” and “challenged” AoAs to identify lessons...
Reconsidering the Promise of Systemwide Innovation for Urban Districts.
ERIC Educational Resources Information Center
Hess, Frederick
1998-01-01
Many of the problems school reform is expected to solve are aggravated by the ways schools use reform. Macro-level innovations are rarely designed to work. Meaningful change requires time and focus. Reforms may be more about politics than about change. A constant churn of policy change and reform has become the norm. (SK)
Application of quadratic optimization to supersonic inlet control.
NASA Technical Reports Server (NTRS)
Lehtinen, B.; Zeller, J. R.
1972-01-01
This paper describes the application of linear stochastic optimal control theory to the design of the control system for the air intake, the inlet, of a supersonic air-breathing propulsion system. The controls must maintain a stable inlet shock position in the presence of random airflow disturbances and prevent inlet unstart. Two different linear time invariant controllers are developed. One is designed to minimize a nonquadratic index, the expected frequency of inlet unstart, and the other is designed to minimize the mean square value of inlet shock motion. The quadratic equivalence principle is used to obtain a linear controller that minimizes the nonquadratic index. The two controllers are compared on the basis of unstart prevention, control effort requirements, and frequency response. It is concluded that while controls designed to minimize unstarts are desirable in that the index minimized is physically meaningful, computation time required is longer than for the minimum mean square shock position approach. The simpler minimum mean square shock position solution produced expected unstart frequency values which were not significantly larger than those of the nonquadratic solution.
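A minimal sketch of the quadratic (minimum mean-square) design step, assuming a hypothetical second-order linearized shock-position model and scipy's Riccati solver; the matrices are illustrative and are not the paper's inlet model, and the nonquadratic unstart-frequency design is not shown.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical linearized model: states = [shock position, shock velocity],
# input = bypass-door command (illustrative numbers, not the paper's inlet model)
A = np.array([[0.0, 1.0],
              [-4.0, -1.5]])
B = np.array([[0.0],
              [2.0]])

# Quadratic weights: penalize mean-square shock motion and control effort
Q = np.diag([10.0, 1.0])
R = np.array([[0.5]])

# Solve the continuous-time algebraic Riccati equation and form the state feedback
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)          # u = -K x minimizes the quadratic cost
print(K)
print(np.linalg.eigvals(A - B @ K))      # closed-loop poles (negative real parts)
```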
Magnetic Field Diagnostics and Spatio-Temporal Variability of the Solar Transition Region
NASA Astrophysics Data System (ADS)
Peter, H.
2013-12-01
Magnetic field diagnostics of the transition region from the chromosphere to the corona faces us with the problem that one has to apply extreme-ultraviolet (EUV) spectro-polarimetry. While for the coronal diagnostics techniques already exist in the form of infrared coronagraphy above the limb and radio observations on the disk, one has to investigate EUV observations for the transition region. However, so far the success of such observations has been limited, but various current projects aim to obtain spectro-polarimetric data in the extreme UV in the near future. Therefore it is timely to study the polarimetric signals we can expect from these observations through realistic forward modeling. We employ a 3D magneto-hydrodynamic (MHD) forward model of the solar corona and synthesize the Stokes I and Stokes V profiles of C iv (1548 Å). A signal well above 0.001 in Stokes V can be expected even if one integrates for several minutes to reach the required signal-to-noise ratio, and despite the rapidly changing intensity in the model (just as in observations). This variability of the intensity is often used as an argument against transition region magnetic diagnostics, which requires exposure times of minutes. However, the magnetic field is evolving much slower than the intensity, and therefore the degree of (circular) polarization remains rather constant when one integrates in time. Our study shows that it is possible to measure the transition region magnetic field if a polarimetric accuracy on the order of 0.001 can be reached, which we can expect from planned instrumentation.
Morton, Rachael L.; Snelling, Paul; Webster, Angela C.; Rose, John; Masterson, Rosemary; Johnson, David W.; Howard, Kirsten
2012-01-01
Background: For every patient with chronic kidney disease who undergoes renal-replacement therapy, there is one patient who undergoes conservative management of their disease. We aimed to determine the most important characteristics of dialysis and the trade-offs patients were willing to make in choosing dialysis instead of conservative care. Methods: We conducted a discrete choice experiment involving adults with stage 3–5 chronic kidney disease from eight renal clinics in Australia. We assessed the influence of treatment characteristics (life expectancy, number of visits to the hospital per week, ability to travel, time spent undergoing dialysis [i.e., time spent attached to a dialysis machine per treatment, measured in hours], time of day at which treatment occurred, availability of subsidized transport and flexibility of the treatment schedule) on patients’ preferences for dialysis versus conservative care. Results: Of 151 patients invited to participate, 105 completed our survey. Patients were more likely to choose dialysis than conservative care if dialysis involved an increased average life expectancy (odds ratio [OR] 1.84, 95% confidence interval [CI] 1.57–2.15), if they were able to dialyse during the day or evening rather than during the day only (OR 8.95, 95% CI 4.46–17.97), and if subsidized transport was available (OR 1.55, 95% CI 1.24–1.95). Patients were less likely to choose dialysis over conservative care if an increase in the number of visits to hospital was required (OR 0.70, 95% CI 0.56–0.88) and if there were more restrictions on their ability to travel (OR = 0.47, 95%CI 0.36–0.61). Patients were willing to forgo 7 months of life expectancy to reduce the number of required visits to hospital and 15 months of life expectancy to increase their ability to travel. Interpretation: Patients approaching end-stage kidney disease are willing to trade considerable life expectancy to reduce the burden and restrictions imposed by dialysis. PMID:22311947
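The stated trade-offs can be recovered from the reported odds ratios as marginal rates of substitution, assuming the life-expectancy odds ratio of 1.84 is per additional year of average life expectancy; that per-year scaling is an assumption here, not something stated in the excerpt.

```python
import math

OR_LIFE_PER_YEAR = 1.84  # assumed to be per additional year of life expectancy

def months_traded(or_attribute):
    """Months of life expectancy forgone to offset one unit of the attribute,
    computed as the ratio of log odds ratios, converted from years to months."""
    return 12.0 * abs(math.log(or_attribute)) / math.log(OR_LIFE_PER_YEAR)

print(round(months_traded(0.70), 1))  # extra hospital visit per week -> ~7 months
print(round(months_traded(0.47), 1))  # greater travel restriction    -> ~15 months
```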
2014-09-01
we contacted pointed out that their catchment area covers 147,000 square miles, and some of their caregivers live over 8 hours away, requiring...geographical area. A caregiver whose veteran is rated tier 2 receives the equivalent of 25 hours per week of the wage for a home health aide, and a...location we contacted told us that home visits to remote areas require long driving times, which are challenging to accommodate. Staff at one VAMC
Disaster imminent--Hurricane Hugo.
Guynn, J B
1990-04-01
Response to a disaster situation depends upon the type of circumstances presented. In situations where the disaster is the type that affects the hospital as well as a wide surrounding area directly, the hospital and pharmacy itself may be called upon to continue functioning for some period of time without outside assistance. The ability to function for prolonged periods of time requires the staff to focus on the job at hand and the administrative staff to provide security, compassion, and flexibility. Plans for a disaster of the nature of a hurricane require that attention be paid to staffing, medication inventories, supplies, and services being rendered. Recognition of the singular position occupied by a hospital in the community and the expectations of the local population require that hospitals and the pharmacy department have the ability to respond appropriately.
Neuroadaptive technology enables implicit cursor control based on medial prefrontal cortex activity.
Zander, Thorsten O; Krol, Laurens R; Birbaumer, Niels P; Gramann, Klaus
2016-12-27
The effectiveness of today's human-machine interaction is limited by a communication bottleneck as operators are required to translate high-level concepts into a machine-mandated sequence of instructions. In contrast, we demonstrate effective, goal-oriented control of a computer system without any form of explicit communication from the human operator. Instead, the system generated the necessary input itself, based on real-time analysis of brain activity. Specific brain responses were evoked by violating the operators' expectations to varying degrees. The evoked brain activity demonstrated detectable differences reflecting congruency with or deviations from the operators' expectations. Real-time analysis of this activity was used to build a user model of those expectations, thus representing the optimal (expected) state as perceived by the operator. Based on this model, which was continuously updated, the computer automatically adapted itself to the expectations of its operator. Further analyses showed this evoked activity to originate from the medial prefrontal cortex and to exhibit a linear correspondence to the degree of expectation violation. These findings extend our understanding of human predictive coding and provide evidence that the information used to generate the user model is task-specific and reflects goal congruency. This paper demonstrates a form of interaction without any explicit input by the operator, enabling computer systems to become neuroadaptive, that is, to automatically adapt to specific aspects of their operator's mindset. Neuroadaptive technology significantly widens the communication bottleneck and has the potential to fundamentally change the way we interact with technology.
Functional requirements for reward-modulated spike-timing-dependent plasticity.
Frémaux, Nicolas; Sprekeler, Henning; Gerstner, Wulfram
2010-10-06
Recent experiments have shown that spike-timing-dependent plasticity is influenced by neuromodulation. We derive theoretical conditions for successful learning of reward-related behavior for a large class of learning rules where Hebbian synaptic plasticity is conditioned on a global modulatory factor signaling reward. We show that all learning rules in this class can be separated into a term that captures the covariance of neuronal firing and reward and a second term that represents the influence of unsupervised learning. The unsupervised term, which is, in general, detrimental for reward-based learning, can be suppressed if the neuromodulatory signal encodes the difference between the reward and the expected reward, but only if the expected reward is calculated for each task and stimulus separately. If several tasks are to be learned simultaneously, the nervous system needs an internal critic that is able to predict the expected reward for arbitrary stimuli. We show that, with a critic, reward-modulated spike-timing-dependent plasticity is capable of learning motor trajectories with a temporal resolution of tens of milliseconds. The relation to temporal difference learning, the relevance of block-based learning paradigms, and the limitations of learning with a critic are discussed.
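A minimal sketch of the rule class discussed: a Hebbian eligibility trace gated by reward minus a per-stimulus expected reward maintained by a simple critic. The constants, the coincidence-based Hebbian term and the toy task are illustrative, not the paper's derivation.

```python
class RStdpSynapse:
    """Minimal sketch: an eligibility trace accumulates pre/post coincidences, and
    the weight changes only when gated by (reward - expected_reward)."""

    def __init__(self, lr=0.01, trace_decay=0.9):
        self.w = 0.5
        self.trace = 0.0
        self.lr = lr
        self.trace_decay = trace_decay
        self.expected_reward = {}  # per-stimulus critic, learned online

    def step(self, pre_spike, post_spike):
        # Hebbian term: here simply the coincidence of pre and post spikes
        self.trace = self.trace_decay * self.trace + float(pre_spike and post_spike)

    def deliver_reward(self, stimulus, reward, critic_lr=0.1):
        baseline = self.expected_reward.get(stimulus, 0.0)
        self.w += self.lr * (reward - baseline) * self.trace  # modulated update
        self.expected_reward[stimulus] = baseline + critic_lr * (reward - baseline)
        self.trace = 0.0

syn = RStdpSynapse()
for trial in range(20):
    for t in range(10):
        syn.step(pre_spike=True, post_spike=(t % 2 == 0))
    syn.deliver_reward(stimulus="A", reward=1.0)
# Weight growth stalls once the critic predicts the reward, as the abstract describes
print(round(syn.w, 3), round(syn.expected_reward["A"], 3))
```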
Narrative event boundaries, reading times, and expectation.
Pettijohn, Kyle A; Radvansky, Gabriel A
2016-10-01
During text comprehension, readers create mental representations of the described events, called situation models. When new information is encountered, these models must be updated or new ones created. Consistent with the event indexing model, previous studies have shown that when readers encounter an event shift, reading times often increase. However, such increases are not consistently observed. This paper addresses this inconsistency by examining the extent to which reading-time differences observed at event shifts reflect an unexpectedness in the narrative rather than processes involved in model updating. In two reassessments of prior work, event shifts known to increase reading time were rated as less expected, and expectedness ratings significantly predicted reading time. In three new experiments, participants read stories in which an event shift was or was not foreshadowed, thereby influencing expectedness of the shift. Experiment 1 revealed that readers do not expect event shifts, but foreshadowing eliminates this. Experiment 2 showed that foreshadowing does not affect identification of event shifts. Finally, Experiment 3 found that, although reading times increased when an event shift was not foreshadowed, they were not different from controls when it was. Moreover, responses to memory probes were slower following an event shift regardless of foreshadowing, suggesting that situation model updating had taken place. Overall, the results support the idea that previously observed reading time increases at event shifts reflect, at least in part, a reader's unexpected encounter with a shift rather than an increase in processing effort required to update a situation model.
College Press and Student Fit.
ERIC Educational Resources Information Center
Neumann, William
Six generalizations are offered regarding the collective requirements and expectations that colleges and universities impose on, or expect of, their students. (1) Colleges and universities in varying degrees expect and require students to demonstrate "basic academic skills" in reading, writing, and mathematics. Students must also learn…
Designing a Multi-Petabyte Database for LSST
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becla, Jacek; Hanushevsky, Andrew; Nikolaev, Sergei
2007-01-10
The 3.2 giga-pixel LSST camera will produce approximately half a petabyte of archive images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then added to the cumulative catalog for further analysis. The catalog is expected to grow about three hundred terabytes per year. The data volume, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require innovative techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they perform at these data rates, data volumes, and access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, results to date from evaluating available database technologies against LSST requirements, and the proposed database architecture to meet the data challenges.
2010-01-01
Background Industry standards provide rigorous descriptions of required data presentation, with the aim of ensuring compatibility across different clinical studies. However despite their crucial importance, these standards are often not used as expected in the development of clinical research. The reasons for this lack of compliance could be related to the high cost and time-intensive nature of the process of data standards implementation. The objective of this study was to evaluate the value of the extra time and cost required for different levels of data standardisation and the likelihood of researchers to comply with these levels. Since we believe that the cost and time necessary for the implementation of data standards can change over time, System Dynamics (SD) analysis was used to investigate how these variables interact and influence the adoption of data standards by clinical researchers. Methods Three levels of data standards implementation were defined through focus group discussion involving four clinical research investigators. Ten Brazilian and eighteen American investigators responded to an online questionnaire which presented possible standards implementation scenarios, with respondents asked to choose one of two options available in each scenario. A random effects ordered probit model was used to estimate the effect of cost and time on investigators' willingness to adhere to data standards. The SD model was used to demonstrate the relationship between degrees of data standardisation and subsequent variation in cost and time required to start the associated study. Results A preference for low cost and rapid implementation times was observed, with investigators more likely to incur costs than to accept a time delay in project start-up. SD analysis indicated that although initially extra time and cost are necessary for clinical study standardisation, there is a decrease in both over time. Conclusions Future studies should explore ways of creating mechanisms which decrease the time and cost associated with standardisation processes. In addition, the fact that the costs and time necessary for data standards implementation decrease with time should be made known to the wider research community. Policy makers should attempt to match their data standardisation policies better with the expectations of researchers. PMID:21194455
Cofiel, Luciana; Zammar, Guilherme R; Zaveri, Amrapali J; Shah, Jatin Y; Carvalho, Elias; Nahm, Meredith; Kesselring, Gustavo; Pietrobon, Ricardo
2010-12-31
Industry standards provide rigorous descriptions of required data presentation, with the aim of ensuring compatibility across different clinical studies. However despite their crucial importance, these standards are often not used as expected in the development of clinical research. The reasons for this lack of compliance could be related to the high cost and time-intensive nature of the process of data standards implementation. The objective of this study was to evaluate the value of the extra time and cost required for different levels of data standardisation and the likelihood of researchers to comply with these levels. Since we believe that the cost and time necessary for the implementation of data standards can change over time, System Dynamics (SD) analysis was used to investigate how these variables interact and influence the adoption of data standards by clinical researchers. Three levels of data standards implementation were defined through focus group discussion involving four clinical research investigators. Ten Brazilian and eighteen American investigators responded to an online questionnaire which presented possible standards implementation scenarios, with respondents asked to choose one of two options available in each scenario. A random effects ordered probit model was used to estimate the effect of cost and time on investigators' willingness to adhere to data standards. The SD model was used to demonstrate the relationship between degrees of data standardisation and subsequent variation in cost and time required to start the associated study. A preference for low cost and rapid implementation times was observed, with investigators more likely to incur costs than to accept a time delay in project start-up. SD analysis indicated that although initially extra time and cost are necessary for clinical study standardisation, there is a decrease in both over time. Future studies should explore ways of creating mechanisms which decrease the time and cost associated with standardisation processes. In addition, the fact that the costs and time necessary for data standards implementation decrease with time should be made known to the wider research community. Policy makers should attempt to match their data standardisation policies better with the expectations of researchers.
2011-06-07
Lost Duty Time 6 standardized residuals of cells were examined to determine which cells had observed counts sizably different from expected counts...exploratory analysis) was used as the criterion to indicate that a cell had more (positive residual) or less (negative residual) observed events than...and supplies. These activities require lower body strength, stamina, and core strength that would be impaired by injuries to the lower extremities
Lessons from Coronagraphic Imaging with HST that may apply to JWST
NASA Astrophysics Data System (ADS)
Grady, C. A.; Hines, Dean C.; Schneider, Glenn; McElwain, Michael W.
2017-06-01
One of the major capabilities offered by JWST is coronagraphic imaging from space, covering the near through mid-IR and optimized for study of planet formation and the evolution of planetary systems. Planning for JWST has resulted in expectations for instrument performance, observation strategies and data reduction approaches. HST with 20 years of coronagraphic imaging offers some experience which may be useful to those planning for JWST. 1) Real astronomical sources do not necessarily conform to expectations. Debris disks may be accompanied by more distant material, and some systems may be conspicuous in scattered light when offering only modest IR excesses. Proto-planetary disks are not constantly illuminated, and thus a single epoch observation of the source may not be sufficient to reveal everything about it. 2) The early expectation with NICMOS was that shallow, 2-roll observations would reveal a wealth of debris disks imaged in scattered light, and that only a limited set of PSF observations would be required. Instead, building up a library of spatially resolved disks in scattered light has proven to require alternate observing strategies, is still on-going, and has taken far longer than expected. 3) A wealth of coronagraphic options with an instrument may not be scientifically informative, unless there is a similar time investment in acquisition of calibration data in support of the science observations. 4) Finally, no one anticipated what can be gleaned from coronagraphic imaging. We should expect similar, unexpected, and ultimately revolutionary discoveries with JWST.
2016-09-01
the UAV’s reliability in fulfilling the mission as well as the build-time of the UAV. 14. SUBJECT TERMS design, print and operate, DPO...previously. There are opportunities to work on the design of the UAV to reduce the cognitive workload of the service member and time required to “print” and...the need arises to tailor the UAV for the specific mission. The modification of an existing design is expected to take a much shorter time than the
Error analysis of real time and post processed orbit determination of GFO using GPS tracking
NASA Technical Reports Server (NTRS)
Schreiner, William S.
1991-01-01
The goal of the Navy's GEOSAT Follow-On (GFO) mission is to map the topography of the world's oceans in both real time (operational) and post processed modes. Currently, the best candidate for supplying the required orbit accuracy is the Global Positioning System (GPS). The purpose of this fellowship was to determine the expected orbit accuracy for GFO in both the real time and post-processed modes when using GPS tracking. This report presents the work completed through the ending date of the fellowship.
12 CFR 238.53 - Prescribed services and activities of savings and loan holding companies.
Code of Federal Regulations, 2014 CFR
2014-01-01
... reasons and the date by which the Board expects to act. (3)(i) Required time limit for System action. The... and loan holding companies. 238.53 Section 238.53 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM (CONTINUED) SAVINGS AND LOAN HOLDING COMPANIES...
12 CFR 238.53 - Prescribed services and activities of savings and loan holding companies.
Code of Federal Regulations, 2012 CFR
2012-01-01
... reasons and the date by which the Board expects to act. (3)(i) Required time limit for System action. The... and loan holding companies. 238.53 Section 238.53 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM (CONTINUED) SAVINGS AND LOAN HOLDING COMPANIES...
12 CFR 238.53 - Prescribed services and activities of savings and loan holding companies.
Code of Federal Regulations, 2013 CFR
2013-01-01
... reasons and the date by which the Board expects to act. (3)(i) Required time limit for System action. The... and loan holding companies. 238.53 Section 238.53 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM (CONTINUED) SAVINGS AND LOAN HOLDING COMPANIES...
30 CFR 254.26 - What information must I include in the “Worst case discharge scenario” appendix?
Code of Federal Regulations, 2012 CFR
2012-07-01
... ENVIRONMENTAL ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL-SPILL RESPONSE REQUIREMENTS FOR FACILITIES LOCATED SEAWARD OF THE COAST LINE Oil-Spill Response Plans for Outer Continental Shelf Facilities § 254.26... the facility that oil could move in a time period that it reasonably could be expected to persist in...
30 CFR 254.26 - What information must I include in the “Worst case discharge scenario” appendix?
Code of Federal Regulations, 2014 CFR
2014-07-01
... ENVIRONMENTAL ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL-SPILL RESPONSE REQUIREMENTS FOR FACILITIES LOCATED SEAWARD OF THE COAST LINE Oil-Spill Response Plans for Outer Continental Shelf Facilities § 254.26... the facility that oil could move in a time period that it reasonably could be expected to persist in...
30 CFR 254.26 - What information must I include in the “Worst case discharge scenario” appendix?
Code of Federal Regulations, 2013 CFR
2013-07-01
... ENVIRONMENTAL ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL-SPILL RESPONSE REQUIREMENTS FOR FACILITIES LOCATED SEAWARD OF THE COAST LINE Oil-Spill Response Plans for Outer Continental Shelf Facilities § 254.26... the facility that oil could move in a time period that it reasonably could be expected to persist in...
The Relevance of Business Law Education for Future Accountants: A New Zealand Perspective
ERIC Educational Resources Information Center
McCourt, Alison; Low, Mary; Tappin, Ella
2013-01-01
The importance of business law education is emphasised by the fact that there is a compulsory commercial law topic in the academic requirements for a chartered accountants' programme of study. However, researchers over time have pointed out that there was a gap between the legal awareness and understanding expected of graduate accountants and the…
Code of Federal Regulations, 2013 CFR
2013-07-01
..., complete sprinkler protection can be expected to prevent flashover in the room of fire origin, limit fire... the times required for egress. If a combination of fire protection systems provides a margin of safety... Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL...
Code of Federal Regulations, 2012 CFR
2012-01-01
..., complete sprinkler protection can be expected to prevent flashover in the room of fire origin, limit fire... the times required for egress. If a combination of fire protection systems provides a margin of safety... Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL...
Code of Federal Regulations, 2014 CFR
2014-01-01
..., complete sprinkler protection can be expected to prevent flashover in the room of fire origin, limit fire... the times required for egress. If a combination of fire protection systems provides a margin of safety... Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL...
Code of Federal Regulations, 2011 CFR
2011-01-01
..., complete sprinkler protection can be expected to prevent flashover in the room of fire origin, limit fire... the times required for egress. If a combination of fire protection systems provides a margin of safety... Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL...
Code of Federal Regulations, 2011 CFR
2011-04-01
... ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES MEDICAL DEVICE REPORTING... event or a period of time equivalent to the expected life of the device, whichever is greater. If the..., electronic, or oral communication, either received or generated by you, that alleges deficiencies related to...
Code of Federal Regulations, 2010 CFR
2010-04-01
... ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES MEDICAL DEVICE REPORTING... event or a period of time equivalent to the expected life of the device, whichever is greater. If the..., electronic, or oral communication, either received or generated by you, that alleges deficiencies related to...
Treatment options for alopecia.
Iorizzo, Matilde; Tosti, Antonella
2015-01-01
Hair disorders have a very high social and psychological impact. Treatment is often frustrating and time-consuming both for patients and clinicians and requires special skills and expertise. This paper aims to provide an overview of available treatments for the most common forms of alopecia in adults (androgenetic alopecia [AGA], alopecia areata and cicatricial alopecias) after reviewing the literature in PubMed, Google Scholar and ClinicalTrials.gov. Before starting treatment, it is very important to confirm the diagnosis and discuss the patient's expectations. Treatment of hair disorders requires time, and first results are usually visible a few months after the beginning of therapy. Treatment of most hair disorders is largely not evidence-based, as randomized controlled trials are available only for AGA.
Navarrete-Dechent, Cristián; Bajaj, Shirin; Marghoob, Ashfaq A; Marchetti, Michael A
2015-06-01
Dermatophytoses are common skin infections. Traditional diagnostic tests such as skin scrapings for light microscopy examination, fungal cultures and biopsies remain imperfect due to false-negative test results, cost, time required to perform the procedure, time delays in test results and/or a requirement for an invasive procedure. Herein, we present a case of an 80-year-old female whose tinea incognito was non-invasively diagnosed within seconds using handheld reflectance confocal microscopy (RCM). As non-invasive skin imaging continues to improve, we expect light-based office microscopy to be replaced with technologies such as RCM, which has multiple and continually expanding diagnostic applications. © 2015 Blackwell Verlag GmbH.
Assessing and Ensuring GOES-R Magnetometer Accuracy
NASA Technical Reports Server (NTRS)
Kronenwetter, Jeffrey; Carter, Delano R.; Todirita, Monica; Chu, Donald
2016-01-01
The GOES-R magnetometer accuracy requirement is 1.7 nanoteslas (nT). During quiet times (100 nT), accuracy is defined as absolute mean plus 3 sigma. During storms (300 nT), accuracy is defined as absolute mean plus 2 sigma. To achieve this, the sensor itself has better than 1 nT accuracy. Because zero offset and scale factor drift over time, it is also necessary to perform annual calibration maneuvers. To predict performance, we used covariance analysis and attempted to corroborate it with simulations. Although not perfect, the two generally agree and show the expected behaviors. With the annual calibration regimen, these predictions suggest that the magnetometers will meet their accuracy requirements.
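The quiet-time and storm-time accuracy statistics described above (absolute mean plus 3 or 2 sigma) are straightforward to evaluate once error samples are available from a Monte Carlo run. The Python sketch below only illustrates that bookkeeping with made-up error distributions; it is not the GOES-R covariance analysis or its actual error data.

```python
import numpy as np

def accuracy_metric(errors, k):
    """Accuracy as defined in the abstract: absolute mean plus k standard deviations."""
    return abs(np.mean(errors)) + k * np.std(errors)

rng = np.random.default_rng(0)
# Hypothetical residual field errors (nT) from a Monte Carlo run, not GOES-R data.
quiet_errors = rng.normal(loc=0.1, scale=0.4, size=10_000)   # quiet conditions (~100 nT field)
storm_errors = rng.normal(loc=0.2, scale=0.6, size=10_000)   # storm conditions (~300 nT field)

quiet_acc = accuracy_metric(quiet_errors, k=3)   # mean + 3 sigma during quiet times
storm_acc = accuracy_metric(storm_errors, k=2)   # mean + 2 sigma during storms

requirement_nT = 1.7
print(f"quiet accuracy = {quiet_acc:.2f} nT, storm accuracy = {storm_acc:.2f} nT")
print("meets 1.7 nT requirement:", quiet_acc <= requirement_nT and storm_acc <= requirement_nT)
```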
Radiology metrics for safe use and regulatory compliance with CT imaging
NASA Astrophysics Data System (ADS)
Paden, Robert; Pavlicek, William
2018-03-01
The MACRA Act creates a Merit-Based Payment System, with monitoring of patient exposure from CT providing one possible quality metric for meeting merit requirements. Quality metrics are also required by The Joint Commission, ACR, and CMS, as facilities are tasked to review CT irradiation events outside of expected ranges, review protocols for appropriateness, and validate parameters for low-dose lung cancer screening. To efficiently collect and analyze irradiation events and associated DICOM tags, all clinical CT devices were connected via DICOM to a parser that extracted dose-related information for storage in a database. Dose data from every exam are compared to the appropriate external standard for that exam type. AAPM-recommended CTDIvol values for adult and pediatric head and torso, coronary, and perfusion exams are used in this study. CT doses outside the expected range were automatically formatted into a report for analysis and review documentation. Free-text comments from the CT technologist, giving the reason for proceeding with an irradiation above the recommended threshold, are captured for inclusion in the follow-up reviews by physics staff. The use of a knowledge-based approach to labeling individual protocol and device settings is a practical solution that makes analysis and review efficient. Manual methods would require approximately 150 person-hours for our facility, exclusive of travel time and independent of device availability. Use of this informatics tool, including the low-dose CT comparison review and the low-dose lung cancer screening requirements set forth by CMS, yields an estimated 89% time savings.
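A minimal illustration of the automated check described above is sketched here: each exam's CTDIvol is compared against an expected-range threshold for its exam type, and out-of-range exams are flagged for review. The thresholds, field names, and records are hypothetical placeholders, not the AAPM reference values or the facility's DICOM parser output.

```python
# Hypothetical exam records extracted from DICOM dose objects; thresholds are
# illustrative placeholders, not the AAPM reference values.
CTDI_THRESHOLDS = {           # mGy, per exam type
    "adult_head": 75.0,
    "adult_torso": 25.0,
    "pediatric_head": 40.0,
}

exams = [
    {"accession": "A001", "exam_type": "adult_head", "ctdi_vol": 62.0},
    {"accession": "A002", "exam_type": "adult_torso", "ctdi_vol": 31.5},
    {"accession": "A003", "exam_type": "pediatric_head", "ctdi_vol": 38.2},
]

def flag_outliers(exams, thresholds):
    """Return exams whose CTDIvol exceeds the expected range for their exam type."""
    return [e for e in exams
            if e["ctdi_vol"] > thresholds.get(e["exam_type"], float("inf"))]

for exam in flag_outliers(exams, CTDI_THRESHOLDS):
    print(f"review: {exam['accession']} ({exam['exam_type']}) "
          f"CTDIvol {exam['ctdi_vol']} mGy exceeds threshold")
```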
A Comparative Typology of Student and Institutional Expectations of Online Faculty
ERIC Educational Resources Information Center
Shaw, Melanie E.; Clowes, Meena C.; Burrus, Scott W. M.
2017-01-01
Online faculty must uphold institutional expectations for their performance. Typically, online institutions have specific guidelines for faculty-to-student interactions; yet, student expectations of faculty may not necessarily align with institutional requirements. This study included a typological analysis of institutional requirements for online…
NASA Technical Reports Server (NTRS)
West, M. E.
1992-01-01
A real-time estimation filter that reduces sensitivity to system variations and reduces the amount of preflight computation is developed for the instrument pointing subsystem (IPS). The IPS is a three-axis stabilized platform developed to point various astronomical observation instruments aboard the shuttle. Currently, the IPS utilizes a linearized Kalman filter (LKF), with premission-defined gains, to compensate for system drifts and accumulated attitude errors. Since the a priori gains are generated for an expected system, variations result in a suboptimal estimation process. This report compares the performance of three real-time estimation filters with the current LKF implementation. An extended Kalman filter and a second-order Kalman filter are developed to account for the system nonlinearities, while a linear Kalman filter implementation assumes that the nonlinearities are negligible. The performance of each of the four estimation filters is compared with respect to accuracy, stability, settling time, robustness, and computational requirements. It is shown that, for the current IPS pointing requirements, the linear Kalman filter provides improved robustness over the LKF with lower computational requirements than the two real-time nonlinear estimation filters.
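For readers unfamiliar with the filters being compared, the sketch below shows a generic discrete-time linear Kalman filter predict/update cycle in which the gain is computed online rather than fixed premission. It is a textbook illustration with made-up matrices for a toy attitude-error/drift model, not the IPS flight implementation.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a discrete-time linear Kalman filter."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain, computed online (not premission)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative 1-D attitude-error/drift model: state = [attitude error, gyro drift].
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])                   # only the attitude error is measured
Q = np.diag([1e-6, 1e-8])
R = np.array([[1e-4]])

x, P = np.zeros(2), np.eye(2) * 1e-2
for z in [0.01, 0.012, 0.009, 0.011]:        # hypothetical measurements
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
print("estimated attitude error and drift:", x)
```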
Second-order contrast based on the expectation of effort and reinforcement.
Clement, Tricia S; Zentall, Thomas R
2002-01-01
Pigeons prefer signals for reinforcement that require greater effort (or time) to obtain over those that require less effort to obtain (T. S. Clement, J. Feltus, D. H. Kaiser, & T. R. Zentall, 2000). Preference was attributed to contrast (or to the relatively greater improvement in conditions) produced by the appearance of the signal when it was preceded by greater effort. In Experiment 1, the authors of the present study demonstrated that the expectation of greater effort was sufficient to produce such a preference (a second-order contrast effect). In Experiments 2 and 3, low versus high probability of reinforcement was substituted for high versus low effort, respectively, with similar results. In Experiment 3, the authors found that the stimulus preference could be attributed to positive contrast (when the discriminative stimuli represented an improvement in the probability of reinforcement) and perhaps also negative contrast (when the discriminative stimuli represented reduction in the probability of reinforcement).
Scientific Rationale and Requirements for a Global Seismic Network on Mars
NASA Technical Reports Server (NTRS)
Solomon, Sean C.; Anderson, Don L.; Banerdt, W. Bruce; Butler, Rhett G.; Davis, Paul M.; Duennebier, Frederick K.; Nakamura, Yosio; Okal, Emile A.; Phillips, Roger J.
1991-01-01
Following a brief overview of the mission concepts for a Mars Global Network Mission as of the time of the workshop, we present the principal scientific objectives to be achieved by a Mars seismic network. We review the lessons for extraterrestrial seismology gained from experience to date on the Moon and on Mars. An important unknown on Mars is the expected rate of seismicity, but theoretical expectations and extrapolation from lunar experience both support the view that seismicity rates, wave propagation characteristics, and signal-to-noise ratios are favorable to the collection of a scientifically rich dataset during the multiyear operation of a global seismic experiment. We discuss how particular types of seismic waves will provide the most useful information to address each of the scientific objectives, and this discussion provides the basis for a strategy for station siting. Finally, we define the necessary technical requirements for the seismic stations.
Jung, Jinwook; Lee, Habeom; Ha, Inho; Cho, Hyunmin; Kim, Kyun Kyu; Kwon, Jinhyeong; Won, Phillip; Hong, Sukjoon; Ko, Seung Hwan
2017-12-27
Future electronics are expected to develop into wearable forms, and adequate stretchability is required for forthcoming wearable electronics, considering the various motions of the human body. Along with stretchability, transparency can increase both the functionality and the esthetic appeal of future wearable electronics. In this study, we demonstrate, for the first time, a highly stretchable and transparent electromagnetic interference shielding layer for wearable electronic applications based on a silver nanowire percolation network on an elastic poly(dimethylsiloxane) substrate. The proposed stretchable and transparent electromagnetic interference shielding layer shows high electromagnetic wave shielding effectiveness even under high tensile strain. The silver nanowire percolation network-based electromagnetic interference shielding layer is expected to go beyond conventional electromagnetic interference shielding materials and to broaden its application range to various fields that require optical transparency or a nonplanar surface environment, such as biological systems, human skin, and wearable electronics.
Eco-efficiency evaluation of a smart window prototype.
Syrrakou, E; Papaefthimiou, S; Yianoulis, P
2006-04-15
An eco-efficiency analysis was conducted using indicators suitably defined to evaluate the performance of an electrochromic window acting as an energy-saving component in buildings. By combining the indicators for various parameters (control scenario, expected lifetime, climatic type, purchase cost), significant conclusions are drawn for the development and the potential applications of the device compared to other commercial fenestration products. Reducing the purchase cost (to 200 euros/m2) and increasing the lifetime (above 15 years) are the two main targets for achieving both cost and environmental efficiency. An electrochromic device, implemented in cooling-dominated areas and operated with an optimum control strategy for the maximum expected lifetime (25 years), can reduce the building energy requirements by 52%. Furthermore, the total energy savings provided will be 33 times more than the energy required for its production, while the emission of 615 kg CO2 equivalent per electrochromic glazing unit can be avoided.
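As a rough arithmetic check of the reported energy payback ratio, the snippet below divides lifetime savings by production (embodied) energy. The annual-savings and embodied-energy figures are placeholders chosen only so the ratio lands near the quoted 33 times over a 25-year lifetime; they are not values from the paper.

```python
# Illustrative check of the energy payback ratio; the embodied energy and annual
# savings below are placeholders, not values reported in the paper.
lifetime_years = 25
annual_savings_kwh = 130.0       # hypothetical net energy saved per glazing unit per year
embodied_energy_kwh = 100.0      # hypothetical energy required to produce one unit

payback_ratio = lifetime_years * annual_savings_kwh / embodied_energy_kwh
payback_time_years = embodied_energy_kwh / annual_savings_kwh

print(f"lifetime savings / production energy = {payback_ratio:.1f}x")
print(f"energy payback time = {payback_time_years:.2f} years")
```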
van Ede, Freek; Niklaus, Marcel; Nobre, Anna C
2017-01-11
Although working memory is generally considered a highly dynamic mnemonic store, popular laboratory tasks used to understand its psychological and neural mechanisms (such as change detection and continuous reproduction) often remain relatively "static," involving the retention of a set number of items throughout a shared delay interval. In the current study, we investigated visual working memory in a more dynamic setting, and assessed the following: (1) whether internally guided temporal expectations can dynamically and reversibly prioritize individual mnemonic items at specific times at which they are deemed most relevant; and (2) the neural substrates that support such dynamic prioritization. Participants encoded two differently colored oriented bars into visual working memory to retrieve the orientation of one bar with a precision judgment when subsequently probed. To test for the flexible temporal control to access and retrieve remembered items, we manipulated the probability for each of the two bars to be probed over time, and recorded EEG in healthy human volunteers. Temporal expectations had a profound influence on working memory performance, leading to faster access times as well as more accurate orientation reproductions for items that were probed at expected times. Furthermore, this dynamic prioritization was associated with the temporally specific attenuation of contralateral α (8-14 Hz) oscillations that, moreover, predicted working memory access times on a trial-by-trial basis. We conclude that attentional prioritization in working memory can be dynamically steered by internally guided temporal expectations, and is supported by the attenuation of α oscillations in task-relevant sensory brain areas. In dynamic, everyday-like, environments, flexible goal-directed behavior requires that mental representations that are kept in an active (working memory) store are dynamic, too. We investigated working memory in a more dynamic setting than is conventional, and demonstrate that expectations about when mnemonic items are most relevant can dynamically and reversibly prioritize these items in time. Moreover, we uncover a neural substrate of such dynamic prioritization in contralateral visual brain areas and show that this substrate predicts working memory retrieval times on a trial-by-trial basis. This places the experimental study of working memory, and its neuronal underpinnings, in a more dynamic and ecologically valid context, and provides new insights into the neural implementation of attentional prioritization within working memory. Copyright © 2017 van Ede et al.
Hua, Yongzhao; Dong, Xiwang; Li, Qingdong; Ren, Zhang
2017-05-18
This paper investigates the time-varying formation robust tracking problem for high-order linear multiagent systems with a leader whose control input is unknown, in the presence of heterogeneous parameter uncertainties and external disturbances. The followers need to accomplish an expected time-varying formation in the state space and simultaneously track the state trajectory produced by the leader. First, a time-varying formation robust tracking protocol with a totally distributed form is proposed utilizing the neighborhood state information. Owing to the adaptive updating mechanism, the proposed protocol requires neither global knowledge of the communication topology nor the upper bounds of the parameter uncertainties, external disturbances, and the leader's unknown input. Then, to determine the control parameters, a four-step algorithm is presented, in which feasible conditions for the followers to accomplish the expected time-varying formation tracking are provided. Furthermore, based on a Lyapunov-like analysis, it is proved that the formation tracking error converges to zero asymptotically. Finally, the effectiveness of the theoretical results is verified by simulation examples.
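The paper's adaptive, fully distributed robust protocol is not reproduced here, but the following sketch illustrates the general shape of time-varying formation tracking for simple single-integrator followers: each agent feeds back neighborhood errors in formation-compensated coordinates and feeds forward the formation's own motion. Unlike the paper, the leader's velocity is assumed known and fed forward, and the graph, gains, and formation function are arbitrary choices made for illustration.

```python
import numpy as np

# Minimal, non-adaptive sketch of time-varying formation tracking with
# single-integrator agents following a leader; a textbook-style illustration,
# not the adaptive robust protocol of the paper.
n = 4
A = np.ones((n, n)) - np.eye(n)          # all-to-all follower graph (illustrative)
b = np.ones(n)                           # every follower observes the leader
k, dt, steps = 2.0, 0.01, 2000

def h(t):                                # desired time-varying offsets: a rotating square
    ang = 0.2 * t + np.arange(n) * np.pi / 2
    return np.stack([np.cos(ang), np.sin(ang)], axis=1)

def leader(t):                           # leader state trajectory
    return np.array([0.5 * t, 0.0])

x = np.random.default_rng(0).normal(size=(n, 2))   # follower states
for step in range(steps):
    t = step * dt
    xi = x - h(t)                        # formation-compensated states
    u = np.zeros_like(x)
    for i in range(n):
        # consensus feedback on formation errors plus leader-tracking feedback
        u[i] = -k * (A[i] @ (xi[i] - xi) + b[i] * (xi[i] - leader(t)))
    # feedforward of the formation motion and (unlike the paper) the leader velocity
    u += (h(t + dt) - h(t)) / dt + (leader(t + dt) - leader(t)) / dt
    x = x + dt * u

t_final = steps * dt
err = np.linalg.norm(x - h(t_final) - leader(t_final), axis=1)
print("final tracking errors:", np.round(err, 4))
```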
Daviaud, Emmanuelle; Chopra, Mickey
2008-01-01
To quantify staff requirements in primary health care facilities in South Africa through an adaptation of the WHO workload indicator of staff needs tool. We use a model to estimate staffing requirements at primary health care facilities. The model integrates several empirically-based assumptions including time and type of health worker required for each type of consultation, amount of management time required, amount of clinical support required and minimum staff requirements per type of facility. We also calculate the number of HIV-related consultations per district. The model incorporates type of facility, monthly travelling time for mobile clinics, opening hours per week, yearly activity and current staffing and calculates the expected staffing per category of staff per facility and compares it to the actual staffing. Across all the districts there is either an absence of doctors visiting clinics or too few doctors to cover the opening times of community health centres. Overall the number of doctors is only 7% of the required amount. There is 94% of the required number of professional nurses but with wide variations between districts, with a few districts having excesses while most have shortages. The number of enrolled nurses is 60% of what it should be. There are 17% too few enrolled nurse assistants. Across all districts there is wide variation in staffing levels between facilities leading to inefficient use of professional staff. The application of an adapted WHO workload tool identified important human resource planning issues.
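A workload-indicator calculation of this kind reduces to simple arithmetic: annual clinical minutes (consultations times minutes per consultation), inflated by management and clinical-support allowances, divided by the minutes one staff member has available per year. The sketch below uses entirely hypothetical figures, not the South African audit data or the exact WISN adaptation used in the study.

```python
# Hypothetical WISN-style calculation for one clinic; all figures are illustrative.
# working days * hours/day * minutes/hour, with 15% lost to leave and breaks
available_minutes_per_nurse = 220 * 7 * 60 * 0.85

annual_consultations = {      # consultations per year, minutes required per consultation
    "acute_adult": (14000, 12),
    "child_health": (6000, 15),
    "hiv_related": (3500, 20),
}

clinical_minutes = sum(n * m for n, m in annual_consultations.values())
management_allowance = 0.15        # extra fraction of clinical time for management tasks
clinical_support_allowance = 0.10  # extra fraction for clinical support

total_minutes = clinical_minutes * (1 + management_allowance + clinical_support_allowance)
required_nurses = total_minutes / available_minutes_per_nurse

actual_nurses = 4
print(f"required professional nurses: {required_nurses:.1f}, actual: {actual_nurses}")
print(f"shortfall: {max(required_nurses - actual_nurses, 0):.1f}")
```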
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Margaret; Fujita, K. Sydny
Regulatory impact assessment is formally required by the U.S. and many other nations in order to help governments weigh the costs and benefits of proposed regulations, particularly as they compare to those of alternative actions and other government priorities. One of the “best practices” of regulatory impact assessments, as established by the OECD, is to use estimates of costs that are grounded in economic theory. Economic theory indicates that changes in compliance costs should be expected over time as a result of factors related to technological innovation. But many U.S. regulatory impact assessments have traditionally employed a practice that is in conflict with this expectation: they take current estimates of the costs of complying with a proposed regulation and project that those costs will remain unchanged over the full time period that the regulation would be in effect.
NASA Technical Reports Server (NTRS)
2009-01-01
Since leaving 'Victoria Crater,' Opportunity has picked up the pace of driving. In the 90 sols (Martian days) since exiting the crater, Opportunity has driven more than 1,800 meters (1.1 miles), three times the distance that was required for the original prime mission. Scientists expect to encounter younger rocks the farther south the rover travels. They also expect to find small rocks ejected onto the landscape during formation of nearby craters. To reach these things, the rover must avoid sand traps as much as possible. Opportunity acquired this mosaic with the navigation camera on the rover's 1,683rd Martian day, or sol (Oct. 18, 2008), of exploration.
Sub-daily Statistical Downscaling of Meteorological Variables Using Neural Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Jitendra; Brooks, Bjørn-Gustaf J.; Thornton, Peter E
2012-01-01
A new open source neural network temporal downscaling model is described and tested using CRU-NCEP reanalysis and CCSM3 climate model output. We downscaled multiple meteorological variables in tandem from monthly to sub-daily time steps while also retaining consistent correlations between variables. We found that our feed-forward, error-backpropagation approach produced synthetic 6-hourly meteorology with biases no greater than 0.6% across all variables and variance that was accurate within 1% for all variables except atmospheric pressure, wind speed, and precipitation. Correlations between downscaled output and the expected (original) monthly means exceeded 0.99 for all variables, which indicates that this approach would work well for generating atmospheric forcing data consistent with mass- and energy-conserved GCM output. Our neural network approach performed well for variables that had correlations to other variables of about 0.3 or better, and its skill was increased by downscaling multiple correlated variables together. Poor replication of precipitation intensity, however, required further post-processing in order to obtain the expected probability distribution. The concurrence of precipitation events with expected changes in subordinate variables (e.g., less incident shortwave radiation during precipitation events) was nearly as consistent in the downscaled data as in the training data, with probabilities that differed by no more than 6%. Our downscaling approach requires training data at the target time step and relies on a weak assumption that climate variability in the extrapolated data is similar to variability in the training data.
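The overall workflow is conceptually simple even if the authors' model is more elaborate: train a feed-forward network to map monthly means (plus a time-of-day encoding) to sub-daily values. The sketch below uses synthetic temperature data and scikit-learn's MLPRegressor as a stand-in for the authors' open-source model; it is illustrative only and involves none of their code or training data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic illustration of monthly-to-6-hourly temporal downscaling with a
# feed-forward network; data and architecture are made up for this example.
rng = np.random.default_rng(1)
n_months = 240
hours = np.array([0, 6, 12, 18])

# "Truth": 6-hourly temperature with a diurnal cycle around each monthly mean.
monthly_mean = 15 + 10 * np.sin(2 * np.pi * np.arange(n_months) / 12) + rng.normal(0, 1, n_months)
X, y = [], []
for mean_t in monthly_mean:
    for h in hours:
        X.append([mean_t, np.sin(2 * np.pi * h / 24), np.cos(2 * np.pi * h / 24)])
        y.append(mean_t + 5 * np.sin(2 * np.pi * (h - 9) / 24) + rng.normal(0, 0.5))
X, y = np.array(X), np.array(y)

split = 3 * len(X) // 4                     # train on the first 75%, test on the rest
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])

print("test RMSE:", np.sqrt(np.mean((pred - y[split:]) ** 2)))
```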
Introduction to the Security Engineering Risk Analysis (SERA) Framework
2014-11-01
military aircraft has increased from 8% to 80%. At the same time, the size of software in military aircraft has grown from 1,000 lines of code in the F-4A to 1.7 million lines of code in the F-22. This growth trend is expected to continue over time [NASA 2009]. As software exerts more control of... their root causes can be traced to the software's requirements, architecture, design, or code. Studies have shown that the cost of addressing a software
Reliability analysis based on the losses from failures.
Todinov, M T
2006-04-01
The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis, linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected losses given failure are a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the early-life failures region and the expected losses given failure characterizing the corresponding time intervals. For complex systems whose components are not logically arranged in series, discrete simulation algorithms and software have been created for determining the losses from failures in terms of expected lost production time, cost of intervention, and cost of replacement. Different system topologies are assessed to determine the effect of modifications of the system topology on the expected losses from failures. It is argued that the reliability allocation in a production system should be done to maximize the profit/value associated with the system. Consequently, a method for setting reliability requirements and reliability allocation maximizing the profit by minimizing the total cost has been developed. Reliability allocation that maximizes the profit in case of a system consisting of blocks arranged in series is achieved by determining for each block individually the reliabilities of the components in the block that minimize the sum of the capital costs, operation costs, and the expected losses from failures. A Monte Carlo simulation-based net present value (NPV) cash-flow model has also been proposed, which has significant advantages over cash-flow models based on the expected value of the losses from failures per time interval. Unlike these models, the proposed model has the capability to reveal the variation of the NPV due to different numbers of failures occurring during a specified time interval (e.g., during one year). The model also permits tracking the impact of the distribution pattern of failure occurrences and the time dependence of the losses from failures.
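Two of the expressions described above reduce to probability-weighted sums, as in the illustration below: the expected loss given failure is the mode-probability-weighted combination of mode-specific losses, and the expected early-life losses sum E[number of failures] × E[loss given failure] over the covering sub-intervals. All numbers are invented for illustration and are not taken from the paper.

```python
# Illustrative numbers only; they are not taken from the paper.
# Expected loss given failure: weighted by the conditional probabilities with
# which the mutually exclusive failure modes initiate failure.
failure_modes = [
    # (probability mode initiates failure, expected loss given that mode, $)
    (0.60, 12_000),
    (0.30, 45_000),
    (0.10, 150_000),
]
expected_loss_given_failure = sum(p * loss for p, loss in failure_modes)

# Expected losses from early-life failures over a specified interval:
# sum over sub-intervals of E[number of failures] * E[loss given failure].
early_life_intervals = [
    # (expected number of failures in sub-interval, expected loss given failure, $)
    (0.08, 30_000),
    (0.03, 30_000),
    (0.01, 30_000),
]
expected_early_life_losses = sum(n * loss for n, loss in early_life_intervals)

print(f"E[loss | failure] = ${expected_loss_given_failure:,.0f}")
print(f"E[early-life losses] = ${expected_early_life_losses:,.0f}")
```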
Assessing the impact of heart failure specialist services on patient populations.
Lyratzopoulos, Georgios; Cook, Gary A; McElduff, Patrick; Havely, Daniel; Edwards, Richard; Heller, Richard F
2004-05-24
The assessment of the impact of healthcare interventions may help commissioners of healthcare services to make optimal decisions. This can be particularly the case if the impact assessment relates to specific patient populations and uses timely local data. We examined the potential impact on readmissions and mortality of specialist heart failure services capable of delivering treatments such as b-blockers and Nurse-Led Educational Intervention (N-LEI). Statistical modelling of prevented or postponed events among previously hospitalised patients was performed, using estimates of: treatment uptake and contraindications (based on local audit data); treatment effectiveness and intolerance (based on literature); and annual number of hospitalisations per patient and annual risk of death (based on routine data). Optimal treatment uptake among eligible but untreated patients would over one year prevent or postpone 11% of all expected readmissions and 18% of all expected deaths for spironolactone, 13% of all expected readmissions and 22% of all expected deaths for b-blockers (carvedilol), and 20% of all expected readmissions and an uncertain number of deaths for N-LEI. Optimal combined treatment uptake for all three interventions during one year among all eligible but untreated patients would prevent or postpone 37% of all expected readmissions and a minimum of 36% of all expected deaths. In a population of previously hospitalised patients with low previous uptake of b-blockers and no uptake of N-LEI, optimal combined uptake of interventions through specialist heart failure services can potentially help prevent or postpone approximately four times as many readmissions and a minimum of twice as many deaths compared with simply optimising uptake of spironolactone (not necessarily requiring specialist services). Examination of the impact of different heart failure interventions can inform rational planning of relevant healthcare services.
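The impact model described above amounts to multiplying the eligible untreated population by achievable uptake, tolerance, baseline event rates, and the relative risk reduction reported in the literature. The sketch below shows that arithmetic with entirely hypothetical inputs, not the audit or trial figures used in the study.

```python
# Hypothetical inputs, not the audit or literature values used in the study.
eligible_untreated = 400              # previously hospitalised patients eligible for b-blockers
achievable_uptake = 0.70              # proportion expected to start treatment
intolerance = 0.10                    # proportion who cannot tolerate treatment
baseline_readmissions_per_pt = 0.5    # expected readmissions per patient per year
relative_risk_reduction = 0.30        # illustrative effectiveness from trial literature

treated = eligible_untreated * achievable_uptake * (1 - intolerance)
prevented_readmissions = treated * baseline_readmissions_per_pt * relative_risk_reduction

total_expected_readmissions = eligible_untreated * baseline_readmissions_per_pt
print(f"prevented or postponed readmissions: {prevented_readmissions:.0f} "
      f"({prevented_readmissions / total_expected_readmissions:.0%} of expected)")
```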
Stolyarova, Alexandra; Izquierdo, Alicia
2017-01-01
We make choices based on the values of expected outcomes, informed by previous experience in similar settings. When the outcomes of our decisions consistently violate expectations, new learning is needed to maximize rewards. Yet not every surprising event indicates a meaningful change in the environment. Even when conditions are stable overall, outcomes of a single experience can still be unpredictable due to small fluctuations (i.e., expected uncertainty) in reward or costs. In the present work, we investigate causal contributions of the basolateral amygdala (BLA) and orbitofrontal cortex (OFC) in rats to learning under expected outcome uncertainty in a novel delay-based task that incorporates both predictable fluctuations and directional shifts in outcome values. We demonstrate that OFC is required to accurately represent the distribution of wait times to stabilize choice preferences despite trial-by-trial fluctuations in outcomes, whereas BLA is necessary for the facilitation of learning in response to surprising events. DOI: http://dx.doi.org/10.7554/eLife.27483.001 PMID:28682238
Out-of-Time Beam Extinction in the MU2E Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prebys, E. J.; Werkema, S.
The Mu2e Experiment at Fermilab will search for the conversion of a muon to an electron in the field of an atomic nucleus with unprecedented sensitivity. The experiment requires a beam consisting of proton bunches 250 ns FW long, separated by 1.7 μs, with no out-of-time protons at the 10^-10 fractional level. Satisfying this "extinction" requirement is very challenging. The formation of the bunches is expected to result in an extinction on the order of 10^5. The remaining extinction will be accomplished by a system of resonant magnets and collimators, configured such that only in-time beam is delivered to the experiment. Our simulations show that the total extinction achievable by the system is on the order of 10^12, with an efficiency for transmitting in-time beam of 99.6%.
Moosavi, Ahmad; Mohseni, Mohammad; Ziaiifar, Hajar; Azami-Aghdash, Saber; Gharasi Manshadi, Mahdi; Rezapour, Aziz
2017-04-01
Students' views are an important factor in assessing the quality of universities. The SERVQUAL model is regarded as the most prominent framework for measuring service quality. This study aimed to systematically review studies that investigated the quality of educational services. A systematic review and meta-analysis of studies evaluating students' viewpoints on the quality of educational services was conducted. Required data were collected from PubMed, Embase, Scopus, Science Direct, Google Scholar, SID, Magiran, and Iranmedex, without time restriction. The software CMA, ver. 2 was used to estimate the total mean scores of students' perception and expectation of service quality and the gap between them. Eighteen eligible studies were included. The studies were conducted between 2004 and 2014. Based on the random-effects model, the total mean scores of students' perception, students' expectation, and the gap between them were estimated at 2.92 (95% CI, 2.75-3.09), 4.18 (95% CI, 3.98-4.38), and -1.30 (95% CI, -1.56 to -1.04), respectively. The students' expectation level is higher than the current quality of educational services. There is a tangible gap between expectations and current quality, which calls for efforts to improve quality in all dimensions; effective steps can be taken towards improving the quality of educational services through appropriate planning and training to empower employees in colleges and universities.
What Role Do We Expect Secondary Master Reading Teachers to Play?
ERIC Educational Resources Information Center
Savitz, Rachelle S.; Rasinski, Timothy
2018-01-01
In this article, we explore and identify the varied roles that have been assigned over time to the master reading teacher at the secondary level. Despite the fact that there are fewer master reading teachers (MRTs) at the secondary level, they are often required to take on even more responsibilities than MRTs at the elementary level. Secondary MRT…
Intertemporal consumption with directly measured welfare functions and subjective expectations
Kapteyn, Arie; Kleinjans, Kristin J.; van Soest, Arthur
2010-01-01
Euler equation estimation of intertemporal consumption models requires many, often unverifiable assumptions. These include assumptions on expectations and preferences. We aim at reducing some of these requirements by using direct subjective information on respondents’ preferences and expectations. The results suggest that individually measured welfare functions and expectations have predictive power for the variation in consumption across households. Furthermore, estimates of the intertemporal elasticity of substitution based on the estimated welfare functions are plausible and of a similar order of magnitude as other estimates found in the literature. The model favored by the data only requires cross-section data for estimation. PMID:20442798
EURADOS intercomparison on emergency radiobioassay.
Li, Chunsheng; Battisti, Paolo; Berard, Philippe; Cazoulat, Alain; Cuellar, Antonio; Cruz-Suarez, Rodolfo; Dai, Xiongxin; Giardina, Isabella; Hammond, Derek; Hernandez, Carolina; Kiser, Stephen; Ko, Raymond; Kramer-Tremblay, Sheila; Lecompte, Yannick; Navarro, Eva; Navas, Cristina; Sadi, Baki; Sierra, Inmaculada; Verrezen, Freddy; Lopez, Maria A
2015-12-01
Nine laboratories participated in an intercomparison exercise organised by the European Radiation Dosimetry Group (EURADOS) for emergency radiobioassay involving four high-risk radionuclides ((239)Pu, (241)Am, (90)Sr and (226)Ra). Diverse methods of analysis were used by the participating laboratories for the in vitro determination of each of the four radionuclides in urine samples. Almost all the methods used are sensitive enough to meet the requirements for emergency radiobioassay derived for this project in reference to the Clinical Decision Guide introduced by the NCRP. Results from most of the methods meet the requirements of ISO 28218 on accuracy in terms of relative bias and relative precision. However, some technical gaps have been identified. For example, some laboratories do not have the ability to assay samples containing (226)Ra, and sample turnaround time would need to be much shorter than that reported by many laboratories, as timely results for internal contamination and early decisions on medical intervention are highly desired. Participating laboratories are expected to learn from each other's methods to improve interoperability among these laboratories. © The Author 2014. Published by Oxford University Press. All rights reserved.
Specificity and transfer effects in time production skill: examining the role of attention.
Wohldmann, Erica L; Healy, Alice F; Bourne, Lyle E
2012-05-01
Two experiments examined transfer of a prospective, time production skill under conditions involving changes in concurrent task requirements. Positive transfer of the time production skill might be expected only when the attentional demands of the concurrent task were held constant from training to test. However, some positive transfer was found even when the concurrent task at retraining was made either easier or more difficult than the concurrent task learned during training. The amount and direction of transfer depended more on the pacing of the stimuli in the secondary task than on the difficulty of the secondary task, even though difficulty affects attentional demands more. These findings are consistent with the procedural reinstatement principle of skill learning, by which transfer from one task to another depends on an overlap in procedures required by the two skills.
Patient Experiences of Recovery After Autologous Chondrocyte Implantation: A Qualitative Study
Toonstra, Jenny L.; Howell, Dana; English, Robert A.; Lattermann, Christian; Mattacola, Carl G.
2016-01-01
Context: The recovery process after autologous chondrocyte implantation (ACI) can be challenging for patients and clinicians alike due to significant functional limitations and a lengthy healing time. Understanding patients' experiences during the recovery process may assist clinicians in providing more individualized care. Objective: To explore and describe patients' experiences during the recovery process after ACI. Design: Qualitative study. Setting: Orthopaedic clinic. Patients or Other Participants: Participants from a single orthopaedic practice who had undergone ACI within the previous 12 months were purposefully selected. Data Collection and Analysis: Volunteers participated in 1-on-1 semistructured interviews to describe their recovery experiences after ACI. Data were analyzed using the process of horizontalization. Results: Seven patients (2 men, 5 women; age = 40.7 ± 7.5 years, time from surgery = 8.7 ± 4.2 months) participated. Four themes and 6 subthemes emerged from the data and suggested that the recovery process is a lengthy and emotional experience. Therapy provides optimism for the future but requires a collaborative effort among the patient, surgeon, rehabilitation provider, and patient's caregiver(s). Furthermore, patients expressed frustration that their expectations for recovery did not match the reality of the process, including greater dependence on caregivers than expected. Conclusions: Patients' expectations should be elicited before surgery and managed throughout the recovery process. Providing preoperative patient and caregiver education and encouraging preoperative rehabilitation can assist in managing expectations. Establishing realistic goals and expectations may improve rehabilitation adherence, encourage optimism for recovery, and improve outcomes in the long term. PMID:27835044
The timing of adoption of positron emission tomography: a real options approach.
Pertile, Paolo; Torri, Emanuele; Flor, Luciano; Tardivo, Stefano
2009-09-01
This paper presents the economic evaluation from a hospital's perspective of the investment in positron emission tomography, adopting a real options approach. The installation of this equipment requires a major capital outlay, while uncertainty on several key variables is substantial. The value of several timing strategies, including sequential investment, is determined taking into account that future decisions will be based on the information available at that time. The results show that adopting this approach may have an impact on the timing of investment, because postponing the investment may be optimal even when the Expected Net Present Value of the project is positive.
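The core real-options point, that waiting can dominate immediate investment even when the Expected Net Present Value is positive, can be shown with a two-period toy calculation. The figures below are invented for illustration and bear no relation to the paper's PET data.

```python
# Two-period toy example (numbers are illustrative, not from the paper).
cost = 1.00                              # capital outlay, in millions
payoff_high, payoff_low = 1.40, 0.70     # present value of revenues in each future state
p_high = 0.5                             # probability of the favourable state
discount = 1.05                          # one-period discount factor

# Invest now: commit before the uncertainty is resolved.
enpv_now = p_high * payoff_high + (1 - p_high) * payoff_low - cost

# Wait one period: invest only if the favourable state is revealed.
option_value_wait = (p_high * max(payoff_high - cost, 0)
                     + (1 - p_high) * max(payoff_low - cost, 0)) / discount

print(f"ENPV of investing now:   {enpv_now:.3f} M")
print(f"Value of waiting a year: {option_value_wait:.3f} M")
print("postponing is optimal despite positive ENPV:", option_value_wait > enpv_now > 0)
```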
Assessing and Ensuring GOES-R Magnetometer Accuracy
NASA Technical Reports Server (NTRS)
Carter, Delano R.; Todirita, Monica; Kronenwetter, Jeffrey; Chu, Donald
2016-01-01
The GOES-R magnetometer subsystem accuracy requirement is 1.7 nanoteslas (nT). During quiet times (100 nT), accuracy is defined as absolute mean plus 3 sigma. During storms (300 nT), accuracy is defined as absolute mean plus 2 sigma. Error comes both from outside the magnetometers (e.g., spacecraft fields and misalignments) and from inside (e.g., zero offset and scale factor errors). Because zero offset and scale factor drift over time, it will be necessary to perform annual calibration maneuvers. To predict performance before launch, we have used Monte Carlo simulations and covariance analysis. Both behave as expected, and their accuracy predictions agree within 30%. With the proposed calibration regimen, both suggest that the GOES-R magnetometer subsystem will meet its accuracy requirements.
Observational Model for Precision Astrometry with the Space Interferometry Mission
NASA Technical Reports Server (NTRS)
Turyshev, Slava G.; Milman, Mark H.
2000-01-01
The Space Interferometry Mission (SIM) is a space-based 10-m baseline Michelson optical interferometer operating in the visible waveband that is designed to achieve astrometric accuracy in the single digits of the microarcsecond domain. Over a narrow field of view, SIM is expected to achieve a mission accuracy of 1 microarcsecond. In this mode, SIM will search for planetary companions to nearby stars by detecting the astrometric "wobble" relative to a nearby reference star. In its wide-angle mode, SIM will provide 4 microarcsecond precision absolute position measurements of stars, with parallaxes to comparable accuracy, at the end of its 5-year mission. The expected proper motion accuracy is around 3 microarcsecond/year, corresponding to a transverse velocity of 10 m/s at a distance of 1 kpc. The basic astrometric observable of the SIM instrument is the pathlength delay. This measurement is made by a combination of internal metrology measurements that determine the distance the starlight travels through the two arms of the interferometer, and a measurement of the white light stellar fringe to find the point of equal pathlength. Because this operation requires a non-negligible integration time, the interferometer baseline vector is not stationary over this time period, as its absolute length and orientation are time varying. This paper addresses how the time varying baseline can be "regularized" so that it may act as a single baseline vector for multiple stars, as required for the solution of the astrometric equations.
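In idealised form, the delay observable described above is the projection of the baseline vector onto the unit vector toward the star plus a metrology constant, d = B · s + c. The sketch below evaluates that expression for a hypothetical slowly varying 10-m baseline and crudely refers the samples to the mean baseline over the integration; the actual SIM regularisation scheme is considerably more involved than this.

```python
import numpy as np

def unit_vector_to_star(ra_rad, dec_rad):
    """Unit vector toward a star given right ascension and declination (radians)."""
    return np.array([np.cos(dec_rad) * np.cos(ra_rad),
                     np.cos(dec_rad) * np.sin(ra_rad),
                     np.sin(dec_rad)])

def pathlength_delay(baseline_m, star_unit_vec, constant_m=0.0):
    """Idealised interferometric delay: d = B . s_hat + c (metrology constant)."""
    return baseline_m @ star_unit_vec + constant_m

# Hypothetical slowly varying 10-m baseline sampled during one integration.
times = np.linspace(0.0, 30.0, 4)                       # seconds
baselines = np.array([[10.0, 0.0, 0.0]]) + 1e-6 * np.outer(times, [0.0, 1.0, 0.3])

s_hat = unit_vector_to_star(np.radians(10.0), np.radians(20.0))
delays = [pathlength_delay(b, s_hat) for b in baselines]

# A crude "regularisation": refer all samples to the mean baseline over the integration.
mean_baseline = baselines.mean(axis=0)
print("instantaneous delays (m):", np.round(delays, 9))
print("delay for mean baseline (m):", round(pathlength_delay(mean_baseline, s_hat), 9))
```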
Functional Foods Baseline and Requirements Analysis
NASA Technical Reports Server (NTRS)
Cooper, M. R.; Bermudez-Aguirre, L. D.; Douglas, G.
2015-01-01
Current spaceflight foods were evaluated to determine if their nutrient profile supports positioning as a functional food and if the stability of the bioactive compound within the food matrix over an extended shelf-life correlated with the expected storage duration during the mission. Specifically, the research aims were: Aim A. To determine the amount of each nutrient in representative spaceflight foods immediately after processing and at predetermined storage time to establish the current nutritional state. Aim B. To identify the requirements to develop foods that stabilize these nutrients such that required concentrations are maintained in the space food system throughout long duration missions (up to five years). Aim C. To coordinate collaborations with health and performance groups that may require functional foods as a countermeasure.
Perceptual grouping effects on cursor movement expectations.
Dorneich, Michael C; Hamblin, Christopher J; Lancaster, Jeff A; Olofinboba, Olu
2014-05-01
Two studies were conducted to develop an understanding of factors that drive user expectations when navigating between discrete elements on a display via a limited degree-of-freedom cursor control device. For the Orion Crew Exploration Vehicle spacecraft, a free-floating cursor with a graphical user interface (GUI) would require an unachievable level of accuracy due to expected acceleration and vibration conditions during dynamic phases of flight. Therefore, Orion program proposed using a "caged" cursor to "jump" from one controllable element (node) on the GUI to another. However, nodes are not likely to be arranged on a rectilinear grid, and so movements between nodes are not obvious. Proximity between nodes, direction of nodes relative to each other, and context features may all contribute to user cursor movement expectations. In an initial study, we examined user expectations based on the nodes themselves. In a second study, we examined the effect of context features on user expectations. The studies established that perceptual grouping effects influence expectations to varying degrees. Based on these results, a simple rule set was developed to support users in building a straightforward mental model that closely matches their natural expectations for cursor movement. The results will help designers of display formats take advantage of the natural context-driven cursor movement expectations of users to reduce navigation errors, increase usability, and decrease access time. The rules set and guidelines tie theory to practice and can be applied in environments where vibration or acceleration are significant, including spacecraft, aircraft, and automobiles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-09-01
The final report is presented of a cost-effective solar hot water heating system installed on the Econo-Travel Motor Hotel at 5408 Williamsburg Road, Richmond, Virginia. A description of the system is given along with the final cost breakdown and expected performance data. The expected payback time for the installed system is estimated to be approximately five (5) years instead of the 6.65 years estimated in the proposal. The additional savings are due to the reduction in the peak demand charge, since the electric hot water heaters are not required to operate at the same time each morning as the dryers used for the laundry. The success of the system will be determined by the reduction in utility cost and reduced use of fossil fuels. The results shown in the hotel's monthly electricity bills indicate that this goal has been accomplished.
User data dissemination concepts for earth resources
NASA Technical Reports Server (NTRS)
Davies, R.; Scott, M.; Mitchell, C.; Torbett, A.
1976-01-01
Domestic data dissemination networks for earth-resources data in the 1985-1995 time frame were evaluated. The following topics were addressed: (1) earth-resources data sources and expected data volumes, (2) future user demand in terms of data volume and timeliness, (3) space-to-space and earth point-to-point transmission link requirements and implementation, (4) preprocessing requirements and implementation, (5) network costs, and (6) technological development to support this implementation. This study was parametric in that the data input (supply) was varied by a factor of about fifteen while the user request (demand) was varied by a factor of about nineteen. Correspondingly, the time from observation to delivery to the user was varied. This parametric evaluation was performed by a computer simulation that was based on network alternatives and resulted in preliminary transmission and preprocessing requirements. The earth-resource data sources considered were: shuttle sorties, synchronous satellites (e.g., SEOS), aircraft, and satellites in polar orbits.
Deswelling kinetics of polyacrylate gels in solutions of cetyltrimethylammonium bromide.
Nilsson, Peter; Hansson, Per
2007-08-23
The deswelling kinetics of single sodium polyacrylate gel beads (radius 40-160 microm) in aqueous solutions of cetyltrimethylammonium bromide under conditions of forced convection are investigated using micromanipulator assisted light microscopy. The purpose of the study is to further evaluate a previously published model (J. Phys. Chem. B 2003, 107, 9203) using a higher homolog surfactant. For gels with expected fast deswelling (small gel size/low surfactant concentration) and/or in low electrolyte concentration, the model is found to correctly predict the deswelling characteristics of the gel beads. However, for some gels with expected slow deswelling, especially in high electrolyte concentration (10 mM NaBr), the model widely underestimates the required deswelling time. The reason for this is argued to be the longer time frame and high bromide concentration allowing the formation of a denser, more ordered structure in the surface phase, which resists the deformation and reorganization of material necessary for deswelling. Unexpectedly long lag times before the start of deswelling are also found for gels in low surfactant concentration, indicating that a relatively high surfactant concentration in the gel, greatly exceeding the critical aggregation concentration, is needed to start formation of a collapsed surface phase. This critical surfactant concentration is found to be dependent on initial gel radius, as small gels require a relatively higher concentration to initiate collapse.
Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca
2016-10-15
The Expected Value of Perfect Partial Information (EVPPI) is a decision-theoretic measure of the 'cost' of parametric uncertainty, used principally in health economic decision making. Despite this decision-theoretic grounding, the uptake of EVPPI calculations in practice has been slow. This is in part due to the prohibitive computational time required to estimate the EVPPI via Monte Carlo simulations. However, recent developments have demonstrated that the EVPPI can be estimated by non-parametric regression methods, which have significantly decreased the computation time required to approximate the EVPPI. Under certain circumstances, high-dimensional Gaussian Process (GP) regression is suggested, but this can still be prohibitively expensive. Applying fast computation methods developed in spatial statistics using Integrated Nested Laplace Approximations (INLA) and projecting from a high-dimensional into a low-dimensional input space allows us to decrease the computation time for fitting these high-dimensional GPs, often substantially. We demonstrate that the EVPPI calculated using our method for GP regression is in line with the standard GP regression method and that, despite the apparent methodological complexity of this new method, R functions are available in the package BCEA to implement it simply and efficiently. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
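The regression idea behind these estimators can be illustrated compactly: fit the conditional expectation of each decision's net benefit on the parameter(s) of interest, then compare the mean of the per-sample maxima of the fitted values with the maximum of the overall means. The sketch below uses a simulated two-decision problem and a cheap polynomial fit as a stand-in for the GP/INLA regression; it is not the BCEA implementation, and the model and numbers are made up.

```python
import numpy as np

# Simulated probabilistic sensitivity analysis for two decisions; the model and
# numbers are invented purely to illustrate the regression-based EVPPI estimator.
rng = np.random.default_rng(2)
n = 5000
phi = rng.normal(0.0, 1.0, n)                     # parameter of interest
psi = rng.normal(0.0, 1.0, n)                     # remaining (nuisance) uncertainty

nb0 = np.zeros(n)                                 # net benefit of the comparator
nb1 = 0.5 + 1.5 * phi + psi                       # net benefit of the new intervention
nb = np.column_stack([nb0, nb1])

# Regression step: estimate E[NB_d | phi] with a cheap polynomial fit
# (a stand-in for the GP / INLA regression discussed in the paper).
fitted = np.column_stack([
    np.polyval(np.polyfit(phi, nb[:, d], deg=3), phi) for d in range(2)
])

evppi = fitted.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"regression-based EVPPI estimate: {evppi:.3f}")
```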
Miller, Timothy L; Jamieson, Marissa; Everson, Sonsecharae; Siegel, Courtney
2017-12-01
Few studies have documented expected time to return to athletic participation after stress fractures in elite athletes. Time to return to athletic participation after stress fractures would vary by site and severity of stress fracture. Retrospective cohort study. Level 3. All stress fractures diagnosed in a single Division I collegiate men's and women's track and field/cross-country team were recorded over a 3-year period. Site and severity of injury were graded based on Kaeding-Miller classification system for stress fractures. Time to return to full unrestricted athletic participation was recorded for each athlete and correlated with patient sex and site and severity grade of injury. Fifty-seven stress fractures were diagnosed in 38 athletes (mean age, 20.48 years; range, 18-23 years). Ten athletes sustained recurrent or multiple stress fractures. Thirty-seven injuries occurred in women and 20 in men. Thirty-three stress fractures occurred in the tibia, 10 occurred in the second through fourth metatarsals, 3 occurred in the fifth metatarsal, 6 in the tarsal bones (2 navicular), 2 in the femur, and 5 in the pelvis. There were 31 grade II stress fractures, 11 grade III stress fractures, and 2 grade V stress fractures (in the same patient). Mean time to return to unrestricted sport participation was 12.9 ± 5.2 weeks (range, 6-27 weeks). No significant differences in time to return were noted based on injury location or whether stress fracture was grade II or III. The expected time to return to full unrestricted athletic participation after diagnosis of a stress fracture is 12 to 13 weeks for all injury sites. Athletes with grade V (nonunion) stress fractures may require more time to return to sport.
Preston, Todd M.; Kim, Kevin
2016-01-01
The Williston Basin in the Northern Great Plains has experienced rapid energy development since 2000. To evaluate the land cover changes resulting from recent (2000-2015) development, the area and previous land cover of all well pads (pads) constructed during this time were determined, the amount of disturbed and reclaimed land adjacent to pads was estimated, land cover changes were analyzed over time for three different well types, and the effects from future development were predicted. The previous land cover of the 12,990 ha converted to pads was predominantly agricultural (49.5%) or prairie (47.4%) with lesser amounts of developed (2.3%), aquatic (0.5%), and forest (0.4%). Additionally, 12,121 ha have likely been disturbed and reclaimed. The area required per gas well remained constant through time, while the land required per oil well increased initially and then decreased as development first shifted from conventional to unconventional drilling and then to multi-bore pads. For non-oil-and-gas wells (i.e., stratigraphic test wells, water wells, injection wells, etc.), the area per well increased through time, likely due to increased produced water disposal requirements. Future land cover change is expected to be 2.7 times greater than recent development, with much of the development occurring in five counties in the core Bakken development area. Direct land cover change and disturbance from recent and expected development are predicted to affect 0.4% of the landscape across the basin; however, in the core Bakken development area, 2.3% of the landscape will be affected, including 2.1% of the remaining grassland. Although future development will result in significant land cover change, evolving industry practices and proactive siting decisions, such as development along energy corridors and placing pads in areas previously altered by human activity, have the potential to reduce the ecological effects of future energy development in the Williston Basin.
Final S020 Skylab experiment report
NASA Technical Reports Server (NTRS)
Tousey, R.; Garrett, D. L.
1975-01-01
After the loss of the meteoroid shield required using the solar scientific airlock to erect the sun shade, methods were improvised to operate the S020 experiment on EVAs. Almost no data were obtained in the wavelength range 10 to 110 A. From 110 to 280 A the spectra were 10 to 100 times less intense than expected. A probable cause of the loss of instrument sensitivity is contamination of the filters by the spacecraft coolant. A list of observed lines is presented. Although less data were obtained than expected, several lines not previously observed were recorded, and the spectra serve to confirm many very faintly observed weak lines recorded from sounding rockets by other experiments.
ERIC Educational Resources Information Center
Department of Education, Washington, DC. Office of the Secretary.
New guidelines to provide more flexibility and certainty in meeting federal time distribution recordkeeping requirements for U.S. Department of Education programs are outlined in this document. Directed to state, local and Indian tribal governments, the new guidelines are expected to avoid audit appeals and disputes, freeing teachers and…
ERIC Educational Resources Information Center
Leach, Mozelle P.
A study compared the results of a 1986 survey on the opinions of preservice teachers with those of a survey conducted in 1984 by William Bennett (at that time chairman of the National Endowment for the Humanities). Bennett's survey asked respondents to list 30 works that students should be expected to read before high school graduation.…
Design Evaluation of High Reliability Lithium Batteries
NASA Technical Reports Server (NTRS)
Buchman, R. C.; Helgeson, W. D.; Istephanous, N. S.
1985-01-01
Within one year, a lithium battery design can be qualified for device use through the application of accelerated discharge testing, calorimetry measurements, real time tests and other supplemental testing. Materials and corrosion testing verify that the battery components remain functional during expected battery life. By combining these various methods, a high reliability lithium battery can be manufactured for applications which require zero defect battery performance.
14 CFR 121.646 - En-route fuel supply: flag and supplemental operations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... supply requirements of § 121.333; and (iii) Considering expected wind and other weather conditions. (3..., considering wind and other weather conditions expected, it has the fuel otherwise required by this part and... errors in wind forecasting. In calculating the amount of fuel required by paragraph (b)(1)(i) of this...
14 CFR 121.646 - En-route fuel supply: flag and supplemental operations.
Code of Federal Regulations, 2014 CFR
2014-01-01
... supply requirements of § 121.333; and (iii) Considering expected wind and other weather conditions. (3..., considering wind and other weather conditions expected, it has the fuel otherwise required by this part and... errors in wind forecasting. In calculating the amount of fuel required by paragraph (b)(1)(i) of this...
14 CFR 121.646 - En-route fuel supply: flag and supplemental operations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... supply requirements of § 121.333; and (iii) Considering expected wind and other weather conditions. (3..., considering wind and other weather conditions expected, it has the fuel otherwise required by this part and... errors in wind forecasting. In calculating the amount of fuel required by paragraph (b)(1)(i) of this...
77 FR 60114 - Agency Information Collection Activities Under OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-02
... approximately 100 entities on a daily basis. The recordkeeping requirement of section 22.5 is expected to apply to approximately 100 entities on an approximately annual basis. Based on experience with analogous... required by section 22.2(g) is expected to require about 100 hours annually per entity, for a total burden...
NASA Technical Reports Server (NTRS)
Willis, E. A.
1982-01-01
An update on the general aviation (g/a) and commuter aircraft propulsion research effort is presented. The discussion focuses on several advanced intermittent combustion engines, emphasizing lightweight diesels and rotary stratified charge engines. The current state of the art is evaluated for lightweight, aircraft-suitable versions of each engine. This information is used to project the engine characteristics that can be expected on near-term and long-term time horizons. The key enabling technology requirements are identified for each engine on the long-term time horizon.
Progress on the upgrade of the CMS Hadron Calorimeter Front-End electronics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Jake; Whitmore, Juliana; /Fermilab
2011-11-01
We present a scheme to upgrade the CMS HCAL front-end electronics in the second long shutdown to upgrade the LHC (LS2), which is expected to occur around 2018. The HCAL electronics upgrade is required to handle the major instantaneous luminosity increase (up to 5 * 10{sup 34} cm{sup -2} s{sup -1}) and an expected integrated luminosity of {approx}3000 fb{sup -1}. A key aspect of the HCAL upgrade is to read out longitudinal segmentation information to improve background rejection, energy resolution, and electron isolation at the L1 trigger. This paper focuses on the requirements for the new electronics and on the proposed solutions. The requirements include increased channel count, additional timing capabilities, and additional redundancy. The electronics are required to operate in a harsh environment and are constrained by the existing infrastructure. The proposed solutions span from chip level to system level. They include the development of a new ASIC ADC, the design and testing of higher speed transmitters to handle the increased data volume, the evaluation and use of circuits from other developments, evaluation of commercial FPGAs, better thermal design, and improvements in the overall readout architecture. We will report on the progress of the designs for these upgraded systems, along with performance requirements and initial design studies.
Chen, He-Guei; Chiang, Hui-Hua Kenny; Lee, Oscar Kuang-Sheng
2013-01-01
Mesenchymal stromal cells (MSCs) hold great potential in skeletal tissue engineering and regenerative medicine. However, conventional methods that are used in molecular biology to evaluate osteogenic differentiation of MSCs require a relatively large amount of cells. Cell lysis and cell fixation are also required, and all these steps are time-consuming. Therefore, it is imperative to develop a facile technique which can provide real-time information with high sensitivity and selectivity to detect the osteogenic maturation of MSCs. In this study, we use Raman spectroscopy as a biosensor to monitor the production of mineralized matrices during osteogenic induction of MSCs. In summary, Raman spectroscopy is an excellent biosensor to detect the extent of maturation during MSC-osteoblast differentiation in a non-disruptive, real-time and label-free manner. We expect that this study will promote further investigation of stem cell research and clinical applications. PMID:23734254
Application of statistical process control to qualitative molecular diagnostic assays.
O'Brien, Cathal P; Finn, Stephen P
2014-01-01
Modern pathology laboratories, and in particular high throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
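As a rough illustration of the approach described (a frequency estimate paired with a confidence interval), the sketch below uses a plain normal-approximation binomial interval and invented counts; the authors' actual charting rules and sample-size thresholds are not reproduced here. It also shows why low mutation frequencies demand larger sample numbers: a narrow interval requires a large n.

```python
from math import sqrt

def frequency_out_of_control(positives, n, expected_freq, z=1.96):
    """Flag a batch of qualitative results whose observed mutation frequency
    deviates from the expected frequency, using a normal-approximation
    confidence interval around the observed proportion (illustration only)."""
    observed = positives / n
    half_width = z * sqrt(observed * (1 - observed) / n)
    lower, upper = observed - half_width, observed + half_width
    # The run is flagged when the expected frequency falls outside the interval.
    return not (lower <= expected_freq <= upper), (lower, upper)

# Hypothetical batch: 12 mutation-positive results in 150 samples,
# against an expected population frequency of 15%.
flag, interval = frequency_out_of_control(12, 150, expected_freq=0.15)
print(flag, interval)
```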
Attribution of soil information associated with modeling background clutter
NASA Astrophysics Data System (ADS)
Mason, George; Melloh, Rae
2006-05-01
This paper examines the attribution of data fields required to generate high resolution soil profiles in support of the Computational Test Bed (CTB) used for countermine research. The countermine computational test bed is designed to realistically simulate the geo-environment to support the evaluation of sensors used to locate unexploded ordnance. The goal of the CTB is to derive expected moisture and chemical compounds and to measure heat migration over time, from which we expect to optimize sensor performance. Several test areas were considered for the collection of soils data to populate the CTB. Collection of bulk soil properties has inherent spatial resolution limits. Novel techniques are therefore required to populate a high resolution model. This paper presents correlations between spatial variability in texture as related to hydraulic permeability and heat transfer properties of the soil. The extracted physical properties are used to exercise models providing a signature of subsurface media and support the simulation of detection by various sensors of buried and surface ordnance.
NASA Technical Reports Server (NTRS)
Bernstein, W.
1981-01-01
The possible use of Chamber A for the replication or simulation of space plasma physics processes which occur in the geosynchronous Earth orbit (GEO) environment is considered. It is shown that replication is not possible and that scaling of the environmental conditions is required for study of the important instability processes. Rules for such experimental scaling are given. At the present time, it does not appear technologically feasible to satisfy these requirements in Chamber A. It is, however, possible to study and qualitatively evaluate the problem of vehicle charging at GEO. In particular, Chamber A is sufficiently large that a complete operational spacecraft could be irradiated by beams and charged to high potentials. Such testing would contribute to the assessment of the operational malfunctions expected at GEO and their possible correction. However, because of the many tabulated limitations in such a testing program, its direct relevance to conditions expected in the GEO environment remains questionable.
Cognitive Load Does Not Affect the Behavioral and Cognitive Foundations of Social Cooperation.
Mieth, Laura; Bell, Raoul; Buchner, Axel
2016-01-01
The present study serves to test whether the cognitive mechanisms underlying social cooperation are affected by cognitive load. Participants interacted with trustworthy-looking and untrustworthy-looking partners in a sequential Prisoner's Dilemma Game. Facial trustworthiness was manipulated to stimulate expectations about the future behavior of the partners which were either violated or confirmed by the partners' cheating or cooperation during the game. In a source memory test, participants were required to recognize the partners and to classify them as cheaters or cooperators. A multinomial model was used to disentangle item memory, source memory and guessing processes. We found an expectancy-congruent bias toward guessing that trustworthy-looking partners were more likely to be associated with cooperation than untrustworthy-looking partners. Source memory was enhanced for cheating that violated the participants' positive expectations about trustworthy-looking partners. We were interested in whether or not this expectancy-violation effect-that helps to revise unjustified expectations about trustworthy-looking partners-depends on cognitive load induced via a secondary continuous reaction time task. Although this secondary task interfered with working memory processes in a validation study, both the expectancy-congruent guessing bias as well as the expectancy-violation effect were obtained with and without cognitive load. These findings support the hypothesis that the expectancy-violation effect is due to a simple mechanism that does not rely on demanding elaborative processes. We conclude that most cognitive mechanisms underlying social cooperation presumably operate automatically so that they remain unaffected by cognitive load.
Impact of an in situ laboratory on physician expectancy.
Brulé, Romain; Sarazin, Marianne; Tayeb, Nicole; Roubille, Martine; Szymanowicz, Anton
2018-01-01
Biological examinations are essential for clinicians' medical care. The aim of this study is to assess clinicians' expectations in healthcare facilities and their perception of medical biology in different types of organization. We performed a prospective transversal study by electronic questionnaire conducted among 242 practitioners in four healthcare facilities. The aspects explored were as follows: quality, reliability, rendering time of examination results and biology platform support. Analyses were conducted after weighting adjustment of the sample. Sixty-one clinicians responded (25.2% [19.7-30.7]). The rendering time of examinations is the main criterion mentioned, with a requirement of less than one hour in case of emergency (81.5% [71.8-91.2] of the answers) to less than 72 hours for specialized examinations (81.5% [71.8-91.2] of the answers). Better collaboration with biologists is expected by clinicians (54.7% [50.9-58.5]). Satisfaction with the biology platform support and the rendering time of results in emergency cases was significantly (p < 0.005) lower in facilities without an on-site laboratory. In conclusion, although medical biology performance is generally satisfactory within medical facilities, it remains nonetheless affected when the laboratory is not on site. The rendering time of examinations, which depends on the biology platform support functions and the proximity of the laboratory, remains the main criterion. Clinician-biologist collaboration, which increases the medico-economic efficiency of patient healthcare, appears as an essential criterion in a structural conception of medical biology.
Long-term persistence of solar activity. [Abstract only
NASA Technical Reports Server (NTRS)
Ruzmaikin, Alexander; Feynman, Joan; Robinson, Paul
1994-01-01
The solar irradiance has been found to change by 0.1% over the recent solar cycle. A change of irradiance of about 0.5% is required to affect the Earth's climate. How frequently can a variation of this size be expected? We examine the question of the persistence of non-periodic variations in solar activity. The Hurst exponent, which characterizes the persistence of a time series (Mandelbrot and Wallis, 1969), is evaluated for the series of C-14 data for the time interval from about 6000 BC to 1950 AD (Stuiver and Pearson, 1986). We find a constant Hurst exponent, suggesting that solar activity in the frequency range from 100 to 3000 years includes an important continuum component in addition to the well-known periodic variations. The value we calculate, H approximately equal to 0.8, is significantly larger than the value of 0.5 that would correspond to variations produced by a white-noise process. This value is in good agreement with the results for the monthly sunspot data reported elsewhere, indicating that the physics that produces the continuum is a correlated random process (Ruzmaikin et al., 1992), and that it is the same type of process over a wide range of time interval lengths. We conclude that the time period over which an irradiance change of 0.5% can be expected to occur is significantly shorter than that which would be expected for variations produced by a white-noise process.
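For readers unfamiliar with the statistic, a compact rescaled-range (R/S) estimate of the Hurst exponent is sketched below on synthetic white noise (not the C-14 series analyzed in the abstract); a persistent, correlated series of the kind described would yield a slope near 0.8 rather than 0.5.

```python
import numpy as np

def hurst_rs(series, window_sizes):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    the slope of log(R/S) versus log(window size)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(series) - n + 1, n):
            window = series[start:start + n]
            dev = np.cumsum(window - window.mean())
            r = dev.max() - dev.min()          # range of cumulative deviations
            s = window.std(ddof=1)             # standard deviation of the window
            if s > 0:
                rs_values.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_values)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(0)
white_noise = rng.normal(size=4096)
print(hurst_rs(white_noise, [16, 32, 64, 128, 256, 512]))  # close to 0.5
```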
Schmid, R; Spiessl, H; Vukovich, A; Cording, C
2003-03-01
This article aimed to provide an overview of the burden on relatives of mentally ill patients and their expectations towards psychiatric institutions. The literature was selected from Medline covering the years 1996 - 2002. 342 articles were reviewed, 145 of which were described in this review. The burdens on relatives are manifold and can be classified into the following categories: time spent on caring, financial difficulties, occupational restrictions, detrimental effects on relatives' own physical and psychological well-being, reduction in leisure activities, negative effects on social relationships, experiences of discrimination and refusal, deficiencies in information about the illness, feelings of not being taken seriously, insufficient service support, long distance to the mental health service, emotional burdens of caregivers and difficulties with the patient's behaviour. The expectations of the relatives mainly refer to the categories "relationship between staff and relatives", "information about illness" and "establishment of required institutions". The various burdens on relatives and their expectations towards psychiatric services point to necessary improvements of mental health services in the sense of consumer-oriented psychiatric care.
Hogg, William; Kendall, Claire; Muggah, Elizabeth; Mayo-Bruinsma, Liesha; Ziebell, Laura
2014-02-01
A key priority in primary health care research is determining how to ensure the advancement of new family physician clinician investigators (FP-CIs). However, there is little consensus on what expectations should be set for new investigators to ensure the successful and timely acquisition of independent salary support. The goal is to support new FP-CIs to maximize early career research success. This program description aims to summarize the administrative and financial support provided by the C.T. Lamont Primary Health Care Research Centre in Ottawa, Ont, to early career FP-CIs; delineate career expectations; and describe the results in terms of research productivity on the part of new FP-CIs. Family physician CIs achieved a high level of research productivity during their first 5 years, but most did not secure external salary support. It might be unrealistic to expect new FP-CIs to be self-financing by the end of 5 years. This is a career-development program, and supporting new career FP-CIs requires a long-term investment. This understanding is critical to fostering and strengthening sustainable primary care research programs.
Motivational Deficits in Schizophrenia and the Representation of Expected Value
Waltz, James A.; Gold, James M.
2016-01-01
Motivational deficits (avolition and anhedonia) have historically been considered important negative symptoms of schizophrenia. Numerous studies have attempted to identify the neural substrates of avolition and anhedonia in schizophrenia, but these studies have not produced much agreement. Deficits in various aspects of reinforcement processing have been observed in individuals with schizophrenia, but it is not exactly clear which of these deficits actually engender motivational impairments in schizophrenia (SZ). The purpose of this chapter is to examine how various reinforcement-related behavioral and neural signals could contribute to motivational impairments in both schizophrenia and psychiatric illness in general. In particular, we describe different aspects of the concept of expected value (EV), such as the distinction between the EV of stimuli and the EV of actions, the acquisition of value versus the estimation of value, and the discounting of value as a consequence of the time or effort required. We conclude that avolition and anhedonia in SZ are most commonly tied to aberrant signals for expected value in the context of learning. We discuss implications for further research on the neural substrates of motivational impairments in psychiatric illness. PMID:26370946
Observing strategies for future solar facilities: the ATST test case
NASA Astrophysics Data System (ADS)
Uitenbroek, H.; Tritschler, A.
2012-12-01
Traditionally, solar observations have been scheduled and performed very differently from night-time efforts, in particular because we have been observing the Sun for a long time, requiring new combinations of observables to make progress, and because solar physics observations are often event-driven on time scales of hours to days. With the proposal pressure that is expected for new large-aperture facilities, we can no longer afford the time spent on custom setups, and will have to rethink our scheduling and operations. We will discuss our efforts at Sac Peak in preparing for this new era, and outline the scheduling and operations planning for the ATST in particular.
Miller, Grant; Urdinola, B. Piedad
2011-01-01
Recent studies demonstrate procyclical mortality in wealthy countries, but there are reasons to expect a countercyclical relationship in developing nations. We investigate how child survival in Colombia responds to fluctuations in world Arabica coffee prices – and document starkly procyclical child deaths. In studying this result’s behavioral underpinnings, we highlight that: (1) The leading determinants of child health are inexpensive but require considerable time, and (2) As the value of time declines with falling coffee prices, so does the relative price of health. We find a variety of direct evidence consistent with the primacy of time in child health production. PMID:22090662
Cano, Miguel Ángel; Lam, Cho Y; Chen, Minxing; Adams, Claire E; Correa-Fernández, Virmarie; Stewart, Diana W; McClure, Jennifer B; Cinciripini, Paul M; Wetter, David W
2014-08-01
Ecological momentary assessment was used to examine associations between negative affect, positive smoking outcome expectancies, and smoking urge during the first 7 days of a smoking quit attempt. Participants were 302 female smokers who enrolled in an individually tailored smoking cessation treatment study. Multilevel mediation analysis was used to examine the temporal relationship among the following: (a) the effects of negative affect and positive smoking outcome expectancies at 1 assessment point (e.g., time j) on smoking urge at the subsequent time point (e.g., time j + 1) in Model 1; and, (b) the effects of negative affect and smoking urge at time j on positive smoking outcome expectancies at time j + 1 in Model 2. The results from Model 1 showed a statistically significant effect of negative affect at time j on smoking urge at time j + 1, and this effect was mediated by positive smoking outcome expectancies at time j, both within- and between-participants. In Model 2, the within-participant indirect effect of negative affect at time j on positive smoking outcome expectancies at time j + 1 through smoking urge at time j was nonsignificant. However, a statistically significant indirect between-participants effect was found in Model 2. The findings support the hypothesis that urge and positive smoking outcome expectancies increase as a function of negative affect, and suggest a stronger effect of expectancies on urge as opposed to the effect of urge on expectancies.
ERIC Educational Resources Information Center
DeCarlo, Jeffrey
2010-01-01
Air travel is expected to grow by a factor of 2 to 3 times by 2025 and people working in the aviation system, including airport personnel, pilots, and air traffic controllers, must be able to safely and efficiently operate in this arena ("NextGen"). In response to the personnel training and education requirements concomitant with "NextGen,"…
Block 4 solar cell module design and test specification for intermediate load center applications
NASA Technical Reports Server (NTRS)
1978-01-01
Requirements for performance of terrestrial solar cell modules intended for use in various test applications are established. During the 1979-80 time period, such applications are expected to be in the 20 to 500 kilowatt size range. A series of characterization and qualification tests necessary to certify the module design for production, and the necessary performance tests for acceptance of modules, are specified.
Implementation for GREAT I Study.
1981-06-01
population centers where greater beneficial use is expected. Most of the sites selected are not owned by the Federal Government. Many factors could prevent ... have to be five to six times the dredged material volume. Berming is generally required to prevent encroachment beyond the placement site. The site ... fish and wildlife and/or recreation under its cost-sharing authorities (Public Law 89-72 and Code 710 Program), including facilitating the
Physicians' perceptions of physician-nurse interactions and information needs in China.
Wen, Dong; Guan, Pengcheng; Zhang, Xingting; Lei, Jianbo
2018-01-01
Good communication between physicians and nurses is important for the understanding of disease status and treatment feedback; however, certain issues in Chinese hospitals could lead to suboptimal physician-nurse communication in clinical work. Convenience sampling was used to recruit participants. Questionnaires were sent to clinical physicians in three top tertiary Grade-A teaching hospitals in China, and six hundred and seventeen physicians participated in the survey. (1) Common physician-nurse interactions were shift-change reports and provisional reports when needed, and interactions expected by physicians included face-to-face reports and communication via a phone or mobile device. (2) Most respondents believed that the need for information in physician-nurse interactions was high, that information was moderately accurate and timely, and that feedback regarding interaction time and satisfaction was only average and required improvement. (3) Information needs in physician-nurse interactions differed significantly according to hospital category, role, workplace, and educational background (p < .05). There was a considerable need for information within physician-nurse interactions, and the level of satisfaction with the information obtained was average; requirements for the improvement of communication differed between physicians and nurses because of differences in their characteristics. Currently, the use of information technology in physician-nurse communication is less common but is highly expected by physicians.
Numerical investigations in three-dimensional internal flows
NASA Astrophysics Data System (ADS)
Rose, William C.
1988-08-01
An investigation into the use of computational fluid dynamics (CFD) was performed to examine the expected heat transfer rates that will occur within the NASA-Ames 100 megawatt arc heater nozzle. This nozzle was tentatively designed and identified to provide research for a directly connected combustion experiment specifically related to the National Aerospace Plane Program (NASP) aircraft, and is expected to simulate the flow field entering the combustor section. It was found that extremely fine grids, that is, very small mesh spacing near the wall, are required to accurately model the heat transfer process and, in fact, must contain a point within the laminar sublayer if results are to be taken directly from a numerical simulation code. In the present study, an alternative to this very fine mesh and its attendant increase in computational time was invoked, based on a wall-function method. It was shown that solutions could be obtained that give accurate indications of surface heat transfer rate throughout the nozzle in approximately 1/100 of the computer time required to do the simulation directly without the wall-function implementation. Finally, a maximum heating value in the throat region of the proposed slit nozzle for the 100 megawatt arc heater was shown to be approximately 6 MW per square meter.
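To illustrate why wall-resolved grids are so costly, the sketch below estimates the wall-normal height of the first grid cell needed to hit a target y+ using a generic flat-plate skin-friction correlation; the flow values are purely illustrative and are not the arc-heater nozzle conditions from the study.

```python
from math import sqrt

def first_cell_height(y_plus, rho, u_inf, mu, x):
    """Estimate the first-cell height for a target y+ using an approximate
    flat-plate turbulent skin-friction correlation (illustration only)."""
    re_x = rho * u_inf * x / mu                 # local Reynolds number
    cf = 0.026 / re_x ** (1.0 / 7.0)            # approximate flat-plate Cf
    tau_w = 0.5 * cf * rho * u_inf ** 2         # wall shear stress
    u_tau = sqrt(tau_w / rho)                   # friction velocity
    return y_plus * mu / (rho * u_tau)

# Hypothetical high-speed nozzle-like conditions (SI units, not the arc-heater values):
h_resolved = first_cell_height(y_plus=1.0, rho=0.5, u_inf=2000.0, mu=5e-5, x=0.1)
h_wall_fn = first_cell_height(y_plus=30.0, rho=0.5, u_inf=2000.0, mu=5e-5, x=0.1)
print(h_resolved, h_wall_fn)   # wall-resolved vs. wall-function first-cell heights
```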
Continued Data Acquisition Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwellenbach, David
This task focused on improving techniques for integrating data acquisition of secondary particles correlated in time with detected cosmic-ray muons. Scintillation detectors with Pulse Shape Discrimination (PSD) capability show the most promise as a detector technology based on work in FY13. Typically, PSD parameters are determined prior to an experiment and the results are based on these parameters. By saving data in list mode, including the fully digitized waveform, any experiment can effectively be replayed to adjust PSD and other parameters for the best data capture. List mode requires time synchronization of two independent data acquisition (DAQ) systems: the muon tracker and the particle detector system. Techniques to synchronize these systems were studied. Two basic techniques were identified: real time mode and sequential mode. Real time mode is the preferred system but has proven to be a significant challenge since two FPGA systems with different clocking parameters must be synchronized. Sequential processing is expected to work with virtually any DAQ but requires more post processing to extract the data.
Guidance concepts for time-based flight operations
NASA Technical Reports Server (NTRS)
Vicroy, Dan D.
1990-01-01
Airport congestion and the associated delays are severe in today's airspace system and are expected to increase. NASA and the FAA are investigating various methods of alleviating this problem through new technology and operational procedures. One concept for improving airspace productivity is time-based control of aircraft. Research to date has focused primarily on the development of time-based flight management systems and Air Traffic Control operational procedures. Flight operations may, however, require special onboard guidance in order to satisfy the Air Traffic Control imposed time constraints. The results of a simulation study aimed at evaluating several time-based guidance concepts in terms of tracking performance, pilot workload, and subjective preference are presented. The guidance concepts tested varied in complexity from simple digital time-error feedback to an advanced time-referenced-energy guidance scheme.
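The simplest of the guidance concepts mentioned, digital time-error feedback, amounts to a speed-command loop driven by the estimated arrival-time error; the gain and speed limits below are illustrative placeholders, not values from the simulation study.

```python
def speed_command(nominal_speed, time_error_s, gain=0.5,
                  v_min=110.0, v_max=160.0):
    """Simple time-error feedback: if the aircraft is predicted to arrive late
    at the metering fix (positive time error, seconds), command a faster speed,
    and vice versa. All numbers are illustrative (knots, seconds)."""
    command = nominal_speed + gain * time_error_s
    return max(v_min, min(v_max, command))       # respect the speed envelope

# Predicted arrival 12 s late at the metering fix -> speed up slightly.
print(speed_command(nominal_speed=140.0, time_error_s=12.0))  # 146.0 knots
```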
Global patterns of drought recovery.
Schwalm, Christopher R; Anderegg, William R L; Michalak, Anna M; Fisher, Joshua B; Biondi, Franco; Koch, George; Litvak, Marcy; Ogle, Kiona; Shaw, John D; Wolf, Adam; Huntzinger, Deborah N; Schaefer, Kevin; Cook, Robert; Wei, Yaxing; Fang, Yuanyuan; Hayes, Daniel; Huang, Maoyi; Jain, Atul; Tian, Hanqin
2017-08-09
Drought, a recurring phenomenon with major impacts on both human and natural systems, is the most widespread climatic extreme that negatively affects the land carbon sink. Although twentieth-century trends in drought regimes are ambiguous, across many regions more frequent and severe droughts are expected in the twenty-first century. Recovery time, how long an ecosystem requires to revert to its pre-drought functional state, is a critical metric of drought impact. Yet the factors influencing drought recovery and its spatiotemporal patterns at the global scale are largely unknown. Here we analyse three independent datasets of gross primary productivity and show that, across diverse ecosystems, drought recovery times are strongly associated with climate and carbon cycle dynamics, with biodiversity and CO2 fertilization as secondary factors. Our analysis also provides two key insights into the spatiotemporal patterns of drought recovery time: first, that recovery is longest in the tropics and high northern latitudes (both vulnerable areas of Earth's climate system) and second, that drought impacts (assessed using the area of ecosystems actively recovering and time to recovery) have increased over the twentieth century. If droughts become more frequent, as expected, the time between droughts may become shorter than drought recovery time, leading to permanently damaged ecosystems and widespread degradation of the land carbon sink.
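Recovery time as defined here can be computed directly from a productivity time series. A minimal sketch with a synthetic annual GPP record follows; the baseline and tolerance definitions are simplified assumptions, not the paper's exact criteria.

```python
import numpy as np

def recovery_time(gpp, drought_start, drought_end, tolerance=0.95):
    """Number of time steps after the drought ends until productivity reverts
    to its pre-drought mean (an illustrative definition, not the paper's)."""
    baseline = gpp[:drought_start].mean()        # pre-drought functional state
    post = gpp[drought_end:]
    for step, value in enumerate(post):
        if value >= tolerance * baseline:
            return step
    return None  # never recovered within the record

# Synthetic annual GPP: stable, a two-year drought dip, then gradual recovery.
gpp = np.array([10.0, 10.2, 9.9, 6.0, 5.5, 7.0, 8.5, 9.7, 10.1])
print(recovery_time(gpp, drought_start=3, drought_end=5))  # -> 2
```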
Design considerations for computationally constrained two-way real-time video communication
NASA Astrophysics Data System (ADS)
Bivolarski, Lazar M.; Saunders, Steven E.; Ralston, John D.
2009-08-01
Today's video codecs have evolved primarily to meet the requirements of the motion picture and broadcast industries, where high-complexity studio encoding can be utilized to create highly-compressed master copies that are then broadcast one-way for playback using less-expensive, lower-complexity consumer devices for decoding and playback. Related standards activities have largely ignored the computational complexity and bandwidth constraints of wireless or Internet based real-time video communications using devices such as cell phones or webcams. Telecommunications industry efforts to develop and standardize video codecs for applications such as video telephony and video conferencing have not yielded image size, quality, and frame-rate performance that match today's consumer expectations and market requirements for Internet and mobile video services. This paper reviews the constraints and the corresponding video codec requirements imposed by real-time, 2-way mobile video applications. Several promising elements of a new mobile video codec architecture are identified, and more comprehensive computational complexity metrics and video quality metrics are proposed in order to support the design, testing, and standardization of these new mobile video codecs.
Gates, Timothy J; Noyce, David A
2016-11-01
This manuscript describes the development and evaluation of a conceptual framework for real-time operation of dynamic on-demand extension of the red clearance interval as a countermeasure for red-light-running. The framework includes a decision process for determining, based on the real-time status of vehicles arriving at the intersection, when extension of the red clearance interval should occur and the duration of each extension. A zonal classification scheme was devised to assess whether an approaching vehicle requires additional time to safely clear the intersection based on the remaining phase time, type of vehicle, current speed, and current distance from the intersection. Expected performance of the conceptual framework was evaluated through modeling of replicated field operations using vehicular event data collected as part of this research. The results showed highly accurate classification of red-light-running vehicles needing additional clearance time and relatively few false extension calls from stopping vehicles, thereby minimizing the expected impacts to signal and traffic operations. Based on the recommended parameters, extension calls were predicted to occur once every 26.5 cycles. Assuming a 90 s cycle, 1.5 extensions per hour were expected per approach, with an estimated extension time of 2.30 s/h. Although field implementation was not performed, it is anticipated that long-term reductions in targeted red-light-running conflicts and crashes will likely occur if red clearance interval extension systems are implemented at locations where start-up delay on the conflicting approach is generally minimal, such as intersections with lag left-turn phasing. Copyright © 2015 Elsevier Ltd. All rights reserved.
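The zonal decision logic described above can be sketched as a simple function of remaining phase time, speed, and distance to the stop bar. The thresholds, intersection geometry, and extension cap below are illustrative assumptions, not the calibrated parameters from the study.

```python
def red_extension(distance_to_stopbar_m, speed_mps, remaining_phase_s,
                  intersection_width_m=25.0, vehicle_length_m=5.0,
                  max_extension_s=4.0):
    """Simplified sketch of the zonal decision logic: decide whether an
    approaching vehicle needs the red clearance interval extended and by
    how much (seconds). Geometry and caps are illustrative assumptions."""
    if speed_mps <= 0:
        return 0.0                                # stopped vehicle: no call
    time_to_stopbar = distance_to_stopbar_m / speed_mps
    if time_to_stopbar <= remaining_phase_s:
        return 0.0                                # clears (or stops) in time
    # Predicted red-light runner: extend until it clears the far side.
    time_to_clear = (distance_to_stopbar_m + intersection_width_m
                     + vehicle_length_m) / speed_mps
    needed = time_to_clear - remaining_phase_s
    return min(max(needed, 0.0), max_extension_s)

# Hypothetical runner: 30 m from the stop bar at 18 m/s with 1.2 s of phase left.
print(round(red_extension(30.0, 18.0, 1.2), 2))   # ~2.13 s extension
```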
Vera, Juan F.; Brenner, Lara J.; Gerdemann, Ulrike; Ngo, Minhtran C.; Sili, Uluhan; Liu, Hao; Wilson, John; Dotti, Gianpietro; Heslop, Helen E.; Leen, Ann M.; Rooney, Cliona M.
2009-01-01
The clinical manufacture of antigen-specific cytotoxic T lymphocytes (CTL) for adoptive immunotherapy is limited by the complexity and time required to produce large numbers with the desired function and specificity. The culture conditions required are rigorous, and in some cases only achieved in 2 cm2 wells in which cell growth is limited by gas exchange, nutrients and waste accumulation. Bioreactors developed to overcome these issues tend to be complex, expensive and not always conducive to CTL growth. We observed that antigen-specific CTL undergo seven to ten divisions post-stimulation. However, the expected CTL numbers were achieved only in the first week of culture. By recreating the culture conditions present during this first week (low frequency of antigen-specific T cells and high frequency of feeder cells), we were able to increase CTL expansion to expected levels, which could be sustained for several weeks without affecting phenotype or function. However, the number of 24-well plates needed was excessive and cultures required frequent media changes, increasing complexity and manufacturing costs. Therefore, we evaluated novel gas-permeable culture devices (G-Rex) with a silicone membrane at the base allowing gas exchange to occur uninhibited by the depth of medium above. This system effectively supports the expansion of CTL and actually increases output by up to 20-fold while decreasing required technician time. Importantly, this amplified cell expansion is not due to more cell divisions but to reduced cell death. This bioprocess optimization increased T-cell output while decreasing the complexity and cost of CTL manufacture, making cell therapy more accessible. PMID:20445351
Building flexible real-time systems using the Flex language
NASA Technical Reports Server (NTRS)
Kenny, Kevin B.; Lin, Kwei-Jay
1991-01-01
The design and implementation of a real-time programming language called Flex, which is a derivative of C++, are presented. It is shown how different types of timing requirements might be expressed and enforced in Flex, how they might be fulfilled in a flexible way using different program models, and how the programming environment can help in making binding and scheduling decisions. The timing constraint primitives in Flex are easy to use yet powerful enough to define both independent and relative timing constraints. Program models like imprecise computation and performance polymorphism can carry out flexible real-time programs. In addition, programmers can use a performance measurement tool that produces statistically correct timing models to predict the expected execution time of a program and to help make binding decisions. A real-time programming environment is also presented.
Henshaw, Erin J; Fried, Rachel; Teeters, Jenni Beth; Siskind, Emily E
2014-09-01
Several predictors of postpartum mood have been identified in the literature, but the role of maternal expectations in postpartum mental health remains unclear. The aim of this study was to identify whether maternal expectations during the postpartum hospital stay predict adjustment and depressive symptoms at 6 weeks postpartum. The sample included 233 first-time mothers recruited from the postpartum unit of a Midwestern hospital. Participants completed measures of maternal expectations and depressive symptoms (EPDS) at Time 1 (2 d postpartum) and completed EPDS and an Emotional Adjustment Scale (BaM-13) at Time 2 (6 weeks postpartum). A conditional relationship between the expectation that an infant's behavior will reflect maternal skill and Time 2 outcomes (BaM-13 and EPDS) was found, such that endorsing this belief predicted increased depression and poorer adjustment in those with higher (but not lower) Time 1 EPDS scores. Time 2 BaM-13 scores were also negatively predicted by expectations of self-sacrifice and positively predicted by expectations that parenthood would be naturally fulfilling. The expectations that new mothers hold about parenting soon after delivery are predictive of emotional adjustment in the early postpartum period, suggesting a role for discussion of expectations in future preventive strategies.
NWS Operational Requirements for Ensemble-Based Hydrologic Forecasts
NASA Astrophysics Data System (ADS)
Hartman, R. K.
2008-12-01
Ensemble-based hydrologic forecasts have been developed and issued by National Weather Service (NWS) staff at River Forecast Centers (RFCs) for many years. Used principally for long-range water supply forecasts, only the uncertainty associated with weather and climate have been traditionally considered. As technology and societal expectations of resource managers increase, the use and desire for risk-based decision support tools has also increased. These tools require forecast information that includes reliable uncertainty estimates across all time and space domains. The development of reliable uncertainty estimates associated with hydrologic forecasts is being actively pursued within the United States and internationally. This presentation will describe the challenges, components, and requirements for operational hydrologic ensemble-based forecasts from the perspective of a NOAA/NWS River Forecast Center.
Longitudinal analyses of adoptive parents' expectations and depressive symptoms.
Foli, Karen J; Lim, Eunjung; South, Susan C
2017-12-01
Grounded in a theoretical model specific to adoptive parents, we examined the relationship between parental expectations and depressive symptoms across time. Assessments of 129 adoptive parents of 64 children were performed at three time points before and after placement of an adopted child with the family: 4-6 weeks pre-placement and 4-6 weeks and 5-6 months post-placement. Expectations were assessed in four dimensions: expectations of self as parents, of the child, of family and friends, and of society. Depressive symptoms were assessed with the Center for Epidemiologic Studies-Depression scale. Associations between parental expectations and depressive symptoms were analyzed, and longitudinal multilevel modeling was conducted to explore influences on expectations over time. Parental expectations changed from pre- to post-placement. With the exception of expectations of self as parent, adoptive parents' pre-adoption expectations were affirmed in the post-adoption time periods. In each expectation dimension, higher affirmation of expectations was correlated with decreased depressive symptoms before and after placement of a child. While parental expectations are not unique to adoptive parents, the essence and characteristics of certain expectations are unique to these parents. When working with adoptive parents, nurses who care for families should assess expectations both pre- and post-placement with awareness of their relationship to depressive symptoms. © 2017 Wiley Periodicals, Inc.
Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (ORION)
NASA Technical Reports Server (NTRS)
Mott, Diana L.; Bigler, Mark A.
2017-01-01
NASA uses two HRA assessment methodologies. The first is a simplified method based on how much time is available to complete the action, with consideration given to environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value or placeholder as a preliminary estimate. This preliminary estimate is used to determine which placeholders need a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment of the performance of critical human actions. This assessment needs to consider more than the time available; it also includes factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists and internal human stresses. The more detailed assessment is expected to be more realistic than one based primarily on the time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on this history and experience. In the case of a PRA model that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more problematic. To determine what is expected of future operational parameters, the "best" available data were obtained from individuals with relevant experience who were familiar with the systems and processes previously implemented by NASA. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. Verification of the assumptions and expectations expressed in the assessments will be needed when the procedures, flight rules and operational requirements are developed and finalized.
Resident Autonomy in the Operating Room: Expectations Versus Reality.
Meyerson, Shari L; Sternbach, Joel M; Zwischenberger, Joseph B; Bender, Edward M
2017-09-01
There is concern about graduating thoracic trainees' independent operative skills due to limited autonomy in training. This study compared faculty and trainee expected levels of autonomy with intraoperative measurements of autonomy for common cardiothoracic operations. Participants underwent frame-of-reference training on the 4-point Zwisch scale of operative autonomy (show and tell → active help → passive help → supervision only) and evaluated autonomy in actual cases using the Zwisch Me!! mobile application. A separate "expected autonomy" survey elicited faculty and resident perceptions of how much autonomy a resident should have for six common operations: decortication, wedge resection, thoracoscopic lobectomy, coronary artery bypass grafting, aortic valve replacement, and mitral valve repair. Thirty-three trainees from 7 institutions submitted evaluations of 596 cases over 18 months (March 2015 to September 2016). Thirty attendings subsequently provided their evaluation of 476 of those cases (79.9% response rate). Expected autonomy surveys were completed by 21 attendings and 19 trainees from 5 institutions. The six operations included in the survey constituted 47% (226 of 476) of the cases evaluated. Trainee and attending expectations did not differ significantly for senior trainees. Both groups expected significantly higher levels of autonomy than observed in the operating room for all six types of cases. Although faculty and trainees both expect similar levels of autonomy in the operating room, real-time measurements of autonomy show a gap between expectations and reality. Decreasing this gap will require a concerted effort by both faculty and residents to focus on the development of independent operative skills. Copyright © 2017 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
What do stakeholders expect from patient engagement: Are these expectations being met?
Boudes, Mathieu; Robinson, Paul; Bertelsen, Neil; Brooke, Nicholas; Hoos, Anton; Boutin, Marc; Geissler, Jan; Sargeant, Ify
2018-06-01
Meaningful patient engagement (PE) in medicines development and during the life cycle of a product requires all stakeholders have a clear understanding of respective expectations. A qualitative survey was undertaken to understand stakeholder expectations. The survey explored 4 themes from the perspective of each stakeholder group: meaning, views, expectations and priorities for PE. Participants were grouped into 7 categories: policymakers/regulators; health-care professionals (HCPs); research funders; payers/purchasers/HTA; patients/patient representatives; pharmaceutical/life sciences industry; and academic researchers. Fifty-nine interviews were conducted across a range of geographies, PE experience and job seniority/role. There was consensus across stakeholders on meaning of PE; importance of promoting PE to a higher level than currently; need for a more structured process and guidance. There was little consensus on stakeholder expectations and roles. Policymakers/regulators were expected by others to drive PE, create a framework and facilitate PE, provide guidelines of good practice and connect stakeholders, but this expectation was not shared by the policymakers/regulators group. HCPs were seen as the link between patients and other stakeholders, but HCPs did not necessarily share this view. Despite broad stakeholder categories, clear themes emerged: there is no "leader"; no stakeholder has a clear view on how to meaningfully engage with patients; there are educational gaps; and a structure and guidance for PE is urgently required. Given the diversity of stakeholders, there needs to be multistakeholder collaborative leadership. Effective collaboration requires consensus on roles, responsibilities and expectations to synergize efforts to deliver meaningful PE in medicines life cycle. © 2018 The Authors. Health Expectations published by John Wiley & Sons Ltd.
Screening for Learning and Memory Mutations: A New Approach.
Gallistel, C R; King, A P; Daniel, A M; Freestone, D; Papachristos, E B; Balci, F; Kheifets, A; Zhang, J; Su, X; Schiff, G; Kourtev, H
2010-01-30
We describe a fully automated, live-in 24/7 test environment, with experimental protocols that measure the accuracy and precision with which mice match the ratio of their expected visit durations to the ratio of the incomes obtained from two hoppers, the progress of instrumental and classical conditioning (trials-to-acquisition), the accuracy and precision of interval timing, the effect of relative probability on the choice of a timed departure target, and the accuracy and precision of memory for the times of day at which food is available. The system is compact; it obviates the handling of the mice during testing; it requires negligible amounts of experimenter/technician time; and it delivers clear and extensive results from 3 protocols within a total of 7-9 days after the mice are placed in the test environment. Only a single 24-hour period is required for the completion of the first protocol (the matching protocol), which is a strong test of temporal and spatial estimation and memory mechanisms. Thus, the system permits the extensive screening of many mice in a short period of time and in limited space. The software is publicly available.
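A toy version of the matching analysis: fit the generalized matching relation between the log income ratio and the log visit-duration ratio, where the slope is one way to quantify accuracy (1.0 is perfect matching) and the residual scatter one way to quantify precision. The daily totals below are invented for illustration and do not reflect the protocol's actual data format.

```python
import numpy as np

# Hypothetical daily totals from a matching session (arbitrary units):
income_left, income_right = np.array([42.0, 55.0, 61.0]), np.array([21.0, 18.0, 30.0])
visit_left, visit_right = np.array([300.0, 410.0, 390.0]), np.array([160.0, 140.0, 210.0])

# Generalized matching: log(visit ratio) = a * log(income ratio) + log(b).
log_income = np.log(income_left / income_right)
log_visits = np.log(visit_left / visit_right)
sensitivity, log_bias = np.polyfit(log_income, log_visits, 1)
residual_sd = np.std(log_visits - (sensitivity * log_income + log_bias), ddof=1)

print(f"sensitivity (accuracy) = {sensitivity:.2f}")   # 1.0 = perfect matching
print(f"bias = {np.exp(log_bias):.2f}, scatter (precision) = {residual_sd:.2f}")
```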
Phase II Trials for Heterogeneous Patient Populations with a Time-to-Event Endpoint.
Jung, Sin-Ho
2017-07-01
In this paper, we consider a single-arm phase II trial with a time-to-event end-point. We assume that the study population has multiple subpopulations with different prognosis, but the study treatment is expected to be similarly efficacious across the subpopulations. We review a stratified one-sample log-rank test and present its sample size calculation method under some practical design settings. Our sample size method requires specification of the prevalence of subpopulations. We observe that the power of the resulting sample size is not very sensitive to misspecification of the prevalence.
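As a rough illustration (not the paper's exact formulation or its sample-size formula), the stratified one-sample log-rank statistic can be assembled by summing, across strata, observed events minus the events expected under a stratum-specific reference survival curve. The sketch below assumes exponential reference hazards and invented follow-up data.

```python
from math import sqrt

def stratified_one_sample_logrank(strata):
    """Each stratum supplies (follow_up_times, event_indicators, ref_hazard).
    Under H0 the observed event count O matches the expected count E from the
    reference cumulative hazard; Z = (O - E) / sqrt(E). Sketch only."""
    o_total, e_total = 0.0, 0.0
    for times, events, ref_hazard in strata:
        o_total += sum(events)
        # Exponential reference: cumulative hazard at time t is ref_hazard * t.
        e_total += sum(ref_hazard * t for t in times)
    return (o_total - e_total) / sqrt(e_total)

# Two hypothetical strata with different reference hazards (events per month).
good_prognosis = ([14.0, 20.0, 9.0, 16.0], [0, 1, 1, 0], 0.03)
poor_prognosis = ([6.0, 11.0, 4.0], [1, 1, 0], 0.10)
print(stratified_one_sample_logrank([good_prognosis, poor_prognosis]))
```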
1991-08-01
being used in both current and long-range research programs that are expected to make the Army more effective in matching the requirements for first- and ... make substantial improvements to the existing selection and classification system. IMPROVING THE SELECTION, CLASSIFICATION, AND UTILIZATION OF ... basis for new methods of allocating personnel, and making near-real-time decisions on the best match between characteristics of an individual enlistee
Data acquisition system for operational earth observation missions
NASA Technical Reports Server (NTRS)
Deerwester, J. M.; Alexander, D.; Arno, R. D.; Edsinger, L. E.; Norman, S. M.; Sinclair, K. F.; Tindle, E. L.; Wood, R. D.
1972-01-01
The data acquisition system capabilities expected to be available in the 1980 time period as part of operational Earth observation missions are identified. By data acquisition system is meant the sensor platform (spacecraft or aircraft), the sensors themselves and the communication system. Future capabilities and support requirements are projected for the following sensors: film camera, return beam vidicon, multispectral scanner, infrared scanner, infrared radiometer, microwave scanner, microwave radiometer, coherent side-looking radar, and scatterometer.
Don't ban PVC: incinerate and recycle it instead!
Menke, Doris; Fiedler, Hiltrud; Zwahr, Heiner
2003-04-01
Plastics are making a growing contribution to sustainable development. For example, over an expected lifetime of 50 years, the use of window frames and insulating materials made of plastic in buildings saves many times the energy required to manufacture them. Plastics for packaging purposes provide protection against damage and dirt contamination, thereby saving considerable amounts of material and energy. Choosing appropriate disposal strategies for plastic waste also helps to protect the environment (Mark 2000).
Marino, Michael J
2018-05-01
There is a clear perception in the literature that there is a crisis in reproducibility in the biomedical sciences. Many underlying factors contributing to the prevalence of irreproducible results have been highlighted, with a focus on poor design and execution of experiments along with the misuse of statistics. While these factors certainly contribute to irreproducibility, relatively little attention outside of the specialized statistical literature has focused on the expected prevalence of false discoveries under idealized circumstances. In other words, when everything is done correctly, how often should we expect to be wrong? Using a simple simulation of an idealized experiment, it is possible to show the central role of sample size and the related quantity of statistical power in determining the false discovery rate, and in accurate estimation of effect size. According to our calculations based on current practice, many subfields of biomedical science may expect their discoveries to be false at least 25% of the time, and the only viable course to correct this is to require the reporting of statistical power and a minimum of 80% power (1 - β = 0.80) for all studies. Copyright © 2017 Elsevier Inc. All rights reserved.
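The expected false discovery proportion in this idealized setting follows directly from the prior probability that a tested hypothesis is true, the significance threshold, and power. The priors in the example below are illustrative assumptions, not estimates taken from the article.

```python
def expected_false_discovery_rate(prior_true, alpha=0.05, power=0.8):
    """Fraction of 'significant' findings expected to be false positives when
    every test is run correctly (idealized, no bias or p-hacking)."""
    true_positives = prior_true * power
    false_positives = (1 - prior_true) * alpha
    return false_positives / (true_positives + false_positives)

# Well-powered field where half of the tested hypotheses are true:
print(round(expected_false_discovery_rate(0.5, power=0.8), 3))   # ~0.059
# Underpowered, more speculative field (illustrative numbers):
print(round(expected_false_discovery_rate(0.2, power=0.2), 3))   # 0.5
```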
ERIC Educational Resources Information Center
Crockett, Lisa J.; Beal, Sarah J.
2012-01-01
Adolescents' expectations about the timing of adult role transitions have the potential to shape their actual transitions, setting the stage for their adult lives. Although expectations about timing emerge by early adolescence, little is known about how these expectations develop across adolescence. This longitudinal study examined developmental…
Global patterns of drought recovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwalm, Christopher R.; Anderegg, William R. L.; Michalak, Anna M.
Drought is a recurring multi-factor phenomenon with major impacts on natural and human systems1-3. Drought is especially important for land carbon sink variability, influencing climate regulation of the terrestrial biosphere4. While 20th Century trends in drought regime are ambiguous, “more extreme extremes” as well as more frequent and severe droughts3,7 are expected in the 21st Century. Recovery time, the length of time an ecosystem requires to revert to its pre-drought functional state, is a critical metric of drought impact. Yet the spatiotemporal patterning and controls of drought recovery are largely unknown. Here we use three distinct global datasets of gross primary productivity to show that across diverse terrestrial ecosystems drought recovery times are driven by biological productivity and biodiversity, with drought length and severity of secondary importance. Recovery time, especially for extreme droughts, and the areal extent of ecosystems in recovery from drought generally increase over the 20th Century, supporting an increase globally in drought impact8. Our results indicate that if future Anthropocene droughts become more widespread as expected, droughts will become more frequent relative to recovery time. This increases the risk of entering a new regime where vegetation never recovers to its original state and widespread degradation of the land carbon sink ensues.
Global patterns of drought recovery
NASA Astrophysics Data System (ADS)
Schwalm, Christopher R.; Anderegg, William R. L.; Michalak, Anna M.; Fisher, Joshua B.; Biondi, Franco; Koch, George; Litvak, Marcy; Ogle, Kiona; Shaw, John D.; Wolf, Adam; Huntzinger, Deborah N.; Schaefer, Kevin; Cook, Robert; Wei, Yaxing; Fang, Yuanyuan; Hayes, Daniel; Huang, Maoyi; Jain, Atul; Tian, Hanqin
2017-08-01
Drought, a recurring phenomenon with major impacts on both human and natural systems, is the most widespread climatic extreme that negatively affects the land carbon sink. Although twentieth-century trends in drought regimes are ambiguous, across many regions more frequent and severe droughts are expected in the twenty-first century. Recovery time—how long an ecosystem requires to revert to its pre-drought functional state—is a critical metric of drought impact. Yet the factors influencing drought recovery and its spatiotemporal patterns at the global scale are largely unknown. Here we analyse three independent datasets of gross primary productivity and show that, across diverse ecosystems, drought recovery times are strongly associated with climate and carbon cycle dynamics, with biodiversity and CO2 fertilization as secondary factors. Our analysis also provides two key insights into the spatiotemporal patterns of drought recovery time: first, that recovery is longest in the tropics and high northern latitudes (both vulnerable areas of Earth’s climate system) and second, that drought impacts (assessed using the area of ecosystems actively recovering and time to recovery) have increased over the twentieth century. If droughts become more frequent, as expected, the time between droughts may become shorter than drought recovery time, leading to permanently damaged ecosystems and widespread degradation of the land carbon sink.
NASA Astrophysics Data System (ADS)
Lacour, D.
2018-02-01
The expected increase of the particle flux at the high-luminosity phase of the LHC (HL-LHC), with instantaneous luminosities up to 7.5×10³⁴ cm⁻²s⁻¹, will have a severe impact on the ATLAS detector performance. The pile-up is expected to increase on average to 200 interactions per bunch crossing. The reconstruction performance for electrons, photons as well as jets and missing transverse energy will be severely degraded in the end-cap and forward region. A High Granularity Timing Detector (HGTD) is proposed in front of the liquid argon end-cap and forward calorimeters for pile-up mitigation. This device should cover the pseudo-rapidity range of 2.4 to about 4.0. Low Gain Avalanche Detector (LGAD) technology has been chosen as it provides an internal gain sufficient to reach the large signal-to-noise ratio needed for excellent time resolution. The requirements and overall specifications of the High Granularity Timing Detector at the HL-LHC will be presented, as well as the conceptual design of its mechanics and electronics. Beam test results and measurements of irradiated LGAD silicon sensors, such as gain and timing resolution, will be shown.
Operation and performance of the Ciba-Corning 512 coagulation monitor during parabolic flight
NASA Technical Reports Server (NTRS)
Gocke, Robyn; Lloyd, Charles W.; Greenthaner, Nancy K.
1991-01-01
The goal was to assess the functionality and evaluate the procedures and operations required to operate the Ciba-Corning 512 Coagulation Monitor during parabolic flight. This monitor determines the clotting characteristics of blood. The analyzer operates by laser detection of the cessation of blood flow in a capillary channel within a test cartridge. Test simulator results were excellent for both pre- and post-flight. In-flight results were not obtained due to the warm-up time required for the simulator. Since this is an electronic function only, the expected results on the simulator would be the same in zero-g.
Using neural networks to represent potential surfaces as sums of products.
Manzhos, Sergei; Carrington, Tucker
2006-11-21
By using exponential activation functions with a neural network (NN) method we show that it is possible to fit potentials to a sum-of-products form. The sum-of-products form is desirable because it reduces the cost of doing the quadratures required for quantum dynamics calculations. It also greatly facilitates the use of the multiconfiguration time dependent Hartree method. Unlike the potfit product representation algorithm, the new NN approach does not require using a grid of points. It also produces sum-of-products potentials with fewer terms. As the number of dimensions is increased, we expect the advantages of the exponential NN idea to become more significant.
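The key identity behind the approach can be sketched in a few lines: a hidden unit with an exponential activation factorizes into a product of one-dimensional functions, so a one-hidden-layer network is automatically a sum of products. The network below uses random weights purely for illustration; it is not a fitted potential and not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
D, N = 3, 5                      # dimensions, hidden neurons
W = rng.normal(size=(N, D))      # hidden-layer weights
b = rng.normal(size=N)           # hidden-layer biases
c = rng.normal(size=N)           # output weights

x = rng.normal(size=D)

# Standard evaluation of a one-hidden-layer net with exp activations
v_net = np.sum(c * np.exp(W @ x + b))

# Same value written as a sum of products of one-dimensional factors:
#   exp(sum_d W[n, d] * x[d] + b[n]) = exp(b[n]) * prod_d exp(W[n, d] * x[d])
v_sop = sum(c[n] * np.exp(b[n]) *
            np.prod([np.exp(W[n, d] * x[d]) for d in range(D)])
            for n in range(N))

assert np.isclose(v_net, v_sop)
```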
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aragon, Kathryn M.; Eaton, Shelley M.; McCornack, Marjorie Turner
When a requirements engineering effort fails to meet expectations, often the requirements management tool is blamed. Working with numerous project teams at Sandia National Laboratories over the last fifteen years has shown us that the tool is rarely the culprit; usually it is the lack of a viable information architecture with well-designed processes to support requirements engineering. This document illustrates design concepts with rationale, as well as a proven information architecture to structure and manage information in support of requirements engineering activities for any size or type of project. This generalized information architecture is specific to IBM's Rational DOORS (Dynamic Object Oriented Requirements System) software application, which is the requirements management tool in Sandia's CEE (Common Engineering Environment). This generalized information architecture can be used as presented or as a foundation for designing a tailored information architecture for project-specific needs. It may also be tailored for another software tool. Version 1.0 4 November 201
A Search for Transits of Proxima b in MOST Photometry
NASA Astrophysics Data System (ADS)
Kipping, David M.
2017-01-01
The recent discovery of a potentially rocky planet in the habitable-zone of our nearest star presents exciting prospects for future detailed characterization of another world. If Proxima b transits its star, the road to characterization would be considerably eased. In 2014 and 2015, we monitored Proxima Centauri with the Canadian space telescope MOST for a total of 43 days. As expected, the star presents considerable photometric variability due to flares, which greatly complicate our analysis. Using Gaussian process regression and Bayesian model selection with informative priors for the time of transit of Proxima b, we do find evidence for a transit of the expected depth. However, relaxing the prior on the transit time to an uninformative one returns a distinct solution highlighting the high false-positive rate induced by flaring. Using ground-based photometry from HATSouth, we show that our candidate transit is unlikely to be genuine although a conclusive answer will likely require infrared photometry, such as that from Spitzer, where flaring should be suppressed.
Gemini primary mirror in situ wash
NASA Astrophysics Data System (ADS)
Vucina, Tomislav; Boccas, Maxime; Araya, Claudio; Ah Hee, Clayton; Cavedoni, Chas
2008-07-01
The Gemini twins were the first large modern telescopes to receive protected silver coatings on their mirrors in 2004. The low emissivity requirement is fundamental for the IR optimization. In the mid-IR a factor of two reduction in telescope emissivity is equivalent to increasing the collecting area by the same factor. Our emissivity maintenance requirement is very stringent: 0.5% maximum degradation during operations, at any single wavelength beyond 2.2 μm. We developed a very rigorous standard to wash the primary mirrors in the telescope without science down time. The in-situ washes are made regularly, and the reflectivity and emissivity gains are significant. The coating lifetime has been extended far more than our original expectations. In this report we describe the in-situ process and hardware, explain our maintenance plan, and show results of the coating performance over time.
Modeling thermoelastic distortion of optics using elastodynamic reciprocity
NASA Astrophysics Data System (ADS)
King, Eleanor; Levin, Yuri; Ottaway, David; Veitch, Peter
2015-07-01
Thermoelastic distortion resulting from optical absorption by transmissive and reflective optics can cause unacceptable changes in optical systems that employ high-power beams. In advanced-generation laser-interferometric gravitational wave detectors, for example, optical absorption is expected to result in wavefront distortions that would compromise the sensitivity of the detector, thus necessitating the use of adaptive thermal compensation. Unfortunately, these systems have long thermal time constants, and so predictive feed-forward control systems could be required, but the finite-element analysis is computationally expensive. We describe here the use of the Betti-Maxwell elastodynamic reciprocity theorem to calculate the response of linear elastic bodies (optics) to heating that has arbitrary spatial distribution. We demonstrate, using a simple example, that it can yield accurate results in computational times that are significantly less than those required for finite-element analyses.
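For context, the classical Betti reciprocal theorem that this family of methods builds on relates two independent loading states of the same elastic body. A general static statement (not the specific thermoelastic formulation used in the paper) is:

```latex
\int_{S} t^{(1)}_i\, u^{(2)}_i \, dS + \int_{V} f^{(1)}_i\, u^{(2)}_i \, dV
  \;=\;
\int_{S} t^{(2)}_i\, u^{(1)}_i \, dS + \int_{V} f^{(2)}_i\, u^{(1)}_i \, dV
```

where \(u^{(k)}\), \(t^{(k)}\), and \(f^{(k)}\) are the displacements, surface tractions, and body forces of state \(k\). In practice this lets the response to an arbitrary heating distribution be expressed through a small set of precomputed auxiliary solutions rather than a full finite-element run per case.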
Recall of Others' Actions after Incidental Encoding Reveals Episodic-like Memory in Dogs.
Fugazza, Claudia; Pogány, Ákos; Miklósi, Ádám
2016-12-05
The existence of episodic memory in non-human animals is a debated topic that has been investigated using different methodologies that reflect diverse theoretical approaches to its definition. A fundamental feature of episodic memory is recalling after incidental encoding, which can be assessed if the recall test is unexpected [1]. We used a modified version of the "Do as I Do" method [2], relying on dogs' ability to imitate human actions, to test whether dogs can rely on episodic memory when recalling others' actions from the past. Dogs were first trained to imitate human actions on command. Next, they were trained to perform a simple training exercise (lying down), irrespective of the previously demonstrated action. This way, we substituted their expectation to be required to imitate with the expectation to be required to lie down. We then tested whether dogs recalled the demonstrated actions by unexpectedly giving them the command to imitate, instead of lying down. Dogs were tested with a short (1 min) and a long (1 hr) retention interval. They were able to recall the demonstrated actions after both intervals; however, their performance declined more with time compared to conditions in which imitation was expected. These findings show that dogs recall past events as complex as human actions even if they do not expect the memory test, providing evidence for episodic-like memory. Dogs offer an ideal model to study episodic memory in non-human species, and this methodological approach allows investigating memory of complex, context-rich events. VIDEO ABSTRACT. Copyright © 2016 Elsevier Ltd. All rights reserved.
How long do centenarians survive? Life expectancy and maximum lifespan.
Modig, K; Andersson, T; Vaupel, J; Rau, R; Ahlbom, A
2017-08-01
The purpose of this study was to explore the pattern of mortality above the age of 100 years. In particular, we aimed to examine whether Scandinavian data support the theory that mortality reaches a plateau at particularly old ages. Whether the maximum length of life increases with time was also investigated. The analyses were based on individual level data on all Swedish and Danish centenarians born from 1870 to 1901; in total 3006 men and 10 963 women were included. Birth cohort-specific probabilities of dying were calculated. Exact ages were used for calculations of maximum length of life. Whether maximum age changed over time was analysed taking into account increases in cohort size. The results confirm that there has not been any improvement in mortality amongst centenarians in the past 30 years and that the current rise in life expectancy is driven by reductions in mortality below the age of 100 years. The death risks seem to reach a plateau of around 50% at the age of 103 years for men and 107 years for women. Despite the rising life expectancy, the maximum age does not appear to increase, in particular after accounting for the increasing number of individuals of advanced age. Mortality amongst centenarians is not changing despite improvements at younger ages. An extension of the maximum lifespan and a sizeable extension of life expectancy both require reductions in mortality above the age of 100 years. © 2017 The Association for the Publication of the Journal of Internal Medicine.
Forde, C G; van Kuijk, N; Thaler, T; de Graaf, C; Martin, N
2013-01-01
The modern food supply is often dominated by a large variety of energy dense, softly textured foods that can be eaten quickly. Previous studies suggest that particular oral processing characteristics such as large bite size and lack of chewing activity contribute to the low satiating efficiency of these foods. To better design meals that promote greater feelings of satiation, we need an accurate picture of the oral processing characteristics of a range of solid food items that could be used to replace softer textures during a normal hot meal. The primary aim of this study was to establish an accurate picture of the oral processing characteristics of a set of solid savoury meal components. The secondary aim was to determine the associations between oral processing characteristics, food composition, sensory properties, and expected satiation. In a within subjects design, 15 subjects consumed 50 g of 35 different savoury food items over 5 sessions. The 35 foods represented various staples, vegetables and protein-rich foods such as meat and fish. Subjects were video-recorded during consumption and measures included observed number of bites, number of chews, number of swallows and derived measures such as chewing rate, eating rate, bite size, and oral exposure time. Subjects rated expected satiation for a standard 200 g portion of each food using a 100 mm scale, and the sensory differences between foods were quantified using descriptive analysis with a trained sensory panel. Statistical analysis focussed on the oral processing characteristics and associations between nutritional, sensory and expected satiation parameters of each food. Average number of chews for 50 g of food varied from 27 for mashed potatoes to 488 for tortilla chips. Oral exposure time was highly correlated with the total number of chews, and varied from 27 s for canned tomatoes to 350 s for tortilla chips. Chewing rate was relatively constant with an overall average chewing rate of approximately 1 chew/s. Differences in oral processing were not correlated with any macronutrients specifically. Expected satiation was positively related to protein and the sensory attributes chewiness and saltiness. Foods that were consumed in smaller bites were chewed more and for longer, and were expected to impart higher satiation. This study shows a large and reliable variation in oral exposure time, number of required chews before swallowing and expected satiation across a wide variety of foods. We conclude that bite size and oral-sensory exposure time could contribute to higher satiation within a meal for equal calories. Copyright © 2012 Elsevier Ltd. All rights reserved.
SKOOG, GARY R.; CIECKA, JAMES E.
2010-01-01
Retirement-related concepts are treated as random variables within Markov process models that capture multiple labor force entries and exits. The expected number of years spent outside of the labor force, expected years in retirement, and expected age at retirement are computed—all of which are of immense policy interest but have been heretofore reported with less precisely measured proxies. Expected age at retirement varies directly with a person’s age; but even younger people can expect to retire at ages substantially older than those commonly associated with retirement, such as age 60, 62, or 65. Between 1970 and 2003, men allocated most of their increase in life expectancy to increased time in retirement, but women allocated most of their increased life expectancy to labor force activity. Although people can exit and reenter the labor force at older ages, most 65-year-old men who are active in the labor force will not reenter after they eventually exit. At age 65, the probability that those who are inactive will reenter the labor force at some future time is .38 for men and .27 for women. Life expectancy at exact ages is decomposed into the sum of the expected time spent active and inactive in the labor force, and also as the sum of the expected time to labor force separation and time in retirement. PMID:20879680
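A minimal sketch of the type of computation involved: for an absorbing Markov chain with transient states (say, active and inactive in the labor force) and an absorbing state (death), the expected number of periods spent in each transient state follows from the fundamental matrix. The states and transition probabilities below are invented for illustration, not the rates estimated in the paper:

```python
import numpy as np

# Transient states: 0 = active in labor force, 1 = inactive; absorbing state: dead.
# Q[i, j] = one-year probability of moving from transient state i to state j.
Q = np.array([[0.85, 0.12],
              [0.20, 0.75]])      # remaining probability mass goes to death

# Fundamental matrix N = (I - Q)^-1; N[i, j] = expected years spent in state j
# before absorption, starting from state i.
N = np.linalg.inv(np.eye(2) - Q)

start_active = 0
expected_active_years, expected_inactive_years = N[start_active]
print(expected_active_years, expected_inactive_years)
```

Quantities such as expected years in retirement or expected age at retirement come from the same machinery, applied to age-specific transition matrices rather than the single stationary matrix used here.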
Changing life expectancy in central Europe: is there a single reason?
Chenet, L; McKee, M; Fulop, N; Bojan, F; Brand, H; Hort, A; Kalbarczyk, P
1996-09-01
During the 1980s, at a time when life expectancy at birth in western Europe increased by 2.5 years, it stagnated or, for some groups, declined in the former socialist countries of central and eastern Europe. A study was carried out to ascertain the contribution of deaths at different age groups and from different causes to changes in life expectancy at birth in Czechoslovakia, Hungary and Poland between 1979 and 1990. Improvements in infant mortality have been counteracted by deteriorating death rates among young and middle-aged people, with the deterioration commencing as young as late childhood in Hungary but in the thirties or forties in Czechoslovakia and Poland. The leading contributors to this deterioration are cancer and circulatory disease but, in Hungary, cirrhosis and accidents have also been of great importance. The patterns observed in each country differ in the age groups affected and the causes of death. Further work is required to explain these differences.
Cognitive task analysis-based design and authoring software for simulation training.
Munro, Allen; Clark, Richard E
2013-10-01
The development of more effective medical simulators requires a collaborative team effort where three kinds of expertise are carefully coordinated: (1) exceptional medical expertise focused on providing complete and accurate information about the medical challenges (i.e., critical skills and knowledge) to be simulated; (2) instructional expertise focused on the design of simulation-based training and assessment methods that produce maximum learning and transfer to patient care; and (3) software development expertise that permits the efficient design and development of the software required to capture expertise, present it in an engaging way, and assess student interactions with the simulator. In this discussion, we describe a method of capturing more complete and accurate medical information for simulators and combine it with new instructional design strategies that emphasize the learning of complex knowledge. Finally, we describe three different types of software support (Development/Authoring, Run Time, and Post Run Time) required at different stages in the development of medical simulations and the instructional design elements of the software required at each stage. We describe the contributions expected of each kind of software and the different instructional control authoring support required. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Combining coordination of motion actuators with driver steering interaction.
Tagesson, Kristoffer; Laine, Leo; Jacobson, Bengt
2015-01-01
A new method is suggested for coordination of vehicle motion actuators, in which driver feedback and capabilities become natural elements in the prioritization. The method uses a weighted least squares control allocation formulation, where driver characteristics can be added as virtual force constraints. The approach is particularly suitable for heavy commercial vehicles, which are generally over-actuated. The method is applied, in a specific use case, by running a simulation of a truck applying automatic braking on a split-friction surface. Here the required driver steering angle, to maintain the intended direction, is limited by a constant threshold. This constant is automatically accounted for when balancing actuator usage in the method. Simulation results show that the actual required driver steering angle can be expected to match the set constant well. Furthermore, the stopping distance is very much affected by this set capability of the driver to handle the lateral disturbance, as expected. In general the capability of the driver to handle disturbances should be estimated in real-time, considering driver mental state. Using the method, it is then possible to estimate, for example, the stopping distance implied by this capability. When the driver is estimated to be active, the setup even has the potential to shorten the stopping distance compared to currently available systems. The approach is feasible for real-time applications and requires only measurable vehicle quantities for parameterization. Examples of other suitable applications within the scope of the method would be electronic stability control, lateral stability control at launch and optimal cornering arbitration.
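A minimal sketch of a weighted least-squares control allocation step of the general kind described above; the effectiveness matrix, weights, and actuator limits are invented for illustration, and the paper's driver-related virtual force constraints are only indicated by a comment, not reproduced:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Desired virtual controls (illustrative): total braking force [N] and yaw moment [N·m]
v = np.array([-40e3, 5e3])

# Effectiveness matrix: columns are actuators (e.g., brake force per wheel end),
# rows are the virtual controls they produce, so v ≈ B @ u.
B = np.array([[1.0,  1.0,  1.0,  1.0],
              [0.9, -0.9,  0.8, -0.8]])

Wv = np.diag([1.0, 1.0])               # weight on meeting the virtual control demand
Wu = np.diag([1.0, 1.0, 1.0, 1.0])     # weight on actuator usage
gamma = 1e-3                           # usage penalty is secondary to demand tracking

# Weighted least squares: minimize ||Wv (B u - v)||^2 + gamma ||Wu u||^2
A = np.vstack([Wv @ B, np.sqrt(gamma) * Wu])
b = np.concatenate([Wv @ v, np.zeros(4)])

# Actuator limits (e.g., friction limits per wheel). A driver-related constraint,
# such as a bound on the steering effort implied by the allocation, could be added
# as extra rows of A and b in the same stacked form.
res = lsq_linear(A, b, bounds=(-15e3, 0.0))
print(res.x)   # allocated brake force per actuator
```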
Mother-caregiver expectations for function among survivors of childhood brain tumors
Barakat, Lamia P.; Ulrich, Connie M.; Jones, Nora L.; Deatrick, Janet A.
2015-01-01
Purpose Children diagnosed with brain tumors increasingly survive to adulthood, although they do so with needs often requiring continued parental caregiving. We sought to describe the nature of caregivers’ expectations about survivors’ function and how expectations connect to ongoing management and decision-making. Methods Forty-five qualitative interviews with mother-caregivers were conducted and coded for themes related to expectations for their adolescent/young adult children living post-childhood brain tumors. Results Five main themes emerged as integral to mother-caregiver expectations: realizing a difference in the survivor, noticing limitations to independence in the survivor, memories of learning about clinical prognoses as understood from consent meetings and education, managing these realizations, and acknowledging unresolved challenges. Conclusions Caregiver expectations are influenced by both initial clinical interactions and contemporary family dynamics and require individual- and family-specific survivorship planning. As caregiver expectations can influence management behaviors that impact outcomes and possibly independence, implications for clinician-caregiver shared decision-making are substantial. PMID:26556212
Henriksen, Niel M.; Roe, Daniel R.; Cheatham, Thomas E.
2013-01-01
Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 microseconds of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations. PMID:23477537
Henriksen, Niel M; Roe, Daniel R; Cheatham, Thomas E
2013-04-18
Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example, by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 μs of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations.
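For reference, a minimal sketch of the temperature replica exchange (parallel tempering) swap criterion that both the traditional and reservoir variants build on; the energies, temperatures, and units are arbitrary illustrative choices, not values from these simulations:

```python
import numpy as np

rng = np.random.default_rng(2)
KB = 0.0019872041   # Boltzmann constant in kcal/(mol*K), assumed MD-style units

def attempt_swap(E_i, T_i, E_j, T_j):
    """Metropolis criterion for exchanging configurations between replicas i and j."""
    beta_i, beta_j = 1.0 / (KB * T_i), 1.0 / (KB * T_j)
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or rng.random() < np.exp(delta)

# Example: neighboring replicas at 300 K and 320 K with instantaneous energies
print(attempt_swap(E_i=-105.0, T_i=300.0, E_j=-98.0, T_j=320.0))
```

The reservoir variant discussed in the abstract replaces one end of the temperature ladder with exchanges against a pre-generated reservoir of configurations, which is what shortens the time to convergence.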
Modeling operators' emergency response time for chemical processing operations.
Murray, Susan L; Harputlu, Emrah; Mentzer, Ray A; Mannan, M Sam
2014-01-01
Operators have a crucial role during emergencies at a variety of facilities such as chemical processing plants. When an abnormality occurs in the production process, the operator often has limited time to either take corrective actions or evacuate before the situation becomes deadly. It is crucial that system designers and safety professionals can estimate the time required for a response before procedures and facilities are designed and operations are initiated. There are existing industrial engineering techniques to establish time standards for tasks performed at a normal working pace. However, it is reasonable to expect the time required to take action in emergency situations will be different than working at a normal production pace. It is possible that in an emergency, operators will act faster compared to a normal pace. It would be useful for system designers to be able to establish a time range for operators' response times for emergency situations. This article develops a modeling approach to estimate the time standard range for operators taking corrective actions or following evacuation procedures in emergency situations. This will aid engineers and managers in establishing time requirements for operators in emergency situations. The methodology used for this study combines a well-established industrial engineering technique for determining time requirements (predetermined time standard system) and adjustment coefficients for emergency situations developed by the authors. Numerous videos of workers performing well-established tasks at a maximum pace were studied. As an example, one of the tasks analyzed was pit crew workers changing tires as quickly as they could during a race. The operations in these videos were decomposed into basic, fundamental motions (such as walking, reaching for a tool, and bending over) by studying the videos frame by frame. A comparison analysis was then performed between the emergency pace and the normal working pace operations to determine performance coefficients. These coefficients represent the decrease in time required for various basic motions in emergency situations and were used to model an emergency response. This approach will make hazardous operations requiring operator response, alarm management, and evacuation processes easier to design and predict. An application of this methodology is included in the article. The time required for an emergency response was roughly one-third shorter than a normal response time.
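A minimal sketch of the kind of calculation described, in which a task is decomposed into basic motions with standard (normal-pace) times and each motion is scaled by an emergency-pace coefficient; the motions, times, and coefficients below are invented for illustration, not the values derived in the article:

```python
# (motion, normal-pace time in seconds, emergency coefficient < 1.0)
motions = [
    ("walk 5 m to valve",      4.2, 0.70),
    ("reach and grasp handle", 0.8, 0.85),
    ("turn valve (6 turns)",   6.5, 0.75),
    ("walk 10 m to exit",      8.0, 0.65),
]

normal_time = sum(t for _, t, _ in motions)
emergency_time = sum(t * c for _, t, c in motions)

print(f"normal pace:    {normal_time:.1f} s")
print(f"emergency pace: {emergency_time:.1f} s "
      f"({100 * (1 - emergency_time / normal_time):.0f}% faster)")
```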
Molecular processes in a high temperature shock layer
NASA Technical Reports Server (NTRS)
Guberman, S. L.
1984-01-01
Models of the shock layer encountered by an Aeroassisted Orbital Transfer Vehicle require as input accurate cross sections and rate constants for the atomic and molecular processes that characterize the shock radiation. From the estimated atomic and molecular densities in the shock layer and the expected residence time of about 1 ms, it can be expected that electron-ion collision processes will be important in the shock model. Electron capture by molecular ions followed by dissociation, e.g., O2(+) + e(-) yields O + O, can be expected to be of major importance since these processes are known to have high rates (e.g., 10^-7 cm^3/s) at room temperature. However, there have been no experimental measurements of dissociative recombination (DR) at the temperatures (~12,000 K) that are expected to characterize the shock layer. Indeed, even at room temperature, it is often difficult to perform experiments that determine the dependence of the translational energy and quantum yields of the product atoms on the electronic and vibrational state of the reactant molecular ions. Presented are ab initio quantum chemical studies of DR for molecular ions that are likely to be important in the atmospheric shock layer.
IT solutions for privacy protection in biobanking.
Eder, J; Gottweis, H; Zatloukal, K
2012-01-01
Biobanks containing human biological samples and associated data are key resources for the advancement of medical research. Efficient access to samples and data increases competitiveness in medical research, reduces effort and time for achieving scientific results and promotes scientific progress. In order to address upcoming health challenges, there is increasing need for transnational collaboration. This requires innovative solutions improving interoperability of biobanks in fields such as sample and data management as well as governance, including ethical and legal frameworks. In this context, the rights and expectations of donors to determine the usage of their biological material and data and to ensure their privacy have to be observed. We discuss the benefits of biobanks, the needs of medical research, and societal demands and regulations, in particular securing the rights of donors, and we present IT solutions that both maintain the security of personal data and increase the efficiency of access to data in biobanks. Disclosure filters are discussed as a strategy to combine European public expectations concerning informed consent with the requirements of biobank research. Copyright © 2012 S. Karger AG, Basel.
The Phase-2 electronics upgrade of the ATLAS liquid argon calorimeter system
NASA Astrophysics Data System (ADS)
Vachon, B.
2018-03-01
The LHC high-luminosity upgrade in 2024-2026 requires the associated detectors to operate at luminosities about 5-7 times larger than assumed in their original design. The pile-up is expected to increase to up to 200 events per proton bunch-crossing. The current readout of the ATLAS liquid argon calorimeters does not provide sufficient buffering and bandwidth capabilities to accommodate the hardware trigger requirements imposed by these harsh conditions. Furthermore, the expected total radiation doses are beyond the qualification range of the current front-end electronics. For these reasons an almost complete replacement of the front-end and off-detector readout system is foreseen for the 182,468 readout channels. The new readout system will be based on a free-running architecture, where calorimeter signals are amplified, shaped and digitized by on-detector electronics, then sent at 40 MHz to the off-detector electronics for further processing. Results from the design studies on the performance of the components of the readout system are presented, as well as the results of the tests of the first prototypes.
Kim, Sunwook; Nussbaum, Maury A; Mokhlespour Esfahani, Mohammad Iman; Alemi, Mohammad Mehdi; Alabdulkarim, Saad; Rashedi, Ehsan
2018-03-07
Use of exoskeletal vests (designed to support overhead work) can be an effective intervention approach for tasks involving arm elevation, yet little is known on the potential beneficial impacts of their use on physical demands and task performance. This laboratory study (n = 12) evaluated the effects of a prototype exoskeletal vest during simulated repetitive overhead drilling and light assembly tasks. Anticipated or expected benefits were assessed, in terms of perceived discomfort, shoulder muscle activity, and task performance. Using the exoskeletal vest did not substantially influence perceived discomfort, but did decrease normalized shoulder muscle activity levels (e.g., ≤ 45% reduction in peak activity). Drilling task completion time decreased by nearly 20% with the vest, but the number of errors increased. Overall, exoskeletal vest use has the potential to be a new intervention for work requiring arm elevation; however, additional investigations are needed regarding potential unexpected or adverse influences (see Part II). Copyright © 2018 Elsevier Ltd. All rights reserved.
Quality of life and patients' expectations in soft tissue sarcoma.
Jones, Robin L; Cesne, Axel Le
2018-05-01
Assessment of health-related quality of life (HRQoL) is essential for holistic care. Greater efforts are required to incorporate HRQoL measures into clinical trials and daily practice. Considerable HRQoL data are available for localized soft tissue sarcomas (STS), particularly in the orthopedic setting. In future, HRQoL is expected to become increasingly important in the evaluation of palliative therapy in advanced STS. A patient-centric approach is advocated for STS management. Greater awareness of STS by nonspecialist clinicians, and timely referral to specialized sarcoma reference centers, is crucial for patient welfare. The patient is central to shared decision-making during consultations and during case review in tumor boards. The management approach to STS should be collaborative, involving a multidisciplinary team, multiple centers and patient advocacy groups.
Managing laboratory automation in a changing pharmaceutical industry
Rutherford, Michael L.
1995-01-01
The health care reform movement in the USA and increased requirements by regulatory agencies continue to have a major impact on the pharmaceutical industry and the laboratory. Laboratory management is expected to improve efficiency by providing more analytical results at a lower cost, increasing customer service, and reducing cycle time, while ensuring accurate results and more effective use of their staff. To achieve these expectations, many laboratories are using robotics and automated work stations. Establishing automated systems presents many challenges for laboratory management, including project and hardware selection, budget justification, implementation, validation, training, and support. To address these management challenges, the rationale for project selection and implementation, the obstacles encountered, project outcome, and learning points for several automated systems recently implemented in the Quality Control Laboratories at Eli Lilly are presented. PMID:18925014
Tests of Flammability of Cotton Fabrics and Expected Skin Burns in Microgravity
NASA Technical Reports Server (NTRS)
Cavanagh, Jane M.; Torvi, David A.; Gabriel, Kamiel S.; Ruff, Gary A.
2004-01-01
During a shuttle launch and other portions of space flight, astronauts wear specialized flame resistant clothing. However during most of their missions on board the Space Shuttle or International Space Station, astronauts wear ordinary clothing, such as cotton shirts and pants. As the behaviour of flames is considerably different in microgravity than under earth's gravity, fabrics are expected to burn in a different fashion in microgravity than when tested on earth. There is interest in determining how this change in burning behaviour may affect times to second and third degree burn of human skin, and how the results of standard fabric flammability tests conducted under earth's gravity correlate with the expected fire behaviour of textiles in microgravity. A new experimental apparatus was developed to fit into the Spacecraft Fire Safety Facility (SFSF), which is used on NASA's KC-135 low gravity aircraft. The new apparatus was designed to be similar to the apparatus used in standard vertical flammability tests of fabrics. However, rather than using a laboratory burner, the apparatus uses a hot wire system to ignite 200 mm high by 80 mm wide fabric specimens. Fabric temperatures are measured using thermocouples and/or an infrared imaging system, while flame spread rates are measured using real time observations or video. Heat flux gauges are placed between 7 and 13 mm away from the fabric specimen, so that heat fluxes from the burning fabric to the skin can be estimated, along with predicted times required to produce skin burns.
Poliovirus vaccination during the endgame: insights from integrated modeling.
Duintjer Tebbens, Radboud J; Thompson, Kimberly M
2017-06-01
Managing the polio endgame requires access to sufficient quantities of poliovirus vaccines. After oral poliovirus vaccine (OPV) cessation, outbreaks may occur that require outbreak response using monovalent OPV (mOPV) and/or inactivated poliovirus vaccine. Areas covered: We review the experience and challenges with managing vaccine supplies in the context of the polio endgame. Building on models that explored polio endgame risks and the potential mOPV needs to stop outbreaks from live poliovirus reintroductions, we conceptually explore the potential demands for finished and bulk mOPV doses from a stockpile in the context of limited shelf-life of finished vaccine and time delays to convert bulk to finished vaccine. Our analysis suggests that the required size of the mOPV stockpile varies by serotype, with the highest expected needs for serotype 1 mOPV. Based on realizations of poliovirus risks after OPV cessation, the stockpile required to eliminate the chance of a stock-out appears considerably larger than the currently planned mOPV stockpiles. Expert commentary: The total required stockpile size depends on the acceptable probability of a stock-out, and increases with longer times to finish bulk doses and shorter shelf-lives of finished doses. Successful polio endgame management will require careful attention to poliovirus vaccine supplies.
NASA Technical Reports Server (NTRS)
Boriakoff, Valentin
1994-01-01
The goal of this project was the feasibility study of a particular architecture of a digital signal processing machine operating in real time which could do in a pipeline fashion the computation of the fast Fourier transform (FFT) of a time-domain sampled complex digital data stream. The particular architecture makes use of simple identical processors (called inner product processors) in a linear organization called a systolic array. Through computer simulation the new architecture to compute the FFT with systolic arrays was proved to be viable, and computed the FFT correctly and with the predicted particulars of operation. Integrated circuits to compute the operations expected of the vital node of the systolic architecture were proven feasible, and even with a 2-micron VLSI technology they can execute the required operations in the required time. Actual construction of the integrated circuits was successful in one variant (fixed point) and unsuccessful in the other (floating point).
Estimating Nitrogen Load Resulting from Biofuel Mandates
Alshawaf, Mohammad; Douglas, Ellen; Ricciardi, Karen
2016-01-01
The Energy Policy Act of 2005 and the Energy Independence and Security Act (EISA) of 2007 were enacted to reduce the U.S. dependency on foreign oil by increasing the use of biofuels. The increased demand for biofuels from corn and soybeans could result in an increase of nitrogen flux if not managed properly. The objectives of this study are to estimate nitrogen flux from energy crop production and to identify the catchment areas with high nitrogen flux. The results show that biofuel production can result in an increase of nitrogen flux to the northern Gulf of Mexico from 270 to 1742 thousand metric tons. Using all cellulosic (hay) ethanol or biodiesel to meet the 2022 mandate is expected to reduce nitrogen flux; however, it requires approximately 25% more land when compared to other scenarios. Producing ethanol from switchgrass rather than hay results in three-times more nitrogen flux, but requires 43% less land. Using corn ethanol for 2022 mandates is expected to have double the nitrogen flux when compared to the EISA-specified 2022 scenario; however, it will require less land area. Shifting the U.S. energy supply from foreign oil to the Midwest cannot occur without economic and environmental impacts, which could potentially lead to more eutrophication and hypoxia. PMID:27171101
Estimating Nitrogen Load Resulting from Biofuel Mandates.
Alshawaf, Mohammad; Douglas, Ellen; Ricciardi, Karen
2016-05-09
The Energy Policy Act of 2005 and the Energy Independence and Security Act (EISA) of 2007 were enacted to reduce the U.S. dependency on foreign oil by increasing the use of biofuels. The increased demand for biofuels from corn and soybeans could result in an increase of nitrogen flux if not managed properly. The objectives of this study are to estimate nitrogen flux from energy crop production and to identify the catchment areas with high nitrogen flux. The results show that biofuel production can result in an increase of nitrogen flux to the northern Gulf of Mexico from 270 to 1742 thousand metric tons. Using all cellulosic (hay) ethanol or biodiesel to meet the 2022 mandate is expected to reduce nitrogen flux; however, it requires approximately 25% more land when compared to other scenarios. Producing ethanol from switchgrass rather than hay results in three-times more nitrogen flux, but requires 43% less land. Using corn ethanol for 2022 mandates is expected to have double the nitrogen flux when compared to the EISA-specified 2022 scenario; however, it will require less land area. Shifting the U.S. energy supply from foreign oil to the Midwest cannot occur without economic and environmental impacts, which could potentially lead to more eutrophication and hypoxia.
Optimal dynamic remapping of parallel computations
NASA Technical Reports Server (NTRS)
Nicol, David M.; Reynolds, Paul F., Jr.
1987-01-01
A large class of computations are characterized by a sequence of phases, with phase changes occurring unpredictably. The decision problem was considered regarding the remapping of workload to processors in a parallel computation when the utility of remapping and the future behavior of the workload are uncertain, and phases exhibit stable execution requirements during a given phase, but requirements may change radically between phases. For these problems a workload assignment generated for one phase may hinder performance during the next phase. This problem is treated formally for a probabilistic model of computation with at most two phases. The fundamental problem of balancing the expected remapping performance gain against the delay cost was addressed. Stochastic dynamic programming is used to show that the remapping decision policy minimizing the expected running time of the computation has an extremely simple structure. Because the gain may not be predictable, the performance of a heuristic policy that does not require estimation of the gain is examined. The heuristic method's feasibility is demonstrated by its use on an adaptive fluid dynamics code on a multiprocessor. The results suggest that except in extreme cases, the remapping decision problem is essentially that of dynamically determining whether gain can be achieved by remapping after a phase change. The results also suggest that this heuristic is applicable to computations with more than two phases.
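A minimal sketch of the balance the abstract describes: remap only when the expected saving over the remaining work exceeds the one-time remapping delay. The cost model below is a toy stand-in, not the stochastic dynamic program analyzed in the paper:

```python
def should_remap(remaining_steps, step_time_current, step_time_after_remap,
                 remap_delay):
    """Remap if the expected gain over the rest of the phase outweighs the delay."""
    expected_gain = remaining_steps * (step_time_current - step_time_after_remap)
    return expected_gain > remap_delay

# After a phase change the current mapping runs each step slower than a fresh mapping would:
print(should_remap(remaining_steps=500, step_time_current=0.012,
                   step_time_after_remap=0.009, remap_delay=0.8))   # True: remap pays off
```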
Raths, David
2009-01-01
Focus the grant application on addressing content areas, such as chronic disease management, that have an essential IT component, rather than on the technology itself. Community partnerships are much more likely to win grant funding than individual organizations. Take the time to understand the goals, interests and requirements of the funding agencies. Each has its own expectations about measurable results and about ongoing support after grant funding runs out.
Training Family Medicine Residents to Perform Home Visits: A CERA Survey.
Sairenji, Tomoko; Wilson, Stephen A; D'Amico, Frank; Peterson, Lars E
2017-02-01
Home visits have been shown to improve quality of care, save money, and improve outcomes. Primary care physicians are in an ideal position to provide these visits; of note, the Accreditation Council for Graduate Medical Education no longer requires home visits as a component of family medicine residency training. To investigate changes in home visit numbers and expectations, attitudes, and approaches to training among family medicine residency program directors. This research used the Council of Academic Family Medicine Educational Research Alliance (CERA) national survey of family medicine program directors in 2015. Questions addressed home visit practices, teaching and evaluation methods, common types of patient and visit categories, and barriers. There were 252 responses from 455 possible respondents, representing a response rate of 55%. At most programs, residents performed 2 to 5 home visits by graduation in both 2014 (69% of programs, 174 of 252) and 2015 (68%, 172 of 252). The vast majority (68%, 172 of 252) of program directors expect less than one-third of their graduates to provide home visits after graduation. Scheduling difficulties, lack of faculty time, and lack of resident time were the top 3 barriers to residents performing home visits. There appeared to be no decline in resident-performed home visits in family medicine residencies 1 year after they were no longer required. Family medicine program directors may recognize the value of home visits despite a lack of few formal curricula.
Phloem water relations and translocation.
Kaufmann, M R; Kramer, P J
1967-02-01
Satisfactory measurements of phloem water potential of trees can be obtained with the Richards and Ogata psychrometer and the vapor equilibration techniques, although corrections for loss of dry weight and for heating by respiration are required for the vapor equilibrium values. The psychrometer technique is the more satisfactory of the 2 because it requires less time for equilibration, less tissue, and less handling of tissue. Phloem water potential of a yellow-poplar tree followed a diurnal pattern quite similar to that of leaves, except that the values were higher (less negative) and changed less than in the leaves.The psychrometer technique permits a different approach to the study of translocation in trees. Measurements of water potential of phloem discs followed by freezing of samples and determination of osmotic potential allows estimation of turgor pressure in various parts of trees as the difference between osmotic potential and total water potential. This technique was used in evaluating gradients in water potential, osmotic potential, and turgor pressure in red maple trees. The expected gradients in osmotic potential were observed in the phloem, osmotic potential of the cell sap increasing (sap becoming more dilute) down the trunk. However, values of water potential were such that a gradient in turgor pressure apparently did not exist at a time when rate of translocation was expected to be high. These results do not support the mass flow theory of translocation favored by many workers.
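The estimation step described above amounts to a subtraction once the two potentials are measured; a sketch with illustrative values in MPa, not the paper's red maple data:

```python
def turgor_pressure(water_potential, osmotic_potential):
    """Turgor pressure = total water potential - osmotic potential (both typically negative, MPa)."""
    return water_potential - osmotic_potential

print(turgor_pressure(-0.9, -1.5))   # 0.6 MPa of positive turgor
```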
Phloem Water Relations and Translocation 1
Kaufmann, Merrill R.; Kramer, Paul J.
1967-01-01
Satisfactory measurements of phloem water potential of trees can be obtained with the Richards and Ogata psychrometer and the vapor equilibration techniques, although corrections for loss of dry weight and for heating by respiration are required for the vapor equilibrium values. The psychrometer technique is the more satisfactory of the 2 because it requires less time for equilibration, less tissue, and less handling of tissue. Phloem water potential of a yellow-poplar tree followed a diurnal pattern quite similar to that of leaves, except that the values were higher (less negative) and changed less than in the leaves. The psychrometer technique permits a different approach to the study of translocation in trees. Measurements of water potential of phloem discs followed by freezing of samples and determination of osmotic potential allows estimation of turgor pressure in various parts of trees as the difference between osmotic potential and total water potential. This technique was used in evaluating gradients in water potential, osmotic potential, and turgor pressure in red maple trees. The expected gradients in osmotic potential were observed in the phloem, osmotic potential of the cell sap increasing (sap becoming more dilute) down the trunk. However, values of water potential were such that a gradient in turgor pressure apparently did not exist at a time when rate of translocation was expected to be high. These results do not support the mass flow theory of translocation favored by many workers. PMID:16656495
14 CFR 121.646 - En-route fuel supply: flag and supplemental operations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... for flight a turbine-engine powered airplane with more than two engines for a flight more than 90... supply requirements of § 121.333; and (iii) Considering expected wind and other weather conditions. (3..., considering wind and other weather conditions expected, it has the fuel otherwise required by this part and...
14 CFR 121.646 - En-route fuel supply: flag and supplemental operations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... for flight a turbine-engine powered airplane with more than two engines for a flight more than 90... supply requirements of § 121.333; and (iii) Considering expected wind and other weather conditions. (3..., considering wind and other weather conditions expected, it has the fuel otherwise required by this part and...
7 CFR 1753.38 - Procurement procedures.
Code of Federal Regulations, 2011 CFR
2011-01-01
... specifications and equipment requirements (required only for projects expected to exceed $500,000 or 25% of the... technology. (ii) The borrower shall review in detail all exceptions to the P&S. No exceptions will be... only for projects that are expected to exceed $500,000 or 25% of the loan, whichever is less), sealed...
7 CFR 1753.38 - Procurement procedures.
Code of Federal Regulations, 2014 CFR
2014-01-01
... specifications and equipment requirements (required only for projects expected to exceed $500,000 or 25% of the... technology. (ii) The borrower shall review in detail all exceptions to the P&S. No exceptions will be... only for projects that are expected to exceed $500,000 or 25% of the loan, whichever is less), sealed...
7 CFR 1753.38 - Procurement procedures.
Code of Federal Regulations, 2012 CFR
2012-01-01
... specifications and equipment requirements (required only for projects expected to exceed $500,000 or 25% of the... technology. (ii) The borrower shall review in detail all exceptions to the P&S. No exceptions will be... only for projects that are expected to exceed $500,000 or 25% of the loan, whichever is less), sealed...
7 CFR 1753.38 - Procurement procedures.
Code of Federal Regulations, 2013 CFR
2013-01-01
... specifications and equipment requirements (required only for projects expected to exceed $500,000 or 25% of the... technology. (ii) The borrower shall review in detail all exceptions to the P&S. No exceptions will be... only for projects that are expected to exceed $500,000 or 25% of the loan, whichever is less), sealed...
ERIC Educational Resources Information Center
Ernst, David J.; And Others
This paper examines five key trends impacting higher education administration: (1) traditional funding sources are flat or decreasing; (2) public expectations and state mandates are calling for more reporting requirements and accountability; (3) consumer expectations demand more sophisticated services requiring greater access to data; (4) evolving…
Inter and intra substation communications: Requirements and solutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adamiak, M.; Patterson, R.; Melcher, J.
1995-10-01
As the substation communication world searches for the "Promised LAN," it would be helpful to have a roadmap to give direction to the search. Many are the expectations of a LAN flowing with data and able to connect with any Intelligent Electronic Device (IED) that was ever made. Such expectations must be tempered with the cost and complexity of achieving them. This paper presents an outline of the communication requirements of the myriad of IEDs in existence in the substation today as well as the expectations of what second generation microprocessor based devices might be able to do. Requirements will focus on language issues, system capabilities, performance requirements, external interfaces, environmental, and quality issues. Some attention will also be given to the architecture of a solution and some guidelines as to how this structure might be built.
The limits of earthquake early warning: Timeliness of ground motion estimates
Minson, Sarah E.; Meier, Men-Andrin; Baltay, Annemarie S.; Hanks, Thomas C.; Cochran, Elizabeth S.
2018-01-01
The basic physics of earthquakes is such that strong ground motion cannot be expected from an earthquake unless the earthquake itself is very close or has grown to be very large. We use simple seismological relationships to calculate the minimum time that must elapse before such ground motion can be expected at a distance from the earthquake, assuming that the earthquake magnitude is not predictable. Earthquake early warning (EEW) systems are in operation or development for many regions around the world, with the goal of providing enough warning of incoming ground shaking to allow people and automated systems to take protective actions to mitigate losses. However, the question of how much warning time is physically possible for specified levels of ground motion has not been addressed. We consider a zero-latency EEW system to determine possible warning times a user could receive in an ideal case. In this case, the only limitation on warning time is the time required for the earthquake to evolve and the time for strong ground motion to arrive at a user’s location. We find that users who wish to be alerted at lower ground motion thresholds will receive more robust warnings with longer average warning times than users who receive warnings for higher ground motion thresholds. EEW systems have the greatest potential benefit for users willing to take action at relatively low ground motion thresholds, whereas users who set relatively high thresholds for taking action are less likely to receive timely and actionable information.
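A minimal sketch of the zero-latency timeliness bound described above: the alert cannot be issued before the earthquake has grown to the magnitude needed to produce the threshold shaking, and the shaking arrives with the S wave. The wave speed and magnitude-length scaling below are illustrative assumptions, not the relationships used in the paper:

```python
VS = 3.5   # km/s, assumed S-wave speed
VR = 2.8   # km/s, assumed rupture propagation speed

def rupture_growth_time(magnitude):
    """Time for the rupture to grow to a given magnitude, using an illustrative
    magnitude-to-rupture-length scaling (Wells-Coppersmith-style coefficients)."""
    length_km = 10 ** (0.59 * magnitude - 2.44)
    return length_km / VR

def max_warning_time(distance_km, magnitude_needed):
    """Best-case (zero-latency) warning time before strong shaking arrives."""
    t_alert = rupture_growth_time(magnitude_needed)   # earliest possible alert
    t_shaking = distance_km / VS                      # S-wave arrival at the user
    return t_shaking - t_alert                        # negative => no warning possible

print(max_warning_time(distance_km=50, magnitude_needed=6.5))   # a few seconds of warning
print(max_warning_time(distance_km=20, magnitude_needed=7.5))   # negative: shaking arrives first
```

The second case illustrates the paper's point: users who require a large earthquake (high ground motion threshold) before acting can receive little or no warning, whereas low-threshold users are warned earlier and more reliably.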
The limits of earthquake early warning: Timeliness of ground motion estimates
Hanks, Thomas C.
2018-01-01
The basic physics of earthquakes is such that strong ground motion cannot be expected from an earthquake unless the earthquake itself is very close or has grown to be very large. We use simple seismological relationships to calculate the minimum time that must elapse before such ground motion can be expected at a distance from the earthquake, assuming that the earthquake magnitude is not predictable. Earthquake early warning (EEW) systems are in operation or development for many regions around the world, with the goal of providing enough warning of incoming ground shaking to allow people and automated systems to take protective actions to mitigate losses. However, the question of how much warning time is physically possible for specified levels of ground motion has not been addressed. We consider a zero-latency EEW system to determine possible warning times a user could receive in an ideal case. In this case, the only limitation on warning time is the time required for the earthquake to evolve and the time for strong ground motion to arrive at a user’s location. We find that users who wish to be alerted at lower ground motion thresholds will receive more robust warnings with longer average warning times than users who receive warnings for higher ground motion thresholds. EEW systems have the greatest potential benefit for users willing to take action at relatively low ground motion thresholds, whereas users who set relatively high thresholds for taking action are less likely to receive timely and actionable information. PMID:29750190
Gargoum, Suliman A; Tawfeek, Mostafa H; El-Basyouny, Karim; Koch, James C
2018-03-01
An important element of highway design is ensuring that the available sight distance (ASD) on a highway meets driver needs. For instance, if the ASD at any point on a highway is less than the distance required to come to a complete stop after seeing a hazard (i.e., the Stopping Sight Distance (SSD)), the driver will not be able to stop in time to avoid a collision. SSD is a function of a number of variables which vary depending on the driver, the vehicle driven and surface conditions; examples of such variables include a driver's perception reaction time or PRT (i.e. the time required by the driver to perceive and react to a hazard) and the deceleration rate of the vehicle. Most design guides recommend deterministic values for PRT and deceleration rates. Although these values may serve the needs of the average driver, they may not satisfy the needs of drivers with limited abilities. In other words, even if the ASD exceeds the required SSD defined in the design guide, it might not always satisfy the needs of all drivers. While it is impossible to design roads that satisfy the needs of all drivers, the fact that most developed countries have aging populations means that the number of old drivers on our roads is expected to increase. Since a large proportion of old drivers often have limited abilities, it is expected that the general population of drivers with limited abilities on our roads will increase with time. Accordingly, more efforts are required to ensure that existing road infrastructure is prepared to handle such a change. This paper aims to explore the extent to which ASD on highways satisfies the needs of drivers with limited abilities. The paper first develops MATLAB and Python codes to automatically estimate the ASD on highway point cloud data collected using Light Detection and Ranging (LiDAR) remote sensing technology. The developed algorithms are then used to estimate ASD on seven different crash-prone segments in the Province of Alberta, Canada, and the ASD is compared to the required SSD on each highway. Three different levels of SSD are defined (SSD for drivers with limited ability, AASHTO's SSD requirements, and SSD for drivers with high skill). The results show that, when compared to SSD requirements which integrate limitations in cognitive abilities, a substantial portion of the analyzed segments do not meet the requirements (up to 20%). Similarly, when compared to AASHTO's SSD requirements, up to 6% of the analyzed segments do not meet the requirements. In an attempt to explore the effects of such design limitations on safety, the paper also explores crash rates in noncompliant regions (i.e. regions that do not provide sufficient SSD) and compares them to crash rates in compliant regions. On average, it was found that noncompliant regions experience crash rates that are 2.15 and 1.25 times higher than compliant regions for AASHTO's SSD requirements and those integrating driver limitations, respectively. Furthermore, the study found that a significantly higher proportion of drivers involved in collisions in the noncompliant regions were old drivers. Copyright © 2018 Elsevier Ltd. All rights reserved.
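As a companion to the discussion of perception-reaction time and deceleration rate, the sketch below computes required SSD from the usual kinematic stopping-distance relation. The driver profiles (PRT and deceleration values) and the design speed are illustrative assumptions meant to mimic "limited-ability", "AASHTO-like", and "high-skill" drivers; they are not the thresholds used in the paper.

```python
# Stopping sight distance (SSD) from the standard kinematic relation:
#   SSD = v * PRT + v^2 / (2 * a)
# with v in m/s, PRT in s, and deceleration a in m/s^2.
# The driver profiles below are illustrative assumptions only.

def stopping_sight_distance(speed_kmh, prt_s, decel_ms2):
    v = speed_kmh / 3.6                      # convert km/h to m/s
    return v * prt_s + v * v / (2.0 * decel_ms2)

PROFILES = {
    "high skill":      {"prt_s": 1.5, "decel_ms2": 4.5},
    "AASHTO-like":     {"prt_s": 2.5, "decel_ms2": 3.4},
    "limited ability": {"prt_s": 3.5, "decel_ms2": 3.0},
}

if __name__ == "__main__":
    speed = 100.0  # km/h, assumed design speed of the segment
    for name, p in PROFILES.items():
        ssd = stopping_sight_distance(speed, **p)
        print(f"{name:15s}: required SSD ~ {ssd:6.1f} m at {speed:.0f} km/h")
    # A segment would be flagged noncompliant for a profile wherever the
    # LiDAR-derived available sight distance (ASD) falls below its SSD.
```

The spread between the profiles (roughly 130 m to 225 m at 100 km/h in this example) is what drives the different noncompliance percentages reported above.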
A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berres, Anne Sabine; Adhinarayanan, Vignesh; Turton, Terece
2017-05-12
Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using the example of regular sampling. The reasons for this choice are two-fold: using a simple example reduces unnecessary complexity, as we know what to expect from the results. Furthermore, it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we did, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.
A U.S. Strategy for Timely Fusion Energy Development
NASA Astrophysics Data System (ADS)
Wade, Mickey
2017-10-01
Worldwide energy demand is expected to explode in the latter half of this century. In anticipation of this demand, the U.S. DOE recently asked the National Academy of Sciences to provide guidance on a long-term strategic plan assuming that "economical fusion energy within the next several decades is a U.S. strategic interest." Delivering on such a plan will require an R&D program that delivers key data and understanding on the building blocks of a) burning plasma physics, b) optimization of the coupled core-edge solution, and c) fusion nuclear science to inform the design of a cost-attractive DEMO reactor in this time frame. Such a program should leverage existing facilities in the U.S. program including ITER, provide substantive motivation for an expanding R&D scope (and funding), and enable timely redirection of resources within the program as appropriate (and endorsed by DOE and the fusion community). This paper will outline a potential strategy that provides world-leading opportunities for the research community in a range of areas while delivering on key milestones required for timely fusion energy development. Supported by General Atomics internal funding.
NASA Astrophysics Data System (ADS)
Meng, X. T.; Levin, D. S.; Chapman, J. W.; Li, D. C.; Yao, Z. E.; Zhou, B.
2017-02-01
The High Performance Time to Digital Converter (HPTDC), a multi-channel ASIC designed by the CERN Microelectronics group, has been proposed for the digitization of the thin-Resistive Plate Chambers (tRPC) in the ATLAS Muon Spectrometer Phase-1 upgrade project. These chambers, to be staged for higher luminosity LHC operation, will increase trigger acceptance and reduce or eliminate the fake muon trigger rates in the barrel-endcap transition region, corresponding to pseudo-rapidity range 1<|η|<1.3. Low level trigger candidates must be flagged within a maximum latency of 1075 ns, thus imposing stringent signal processing time performance requirements on the readout system in general, and on the digitization electronics in particular. This paper investigates the HPTDC signal latency performance based on a specially designed evaluation board coupled with an external FPGA evaluation board, when operated in triggerless mode, and under hit rate conditions expected in Phase-I. This hardware based study confirms previous simulations and demonstrates that the HPTDC in triggerless operation satisfies the digitization timing requirements in both leading edge and pair modes.
Li, Jian; Bloch, Pavel; Xu, Jing; Sarunic, Marinko V; Shannon, Lesley
2011-05-01
Fourier domain optical coherence tomography (FD-OCT) provides faster line rates, better resolution, and higher sensitivity for noninvasive, in vivo biomedical imaging compared to traditional time domain OCT (TD-OCT). However, because the signal processing for FD-OCT is computationally intensive, real-time FD-OCT applications demand powerful computing platforms to deliver acceptable performance. Graphics processing units (GPUs) have been used as coprocessors to accelerate FD-OCT by leveraging their relatively simple programming model to exploit thread-level parallelism. Unfortunately, GPUs do not "share" memory with their host processors, requiring additional data transfers between the GPU and CPU. In this paper, we implement a complete FD-OCT accelerator on a consumer grade GPU/CPU platform. Our data acquisition system uses spectrometer-based detection and a dual-arm interferometer topology with numerical dispersion compensation for retinal imaging. We demonstrate that the maximum line rate is dictated by the memory transfer time and not the processing time due to the GPU platform's memory model. Finally, we discuss how the performance trends of GPU-based accelerators compare to the expected future requirements of FD-OCT data rates.
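The observation that line rate is bounded by host-device transfer rather than by GPU compute can be made concrete with a small throughput estimate. This is a toy calculation assuming transfers and kernels overlap across batches of A-scans; the batch size and timing numbers are placeholders, not measurements from the paper.

```python
# Rough throughput estimate for a GPU FD-OCT pipeline in which host<->device
# transfers and kernel execution overlap across batches of A-scans.
# All numbers below are assumed placeholders for illustration.

def max_line_rate(lines_per_batch, transfer_s, process_s):
    """With overlapped transfer and compute, steady-state throughput is
    limited by whichever stage takes longer per batch."""
    bottleneck = max(transfer_s, process_s)
    return lines_per_batch / bottleneck

if __name__ == "__main__":
    lines = 1024            # A-scans per batch (assumed)
    t_transfer = 8.0e-3     # s to move one batch over PCIe, both directions (assumed)
    t_process = 3.0e-3      # s of GPU kernel time per batch (assumed)
    rate = max_line_rate(lines, t_transfer, t_process)
    print(f"Transfer-bound: {t_transfer > t_process}")
    print(f"Max sustainable line rate ~ {rate / 1e3:.0f} kHz")
```

In the transfer-bound regime shown here, faster kernels cannot raise the line rate; only reducing or hiding the host-device copies can, which is the trend the paper highlights for future FD-OCT data rates.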
2010-09-29
The Food and Drug Administration (FDA) is amending its regulations governing safety reporting requirements for human drug and biological products subject to an investigational new drug application (IND). The final rule codifies the agency's expectations for timely review, evaluation, and submission of relevant and useful safety information and implements internationally harmonized definitions and reporting standards. The revisions will improve the utility of IND safety reports, reduce the number of reports that do not contribute in a meaningful way to the developing safety profile of the drug, expedite FDA's review of critical safety information, better protect human subjects enrolled in clinical trials, subject bioavailability and bioequivalence studies to safety reporting requirements, promote a consistent approach to safety reporting internationally, and enable the agency to better protect and promote public health.
Determining medical staffing requirements for humanitarian assistance missions.
Negus, Tracy L; Brown, Carrie J; Konoske, Paula
2010-01-01
The primary mission of hospital ships is to provide acute medical and surgical services to U.S. forces during military operations. Hospital ships also provide a hospital asset in support of disaster relief and humanitarian assistance (HA) operations. HA missions afford medical care to populations with vastly different sets of medical conditions from combat casualty care, which affects staffing requirements. Information from a variety of sources was reviewed to better understand hospital ship HA missions. Factors such as time on-site and location shape the mission and underlying goals. Patient encounter data from previous HA missions were used to determine expected patient conditions encountered in various HA operations. These data points were used to project the medical staffing required for future missions. Further data collection, along with goal setting, must be performed to accomplish successful future HA missions. Refining staffing requirements allows deployments to accomplish needed HA and effectively reach underserved areas.
The Case of Expectant Fathers: Negotiating the Changing Role of Males in a "Female" World
ERIC Educational Resources Information Center
Hinckley, C.; Ferreira, R.; Maree, J. G.
2007-01-01
Research was carried out to investigate the needs of expectant fathers and to determine whether television can be implemented to provide parent guidance to South African expectant fathers during the transition into fatherhood. Focus was on understanding the specific type of information required by expectant fathers, in conjunction with their…
Dai, Wei; Fu, Caroline; Khant, Htet A; Ludtke, Steven J; Schmid, Michael F; Chiu, Wah
2014-11-01
Advances in electron cryotomography have provided new opportunities to visualize the internal 3D structures of a bacterium. An electron microscope equipped with Zernike phase-contrast optics produces images with markedly increased contrast compared with images obtained by conventional electron microscopy. Here we describe a protocol to apply Zernike phase plate technology for acquiring electron tomographic tilt series of cyanophage-infected cyanobacterial cells embedded in ice, without staining or chemical fixation. We detail the procedures for aligning and assessing phase plates for data collection, and methods for obtaining 3D structures of cyanophage assembly intermediates in the host by subtomogram alignment, classification and averaging. Acquiring three or four tomographic tilt series takes ∼12 h on a JEM2200FS electron microscope. We expect this time requirement to decrease substantially as the technique matures. The time required for annotation and subtomogram averaging varies widely depending on the project goals and data volume.
Wynell-Mayow, William; Saeed, Muhammad Zahid
2018-03-14
The WHO includes osteoarthritis as a disease of priority, owing to its significant impact on quality of life and globally increasing prevalence. Hospital budgets are under pressure to ration knee replacements and shorten inpatient stays. Prolonged tourniquet application has been hypothesised to extend recovery through pain and reduced mobility. A total of 123 elective total knee replacements meeting inclusion criteria took place from July 2015 to October 2017 at the Royal Free Hospital. Cases were standardised by method of TKR, implant, physiotherapy and analgesic regime according to the trust's Enhanced Recovery after Surgery pathway. Tourniquet time was compared to post-operative length-of-stay and total opioid analgesia requirement over 24 h. Median tourniquet time overall was 74 min and decreased year-on-year from 108 to 60 min (p = 0.000). Inpatient median length-of-stay was 5 days and did not decrease (p = 0.667). Increased tourniquet time was not associated with longer length-of-stay but in fact with shorter stays (p = 0.03199), likely due to this confounding temporal trend. Increased tourniquet time was not associated with increased opioid requirement (p = 0.78591). No association was found between tourniquet time and other complications, including DVT and infection. Our study finds no evidence that reductions in tourniquet time in TKR improve recovery, including length-of-stay or opioid requirement. These clinical data are expected to augment PROMs collected by the National Joint Registry.
OPENMED: A facility for biomedical experiments based on the CERN Low Energy Ion Ring (LEIR)
NASA Astrophysics Data System (ADS)
Carli, Christian
At present, protons and carbon ions are in clinical use for hadron therapy at a growing number of treatment centers all over the world. Nevertheless, only limited direct clinical evidence of their superiority over other forms of radiotherapy is available [1]. Furthermore, fundamental studies on biological effects of hadron beams have been carried out at different times (some a long time ago), in different laboratories, and under different conditions. Despite an increased availability of ion beams for hadron therapy, beam time for preclinical studies is expected to remain insufficient, as the priority for therapy centers is to treat the maximum number of patients. Most of the remaining beam time is expected to be required for setup and for measurements to guarantee appropriately good-quality beams for treatments. The proposed facility for biomedical research [2] in support of hadron therapy centers would provide ion beams for interested research groups and allow them to carry out basic studies under well-defined conditions. Typical studies would include radiobiological phenomena like relative biological effectiveness with different energies, ion species, and intensities. Furthermore, possible studies include the development of advanced dosimetry in heterogeneous materials that resemble the human body, imaging techniques and, at a later stage, when the maximum energy with the LEIR magnets can be reached, fragmentation.
Patient-Centered Bedside Rounds and the Clinical Examination.
Lichstein, Peter R; Atkinson, Hal H
2018-05-01
Bedside hospital rounds promote patient-centered care in teaching and nonteaching settings. Patients and families prefer bedside rounds, and provider acceptance is increasing. Efficient bedside rounds with an interprofessional team or with learners require preparation of the patient and the rounding team. Bedside "choreography" provides structure and sets expectations for time spent in the room. By using relationship-centered communication, rounds can be both patient proximate and patient centered. The clinical examination can be integrated into the flow of the presentation and case discussion. Patient and provider experience can be enhanced through investing time at the bedside. Copyright © 2018 Elsevier Inc. All rights reserved.
Analysis of Accelerometer Data from a Woven Inflatable Creep Burst Test
NASA Technical Reports Server (NTRS)
James, George H.; Grygier, Michael; Selig, Molly M.
2015-01-01
Accelerometers were used to monitor an inflatable test article during a creep test to failure. The test article experienced impulse events that were classified based on the response of the sensors and their time-dependent manifestation. These impulse events required specialized techniques to process the structural dynamics data. Certain phenomena were identified as worthy of additional study. An assessment of one phenomenon (a frequency near 1000 Hz) showed a time-dependent frequency and an amplitude that increased significantly near the end of the test. Hence, these observations are expected to drive future understanding of, and utility in, inflatable space structures.
Space station human productivity study. Volume 5: Management plans
NASA Technical Reports Server (NTRS)
1985-01-01
The 67 Management Plans represent recommended study approaches for resolving 108 of the 305 Issues which were identified. Each study Management Plan is prepared in three formats: Management Plan Overview (lists the subsumed Issues, study background, and related overview information); Study Plan (details the study approach by tasks, lists special needs, and describes expected study products); Schedule-Task Flow (provides a time-lined schedule for the study tasks and resource requirements). The Management Relationships Matrix, included in this volume, shows the data input-output relationships among all recommended studies. A listing is also included which cross-references the unresolved requirements to Issues to management plans. A glossary of all abbreviations utilized is provided.
Viktorov, A A; Zharinov, G M; Neklasova, N Ju; Morozova, E E
2017-01-01
The article presents a methodical approach for predicting life expectancy for people diagnosed with prostate cancer, based on the kinetic theory of aging of living systems. The life expectancy is calculated by solving the differential equation for the rate of aging for three different stages of life: «normal» life, life with prostate cancer, and life after combination therapy for prostate cancer. The mathematical model of aging for each stage of life has its own parameters, identified by statistical analysis of healthcare data from Zharinov's databank and the Rosstat CDR NES databank. The core of the methodical approach is the statistical correlation between the growth rate of the prostate-specific antigen level (PSA level), or the PSA doubling time (PSA DT), before therapy and lifespan: the higher the PSA DT, the greater the lifespan. The patients were grouped into the «fast PSA DT» and «slow PSA DT» categories. Satisfactory agreement between calculations and experiment is shown. The prediction error of group life expectancy is due to the completeness and reliability of the main data source. Detailed monitoring of the basic health indicators throughout each person's life in each analyzed group is required; the absence of this particular information makes it impossible to predict individual life expectancy.
Utilization of community pharmacy space to enhance privacy: a qualitative study.
Hattingh, H Laetitia; Emmerton, Lynne; Ng Cheong Tin, Pascale; Green, Catherine
2016-10-01
Community pharmacists require access to consumers' information about their medicines and health-related conditions to make informed decisions regarding treatment options. Open communication between consumers and pharmacists is ideal although consumers are only likely to disclose relevant information if they feel that their privacy requirements are being acknowledged and adhered to. This study sets out to explore community pharmacy privacy practices, experiences and expectations and the utilization of available space to achieve privacy. Qualitative methods were used, comprising a series of face-to-face interviews with 25 pharmacists and 55 pharmacy customers in Perth, Western Australia, between June and August 2013. The use of private consultation areas for certain services and sensitive discussions was supported by pharmacists and consumers although there was recognition that workflow processes in some pharmacies may need to change to maximize the use of private areas. Pharmacy staff adopted various strategies to overcome privacy obstacles such as taking consumers to a quieter part of the pharmacy, avoiding exposure of sensitive items through packaging, lowering of voices, interacting during pharmacy quiet times and telephoning consumers. Pharmacy staff and consumers regularly had to apply judgement to achieve the required level of privacy. Management of privacy can be challenging in the community pharmacy environment, and on-going work in this area is important. As community pharmacy practice is increasingly becoming more involved in advanced medication and disease state management services with unique privacy requirements, pharmacies' layouts and systems to address privacy challenges require a proactive approach. © 2015 The Authors. Health Expectations Published by John Wiley & Sons Ltd.
Chen, Ming-Kai; Menard, David H; Cheng, David W
2016-03-01
In pursuit of as-low-as-reasonably-achievable (ALARA) doses, this study investigated the minimal required radioactivity and corresponding imaging time for reliable semiquantification in PET/CT imaging. Using a phantom containing spheres of various diameters (3.4, 2.1, 1.5, 1.2, and 1.0 cm) filled with a fixed (18)F-FDG concentration of 165 kBq/mL and a background concentration of 23.3 kBq/mL, we performed PET/CT at multiple time points over 20 h of radioactive decay. The images were acquired for 10 min at a single bed position for each of 10 half-lives of decay using 3-dimensional list mode and were reconstructed into 1-, 2-, 3-, 4-, 5-, and 10-min acquisitions per bed position using an ordered-subsets expectation maximum algorithm with 24 subsets and 2 iterations and a gaussian 2-mm filter. SUVmax and SUVavg were measured for each sphere. The minimal required activity (±10%) for precise SUVmax semiquantification in the spheres was 1.8 kBq/mL for an acquisition of 10 min, 3.7 kBq/mL for 3-5 min, 7.9 kBq/mL for 2 min, and 17.4 kBq/mL for 1 min. The minimal required activity concentration-acquisition time product per bed position was 10-15 kBq/mL⋅min for reproducible SUV measurements within the spheres without overestimation. Using the total radioactivity and counting rate from the entire phantom, we found that the minimal required total activity-time product was 17 MBq⋅min and the minimal required counting rate-time product was 100 kcps⋅min. Our phantom study determined a threshold for minimal radioactivity and acquisition time for precise semiquantification in (18)F-FDG PET imaging that can serve as a guide in pursuit of achieving ALARA doses. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
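The thresholds reported above reduce to simple product rules, which the sketch below applies to check whether a planned acquisition should support reliable SUV semiquantification. The numeric lower bound is taken from the abstract (10-15 kBq/mL·min per bed position); the example activity concentration and scan duration are assumptions for illustration.

```python
# Check a planned acquisition against the minimum activity-time product
# reported in the abstract (18F-FDG, ordered-subsets reconstruction).
# The abstract also quotes whole-phantom minima of 17 MBq*min total activity
# and 100 kcps*min counting rate; only the per-mL rule is checked here.
# The example inputs at the bottom are assumptions for illustration.

MIN_CONC_TIME = 10.0   # kBq/mL * min per bed position (lower end of 10-15 quoted)

def concentration_ok(conc_kbq_ml, minutes_per_bed):
    """True if the activity concentration-acquisition time product meets the
    reported minimum for reproducible SUV measurements."""
    return conc_kbq_ml * minutes_per_bed >= MIN_CONC_TIME

if __name__ == "__main__":
    conc = 3.7      # kBq/mL expected in the region of interest at scan time (assumed)
    t_bed = 3.0     # minutes per bed position (assumed)
    product = conc * t_bed
    print(f"Product: {product:.1f} kBq/mL*min")
    print("Meets minimum for reproducible SUVs:", concentration_ok(conc, t_bed))
```

This matches the pattern in the reported data: halving the available activity roughly doubles the acquisition time needed per bed position to stay above the product threshold.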
Optimization of extended propulsion time nuclear-electric propulsion trajectories
NASA Technical Reports Server (NTRS)
Sauer, C. G., Jr.
1981-01-01
This paper presents the methodology used in optimizing extended propulsion time NEP missions considering realistic thruster lifetime constraints. These missions consist of a powered spiral escape from a 700-km circular orbit at the earth, followed by a powered heliocentric transfer with an optimized coast phase, and terminating in a spiral capture phase at the target planet. This analysis is most applicable to those missions with very high energy requirements such as outer planet orbiter missions or sample return missions where the total propulsion time could greatly exceed the expected lifetime of an individual thruster. This methodology has been applied to the investigation of NEP missions to the outer planets where examples are presented of both constrained and optimized trajectories.
Radiotherapy Monte Carlo simulation using cloud computing technology.
Poole, C M; Cornelius, I; Trapp, J V; Langton, C M
2012-12-01
Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
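The scaling statements above can be reproduced with a few lines of arithmetic. The sketch below assumes perfect parallel efficiency and hourly billing (a common cloud pricing model, assumed here rather than taken from the paper) to show why relative cost is lowest when the machine count divides the single-machine run time in whole hours.

```python
import math

# Completion time and relative cost for splitting a Monte Carlo run over n
# identical cloud machines, assuming perfect parallel efficiency and hourly
# billing (both assumptions made for illustration).

def completion_time_h(total_single_machine_h, n):
    return total_single_machine_h / n          # the 1/n scaling noted above

def relative_cost(total_single_machine_h, n):
    """Billed machine-hours relative to running on one machine, when each
    machine is billed in whole hours."""
    billed = n * math.ceil(completion_time_h(total_single_machine_h, n))
    return billed / total_single_machine_h

if __name__ == "__main__":
    total = 12.0   # hours the simulation would take on one machine (assumed)
    for n in (1, 2, 3, 4, 5, 6, 8, 12):
        t = completion_time_h(total, n)
        c = relative_cost(total, n)
        flag = "  <- n divides the run time" if total % n == 0 else ""
        print(f"n={n:2d}: finishes in {t:5.2f} h, relative cost {c:.2f}{flag}")
```

With a 12-hour job, n = 4 or 6 costs the same as running serially while finishing in 3 or 2 hours, whereas n = 5 or 8 wastes partially billed hours, which is the cost-optimality behaviour described above.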
Design of Ablation Test Device for Brick Coating of Gun
NASA Astrophysics Data System (ADS)
shirui, YAO; yongcai, CHEN; fei, WANG; jianxin, ZHAO
2018-03-01
Under live ammunition test conditions, evaluating the ablation resistance of barrel coatings is costly, time consuming, and inefficient, and it places high demands on the test site. This article presents the design of a simple, convenient and efficient test device. Through internal trajectory calculations in MATLAB, the ablation environment produced by the test device was shown to achieve the expected effect and to be consistent with the working conditions of the barrel in the firing state, so it can better reflect the ablation of the coating.
Microcomputer programming skills
NASA Technical Reports Server (NTRS)
Barth, C. W.
1979-01-01
Some differences in skill and techniques required for conversion from programmer to microprogrammer are discussed. The primary things with which the programmer should work are hardware architecture, hardware/software trade-offs, and interfacing. The biggest differences, however, will stem more from differences in applications than from differences in machine size. The change to real-time programming is the most important of these differences, particularly on dedicated microprocessors. Another primary change is programming with a more computer-naive user in mind, and dealing with his limitations and expectations.
The Hubble Space Telescope high speed photometer
NASA Technical Reports Server (NTRS)
Vancitters, G. W., Jr.; Bless, R. C.; Dolan, J. F.; Elliot, J. L.; Robinson, E. L.; White, R. L.
1988-01-01
The Hubble Space Telescope will provide the opportunity to perform precise astronomical photometry above the disturbing effects of the atmosphere. The High Speed Photometer is designed to provide the observatory with a stable, precise photometer with wide dynamic range, broad wavelength coverage, time resolution in the microsecond region, and polarimetric capability. Here, the scientific requirements for the instrument are examined, the unique design features of the photometer are explored, and the improvements to be expected over the performance of ground-based instruments are projected.
Automated Synthesis of Architecture of Avionic Systems
NASA Technical Reports Server (NTRS)
Chau, Savio; Xu, Joseph; Dang, Van; Lu, James F.
2006-01-01
The Architecture Synthesis Tool (AST) is software that automatically synthesizes software and hardware architectures of avionic systems. The AST is expected to be most helpful during initial formulation of an avionic-system design, when system requirements change frequently and manual modification of architecture is time-consuming and susceptible to error. The AST comprises two parts: (1) an architecture generator, which utilizes a genetic algorithm to create a multitude of architectures; and (2) a functionality evaluator, which analyzes the architectures for viability, rejecting most of the non-viable ones. The functionality evaluator generates and uses a viability tree, a hierarchy representing functions and the components that perform those functions, such that the system as a whole performs system-level functions representing the requirements for the system as specified by a user. Architectures that survive the functionality evaluator are further evaluated by the selection process of the genetic algorithm. Architectures found to be most promising to satisfy the user's requirements and to perform optimally are selected as parents to the next generation of architectures. The foregoing process is iterated as many times as the user desires. The final output is one or a few viable architectures that satisfy the user's requirements.
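The generate/evaluate/select loop described above can be sketched compactly. The skeleton below is not the AST implementation; the architecture encoding, the viability check standing in for the viability tree, and the fitness function are all stand-in assumptions chosen to keep the example self-contained.

```python
import random

# Minimal generate/evaluate/select skeleton in the spirit of the architecture
# synthesis loop described above. Encoding, viability check, and fitness are
# stand-ins, not the AST's actual algorithms.

COMPONENTS = ["cpu_a", "cpu_b", "bus_x", "bus_y", "sensor", "backup_sensor"]

def random_architecture():
    """An architecture is modeled here simply as a subset of components."""
    return {c for c in COMPONENTS if random.random() < 0.5}

def is_viable(arch):
    """Stand-in for the viability-tree check: require at least one processor,
    one bus, and one sensor so the system-level functions can be performed."""
    return (any(c.startswith("cpu") for c in arch)
            and any(c.startswith("bus") for c in arch)
            and any("sensor" in c for c in arch))

def fitness(arch):
    """Stand-in objective: among viable architectures, prefer fewer parts."""
    return -len(arch)

def crossover(a, b):
    """Uniform crossover with a small chance of toggling one component."""
    child = {c for c in COMPONENTS if c in (a if random.random() < 0.5 else b)}
    if random.random() < 0.2:
        child ^= {random.choice(COMPONENTS)}
    return child

def synthesize(generations=20, population_size=30):
    population = [random_architecture() for _ in range(population_size)]
    for _ in range(generations):
        viable = [a for a in population if is_viable(a)]   # functionality evaluator
        if not viable:
            population = [random_architecture() for _ in range(population_size)]
            continue
        parents = sorted(viable, key=fitness, reverse=True)[:max(2, len(viable) // 2)]
        population = [crossover(random.choice(parents), random.choice(parents))
                      for _ in range(population_size)]
    return max((a for a in population if is_viable(a)), key=fitness, default=None)

if __name__ == "__main__":
    print("Selected architecture:", sorted(synthesize() or []))
```

The essential division of labor mirrors the description above: the viability filter discards architectures that cannot perform the required system-level functions, and the genetic selection step then breeds the most promising survivors.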
Screening for Learning and Memory Mutations: A New Approach
Gallistel, C. R.; King, A. P.; Daniel, A. M.; Freestone, D.; Papachristos, E. B.; Balci, F.; Kheifets, A.; Zhang, J.; Su, X.; Schiff, G.; Kourtev, H.
2010-01-01
We describe a fully automated, live-in 24/7 test environment, with experimental protocols that measure the accuracy and precision with which mice match the ratio of their expected visit durations to the ratio of the incomes obtained from two hoppers, the progress of instrumental and classical conditioning (trials-to-acquisition), the accuracy and precision of interval timing, the effect of relative probability on the choice of a timed departure target, and the accuracy and precision of memory for the times of day at which food is available. The system is compact; it obviates the handling of the mice during testing; it requires negligible amounts of experimenter/technician time; and it delivers clear and extensive results from 3 protocols within a total of 7–9 days after the mice are placed in the test environment. Only a single 24-hour period is required for the completion of the first protocol (the matching protocol), which is a strong test of temporal and spatial estimation and memory mechanisms. Thus, the system permits the extensive screening of many mice in a short period of time and in limited space. The software is publicly available. PMID:20352069
NASA Technical Reports Server (NTRS)
Foyle, David C.; Hooey, Becky L.; Bakowski, Deborah L.
2013-01-01
The results of four piloted medium-fidelity simulations investigating flight deck surface trajectory-based operations (STBO) will be reviewed. In these flight deck STBO simulations, commercial transport pilots were given taxi clearances with time and/or speed components and required to taxi to the departing runway or an intermediate traffic intersection. Under a variety of concept of operations (ConOps) and flight deck information conditions, pilots' ability to taxi in compliance with the required time of arrival (RTA) at the designated airport location was measured. ConOps and flight deck information conditions explored included: availability of taxi clearance speed and elapsed time information; intermediate RTAs at intermediate time constraint points (e.g., intersection traffic flow points); STBO taxi clearances via ATC voice speed commands or datalink; and availability of flight deck display algorithms to reduce STBO RTA error. Flight Deck Implications. Pilot RTA conformance for STBO clearances, in the form of ATC taxi clearances with associated speed requirements, was found to be relatively poor, unless the pilot is required to follow a precise speed and acceleration/deceleration profile. However, following such a precise speed profile results in inordinate head-down tracking of current ground speed, leading to potentially unsafe operations. Mitigating these results, and providing good taxi RTA performance without the associated safety issues, is a flight deck avionics or electronic flight bag (EFB) solution. Such a solution enables pilots to meet the taxi route RTA without moment-by-moment tracking of ground speed. An avionics or EFB "error-nulling" algorithm allows the pilot to view the STBO information when the pilot determines it is necessary and when workload allows, thus enabling the pilot to spread his/her attention appropriately and strategically on aircraft separation, airport navigation, and the many other flight deck tasks concurrently required. Surface Traffic Management (STM) System Implications. The data indicate a number of implications regarding specific parameters for ATC/STM algorithm development. Pilots have a tendency to arrive at RTA points early with slow required speeds, on time for moderate speeds, and late with faster required speeds. This implies that ATC/STM algorithms should operate with middle-range speeds, similar to those of non-STBO taxi performance. Route length has a related effect: long taxi routes increase the earliness with slow speeds and the lateness with faster speeds. This is likely due to the "open-loop" nature of the task, in which the speed error compounds over a longer time with longer routes. Results showed that this may be mitigated by imposing a small number of time constraint points, each with its own RTA, effectively turning a long route into a series of shorter routes, and thus improving RTA performance. STBO ConOps Implications. Most important is the impact that these data have for NextGen STM system ConOps development. The results of these experiments imply that it is not reasonable to expect pilots to taxi under a "Full STBO" ConOps in which pilots are expected to be at a predictable (x,y) airport location for every time (t). An STBO ConOps with a small number of intermediate time constraint points and the departing runway, however, is feasible, but only with flight deck equipage enabling the use of a display similar to the "error-nulling algorithm/display" tested.
Patients' experiences when accessing their on-line electronic patient records in primary care.
Pyper, Cecilia; Amery, Justin; Watson, Marion; Crook, Claire
2004-01-01
BACKGROUND: Patient access to on-line primary care electronic patient records is being developed nationally. Knowledge of what happens when patients access their electronic records is poor. AIM: To enable 100 patients to access their electronic records for the first time to elicit patients' views and to understand their requirements. DESIGN OF STUDY: In-depth interviews using semi-structured questionnaires as patients accessed their electronic records, plus a series of focus groups. SETTING: Secure facilities for patients to view their primary care records privately. METHOD: One hundred patients from a randomised group viewed their on-line electronic records for the first time. The questionnaire and focus groups addressed patients' views on the following topics: ease of use; confidentiality and security; consent to access; accuracy; printing records; expectations regarding content; exploitation of electronic records; receiving new information and bad news. RESULTS: Most patients found the computer technology used acceptable. The majority found viewing their record useful and understood most of the content, although medical terms and abbreviations required explanation. Patients were concerned about security and confidentiality, including potential exploitation of records. They wanted the facility to give informed consent regarding access and use of data. Many found errors, although most were not medically significant. Many expected more detail and more information. Patients wanted to add personal information. CONCLUSION: Patients have strong views on what they find acceptable regarding access to electronic records. Working in partnership with patients to develop systems is essential to their success. Further work is required to address legal and ethical issues of electronic records and to evaluate their impact on patients, health professionals and service provision. PMID:14965405
NASA Technical Reports Server (NTRS)
Arbuckle, P. Douglas; Abbott, Kathy H.; Abbott, Terence S.; Schutte, Paul C.
1998-01-01
The evolution of commercial transport flight deck configurations over the past 20-30 years and expected future developments are described. Key factors in the aviation environment are identified that the authors expect will significantly affect flight deck designers. One of these is the requirement for commercial aviation accident rate reduction, which is probably required if global commercial aviation is to grow as projected. Other factors include the growing incrementalism in flight deck implementation, definition of future airspace operations, and expectations of a future pilot corps that will have grown up with computers. Future flight deck developments are extrapolated from observable factors in the aviation environment, recent research results in the area of pilot-centered flight deck systems, and by considering expected advances in technology that are being driven by other than aviation requirements. The authors hypothesize that revolutionary flight deck configuration changes will be possible with development of human-centered flight deck design methodologies that take full advantage of commercial and/or entertainment-driven technologies.
MNE Scan: Software for real-time processing of electrophysiological data.
Esch, Lorenz; Sun, Limin; Klüber, Viktor; Lew, Seok; Baumgarten, Daniel; Grant, P Ellen; Okada, Yoshio; Haueisen, Jens; Hämäläinen, Matti S; Dinh, Christoph
2018-06-01
Magnetoencephalography (MEG) and Electroencephalography (EEG) are noninvasive techniques to study the electrophysiological activity of the human brain. Thus, they are well suited for real-time monitoring and analysis of neuronal activity. Real-time MEG/EEG data processing allows adjustment of the stimuli to the subject's responses for optimizing the acquired information, especially by providing dynamically changing displays to enable neurofeedback. We introduce MNE Scan, an acquisition and real-time analysis software based on the multipurpose software library MNE-CPP. MNE Scan allows the development and application of acquisition and novel real-time processing methods in both research and clinical studies. The MNE Scan development follows a strict software engineering process to enable approvals required for clinical software. We tested the performance of MNE Scan in several device-independent use cases, including a clinical epilepsy study, real-time source estimation, and a Brain Computer Interface (BCI) application. Compared to existing tools, we propose modular software that considers the clinical software requirements expected by certification authorities while remaining extendable and freely accessible. We conclude that MNE Scan is the first step in creating a device-independent open-source software to facilitate the transition from basic neuroscience research to both applied sciences and clinical applications. Copyright © 2018 Elsevier B.V. All rights reserved.
[Innovations in education for the digital student].
Koopman, P; Vervoorn, J M
2012-06-01
A significant percentage of today's teaching staff received their professional training before the revolution in information and communication technology took place. Students, by contrast, are so-called 'digital natives': they grew up surrounded by digital technology. Present day students are used to multi-tasking and expect to be facilitated in using educational facilities regardless of time and place. Adapting higher education to present day students' study behaviour and expectations requires reconsideration of educational form and methods. Several types of staff can be distinguished by their attitude towards technological innovation in education. Among them are staff who are reluctant to accept innovations. Dental schools face the challenge of finding support for innovations among all their teaching staff and of better adapting to the twenty-first century student. In order to introduce technological innovations successfully, students need to become involved and sufficient attention must be paid to qualifying instructors.
Hawk, Skyler T; Keijsers, Loes; Hale, William W; Meeus, Wim
2009-08-01
Privacy coordination between adolescents and their parents is difficult, as adolescents' changing roles require adjustments to expectations about family boundaries. Adolescents' perceptions of privacy invasion likely provoke conflicts with parents, but higher levels of conflict may also foster invasion perceptions. This longitudinal study assessed relations between privacy invasion and conflict frequency among adolescents, mothers, and fathers (N = 309). Bidirectional relations were present; all reports showed that invasion provoked conflict in later adolescence, but the timing and direction of conflict-to-invasion relations differed between respondents and measurement waves. The findings suggest a functional role for conflict in adolescent-parent privacy negotiations, in that it both draws attention to discrepant expectations and provides youths with a means of directly managing perceived boundary violations. (PsycINFO Database Record (c) 2009 APA, all rights reserved).
First Mass Measurement of a 'Domestic' Microlens
NASA Astrophysics Data System (ADS)
Dong, Subo; Carey, Sean; Gould, Andrew; Zhu, Wei
2017-11-01
We propose to combine Spitzer, Gaia, and ground-based measurements to determine the mass, distance, and transverse velocity of the 'domestic' microlensing event J0507+2447. This is only the second 'domestic' event (microlensed source distance less than about 1 kpc) ever discovered, but this number is already 10 times higher than the number expected. Hence, determining the nature of these lenses would resolve a major puzzle. The low expected rate is what caused Einstein to delay publication of his microlensing idea by 24 years. By very good fortune, Spitzer's narrow 38-day window of observations overlaps magnified portions of the event. Determining the mass requires measuring both the 'microlens parallax' (courtesy of Spitzer) and the 'angular Einstein radius' (which can be derived from Gaia astrometry). Thus, this is a truly rare opportunity to probe the nature of 'domestic' microlenses.
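The reason both measurements are needed follows from the standard microlensing mass relation, M = θ_E / (κ π_E) with κ = 4G/(c² au) ≈ 8.14 mas per solar mass. The sketch below applies that relation; the example θ_E and π_E values are assumptions for illustration, not measurements of J0507+2447.

```python
# Standard microlensing relations: the lens mass follows from the angular
# Einstein radius theta_E and the microlens parallax pi_E,
#   M = theta_E / (kappa * pi_E),  kappa = 4G/(c^2 au) ~ 8.14 mas / M_sun,
# and the lens distance from the relative parallax pi_rel = theta_E * pi_E.
# The example theta_E and pi_E values below are assumptions for illustration.

KAPPA_MAS_PER_MSUN = 8.14

def lens_mass_msun(theta_e_mas, pi_e):
    return theta_e_mas / (KAPPA_MAS_PER_MSUN * pi_e)

def lens_distance_kpc(theta_e_mas, pi_e, source_distance_kpc=1.0):
    """pi_rel (mas) = 1/D_L - 1/D_S with distances in kpc."""
    pi_rel_mas = theta_e_mas * pi_e
    return 1.0 / (pi_rel_mas + 1.0 / source_distance_kpc)

if __name__ == "__main__":
    theta_e, pi_e = 2.0, 0.5   # mas and dimensionless parallax (assumed)
    print(f"M   ~ {lens_mass_msun(theta_e, pi_e):.2f} M_sun")
    print(f"D_L ~ {lens_distance_kpc(theta_e, pi_e):.2f} kpc")
```

Because Spitzer constrains π_E and Gaia astrometry constrains θ_E, neither observatory alone can deliver the mass; only the combination closes the system of equations.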
Dynamics of Galaxy Clusters and Expectations from Astro-H
NASA Technical Reports Server (NTRS)
Markevitch, Maxim
2012-01-01
Galaxy clusters span a range of dynamical states, from violent mergers -- the most energetic events in the Universe -- to systems near hydrostatic equilibrium that allow us to map their dark matter distribution using X-ray observations of the intracluster gas. Accurate knowledge of the cluster physics, and in particular, the physics of the hot intracluster gas, is required to realize the full potential of clusters as cosmological probes. So far, we have been studying the cluster dynamics indirectly, deducing merger geometries, cluster masses, etc., using X-ray brightness and gas temperature mapping. For the first time, the calorimeter onboard Astro-H will provide direct measurements of line-of-sight velocities and turbulent broadening in the intracluster gas, testing many of our key assumptions about clusters. This talk will summarize expectations for cluster dynamic studies with this new instrument.
NASA Astrophysics Data System (ADS)
Brown, S.; Hine, N.; Sixsmith, A.; Garner, P.
The UK population is ageing. At the time of the 2001 census there were 8.1 million people aged over 65 living in the UK, 3.1 million of them living alone. By 2011 the number of over 65s is projected to reach just under 12 million, and by 2026 over 13 million [1]. The extra workload this will place on health and care services will be compounded by political ambitions aimed at meeting the challenges of rising patient expectations [2]. In addition to this, the Department of Health aims to promote the independence of older people by providing enhanced services from the National Health Service (NHS) and councils to prevent unnecessary hospital admission [3]. As a result we can expect to see a continuing rise in the number of elderly people living at home and requiring good-quality health and social care services.
NASA Technical Reports Server (NTRS)
Pike, Cody J.
2015-01-01
A project within SwampWorks is building a test stand to hold regolith to study how dust is ejected when exposed to the hot exhaust plume of a rocket engine. The test stand needs to be analyzed, finalized, and fabrication drawings generated to move forward. Modifications of the test stand assembly were made with Creo 2 modeling software. Structural analysis calculations were developed by hand to confirm that the structure would hold the expected loads while optimizing support positions. These calculations, iterated in MATLAB, showed the optimal position of the vertical support to be 98'' from the far end of the stand. All remaining deflections were shown to be under the 0.6'' requirement, and internal stresses were shown to meet NASA Ground Support Equipment (GSE) Safety Standards. At the time of writing, fabrication drawings had yet to be generated but were expected shortly.
Heroic Reliability Improvement in Manned Space Systems
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2017-01-01
System reliability can be significantly improved by a strong continued effort to identify and remove all the causes of actual failures. Newly designed systems often have unexpectedly high failure rates, which can be reduced by successive design improvements until the final operational system has an acceptable failure rate. There are many causes of failures and many ways to remove them. New systems may have poor specifications, design errors, or mistaken operations concepts. Correcting unexpected problems as they occur can produce large early gains in reliability. Improved technology in materials, components, and design approaches can increase reliability. The reliability growth is achieved by repeatedly operating the system until it fails, identifying the failure cause, and fixing the problem. The failure rate reduction that can be obtained depends on the number and the failure rates of the correctable failures. Under the strong assumption that the failure causes can be removed, the decline in overall failure rate can be predicted. If a failure occurs at the rate of lambda per unit time, the expected time before the failure occurs and can be corrected is 1/lambda, the Mean Time Before Failure (MTBF). Finding and fixing a less frequent failure with the rate of lambda/2 per unit time requires twice as long, a time of 1/(2 lambda). Cutting the failure rate in half requires doubling the test and redesign time and finding and eliminating the failure causes. Reducing the failure rate significantly requires a heroic reliability improvement effort.
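The arithmetic in this passage (expected exposure time 1/λ to observe and then remove a failure mode, doubling with each halving of the rate) can be illustrated with a short sketch. The failure-mode rates below are assumptions chosen only to show how quickly the required test time grows as rarer failure causes are chased down.

```python
# Illustrative reliability-growth arithmetic: to observe (and then fix) a
# failure mode with rate lam (failures per unit time), the expected exposure
# before it first occurs is 1/lam. The rates below are assumed examples,
# expressed per 1000 hours of operation.

failure_rates = [1.0, 0.5, 0.25, 0.125, 0.0625]   # per 1000 h, most to least frequent

cumulative_test_time = 0.0                          # in units of 1000 h
system_rate = sum(failure_rates)
print(f"Initial system failure rate: {system_rate:.4f} per 1000 h")

for lam in failure_rates:
    cumulative_test_time += 1.0 / lam   # expected time to see this mode once
    system_rate -= lam                  # assume the cause is then eliminated
    print(f"after {cumulative_test_time:6.1f} (x1000 h) of testing, "
          f"remaining rate ~ {system_rate:.4f} per 1000 h")
```

In this example the cumulative test time roughly doubles at each step (1, 3, 7, 15, 31 thousand hours) while the residual failure rate is halved, which is exactly the trade that makes large reliability gains so expensive.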
Vessel thermal map real-time system for the JET tokamak
NASA Astrophysics Data System (ADS)
Alves, D.; Felton, R.; Jachmich, S.; Lomas, P.; McCullen, P.; Neto, A.; Valcárcel, D. F.; Arnoux, G.; Card, P.; Devaux, S.; Goodyear, A.; Kinna, D.; Stephen, A.; Zastrow, K.-D.
2012-05-01
The installation of international thermonuclear experimental reactor-relevant materials for the plasma facing components (PFCs) in the Joint European Torus (JET) is expected to have a strong impact on the operation and protection of the experiment. In particular, the use of all-beryllium tiles, which deteriorate at a substantially lower temperature than the formerly installed carbon fiber composite tiles, imposes strict thermal restrictions on the PFCs during operation. Prompt and precise responses are therefore required whenever anomalous temperatures are detected. The new vessel thermal map real-time application collects the temperature measurements provided by dedicated pyrometers and infrared cameras, groups them according to spatial location and probable offending heat source, and raises alarms that will trigger appropriate protective responses. In the context of the JET global scheme for the protection of the new wall, the system is required to run on a 10 ms cycle communicating with other systems through the real-time data network. In order to meet these requirements a commercial off-the-shelf solution has been adopted based on standard x86 multicore technology. Linux and the multithreaded application real-time executor (MARTe) software framework were respectively the operating system of choice and the real-time framework used to build the application. This paper presents an overview of the system with particular technical focus on the configuration of its real-time capability and the benefits of the modular development approach and advanced tools provided by the MARTe framework.
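The grouping-and-alarm step described above can be sketched schematically. The sketch below is not the JET/MARTe implementation; the tile groups, temperature limits, and example readings are placeholders introduced only to show the shape of a per-cycle evaluation.

```python
# Schematic of the vessel-thermal-map logic described above: collect
# temperature readings, group them by plasma-facing-component region, and
# raise an alarm for any group whose hottest reading exceeds its limit.
# Groups, limits, and readings are placeholders, not the JET configuration.

GROUP_LIMITS_C = {            # assumed per-region temperature limits
    "inner_wall_be": 800.0,
    "divertor_w": 1200.0,
    "limiter_be": 800.0,
}

def evaluate_cycle(readings):
    """readings: list of (group, temperature_C) pairs from pyrometers and IR
    cameras collected during one cycle. Returns the set of groups that should
    raise a protection alarm this cycle."""
    hottest = {}
    for group, temp in readings:
        hottest[group] = max(temp, hottest.get(group, float("-inf")))
    # Groups without a configured limit never alarm in this sketch.
    return {g for g, t in hottest.items() if t > GROUP_LIMITS_C.get(g, float("inf"))}

if __name__ == "__main__":
    # One 10 ms cycle's worth of (placeholder) measurements.
    cycle = [("inner_wall_be", 620.0), ("inner_wall_be", 815.0),
             ("divertor_w", 1100.0), ("limiter_be", 640.0)]
    print("Alarms:", evaluate_cycle(cycle))
```

In the real system this evaluation must complete within the 10 ms cycle and publish its alarms over the real-time data network so that the appropriate protective response can be triggered.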
NASA Technical Reports Server (NTRS)
Behling, Michael; Buchman, Donald; Marcus, Andres; Procopis, Stephanie; Wassgren, Carl; Ziemer, Sarah
1990-01-01
A proposal for an exploratory spacecraft mission to the Pluto/Charon system was written in response to the request for proposal (RFP) for an unmanned probe to Pluto. The design requirements of the RFP are presented, and under the guidance of these requirements the spacecraft Intrepid was designed. The RFP requirement of primary importance is the minimization of cost. Also, the reduction of flight time was of extreme importance because the atmosphere of Pluto is expected to collapse close to the year 2020. If Intrepid should arrive after the collapse, the mission would be a failure, for Pluto would be only a solid rock of ice. The topics presented include: (1) scientific instrumentation; (2) mission management, planning, and costing; (3) power and propulsion subsystem; (4) structural subsystem; (5) command, control, and communications; and (6) attitude and articulation control.
NASA Technical Reports Server (NTRS)
St.denis, R. W.
1981-01-01
The feasibility of using optical data handling methods to transmit payload checkout and telemetry data is discussed. Optical communications are superior to conventional communication systems for the following reasons: high data capacity optical channels; small and lightweight optical cables; and optical signal immunity to electromagnetic interference. Task number one analyzed the ground checkout data requirements that may be expected from the payload community. Task number two selected the optical approach based on the interface requirements, the location of the interface, the amount of time required to reconfigure hardware, and the method of transporting the optical signal. Task number three surveyed and selected optical components for the two payload data links. Task number four made a qualitative comparison of the conventional electrical communication system and the proposed optical communication system.
Integrating Aggregate Exposure Pathway (AEP) and Adverse ...
High throughput toxicity testing (HTT) holds the promise of providing data for tens of thousands of chemicals that currently have no data due to the cost and time required for animal testing. Interpretation of these results requires information linking the perturbations seen in vitro with adverse outcomes in vivo, as well as knowledge of how estimated exposure to the chemicals compares to the in vitro concentrations that show an effect. This abstract discusses how Adverse Outcome Pathways (AOPs) can be used to link HTT with adverse outcomes of regulatory significance and how Aggregate Exposure Pathways (AEPs) can connect concentrations of environmental stressors at a source with an expected target site concentration, designed to provide exposure estimates that are comparable to concentrations identified in HTT. Presentation at the ICCA-LRI and JRC Workshop: Fit-For-Purpose Exposure Assessment For Risk-Based Decision Making.
ERIC Educational Resources Information Center
Wolbring, Gregor
2012-01-01
Citizenship education has been debated for some time and has faced various challenges over time. This paper introduces the lens of "ableism" and ability expectations to the citizenship education discourse. The author contends that the cultural dynamic of ability expectations and ableism (not only expecting certain abilities, but also…
Gilliam, Eric; Thompson, Megan; Vande Griend, Joseph
2017-01-01
Objective. To develop a community pharmacy-based medication therapy management (MTM) advanced pharmacy practice experience (APPE) that provides students with skills and knowledge to deliver entry-level pharmacy MTM services. Design. The University of Colorado Skaggs School of Pharmacy & Pharmaceutical Sciences (SSPPS) partnered with three community pharmacy chains to establish this three-week, required MTM APPE. Students completed the American Pharmacists Association MTM Certificate Course prior to entering the APPE. Students were expected to spend 90% or more of their time at this experience working on MTM interventions, using store MTM platforms. Assessment. All 151 students successfully completed this MTM APPE, and each received a passing evaluation from their preceptor. Preceptor evaluations of students averaged above four (entry-level practice) on a five-point Likert scale. The majority of students reported engagement in MTM services for more than 80% of the time on site. Students’ self-reporting of their ability to perform MTM interventions improved after participation in the APPE. Conclusion. The SSPPS successfully implemented a required MTM APPE, preparing students for entry-level delivery of MTM services. PMID:28381896
Mobile contract services: what you need to know.
Inman, M
2000-01-01
With sufficient planning and ongoing attention to detail, the performance of a mobile imaging service provider can exceed expectations and requirements. The relationship can prove to be mutually agreeable and profitable for many years. But, when contracting mobile services, you cannot spend too much time on initial research and detail. Several scenarios present outsourcing or mobile services as an acceptable alternative to purchase or lease: outdated equipment, novel or under-utilized technologies, the need for incrementally added or temporary service. To find suitable providers, check with peer sources in your area for recommendations; look specifically for facilities that are comparable in size and volume to your facility. Expect that larger volume facilities will rate more favorable schedules or pricing. Obtain and check references. Require mobile service providers to adhere to the same state and federal laws, rules and regulations that govern your facility; receive the assurance of compliance in writing if it is not specifically addressed in the contract. JCAHO requires that any contract service provider be governed by the same requirements as the accredited facility. Several other rules or licensing requirements may also pertain to mobile services. A prevailing reason for outsourcing imaging services is high equipment costs that cannot be justified with current volume projections. However, equipment quality should not be compromised; it must meet your needs and be in good repair. The mobile service provider you choose should be an extension of your department; quality standards must exist unilaterally. The set rule for assessing mobile service fees is that there is no set rule. There are many ways to negotiate the fee schedule so that it meets the needs of both parties. An effective marketing campaign lets physicians and patients know what you have available. Work with the mobile service provider to plan an initial announcement or open house. The mobile provider should also have patient education materials for referring physicians and your hospital. Having the mobile technologist meet with the radiologists to discuss written protocols will eliminate misunderstandings concerning expectations of both parties; ongoing communication is vital.
Search for standard model production of four top quarks in proton–proton collisions at s = 13 TeV
Sirunyan, A. M.; Tumasyan, A.; Adam, W.; ...
2017-06-28
A search for events containing four top quarks ($$t\bar{t}t\bar{t}$$) is reported from proton-proton collisions recorded by the CMS experiment at $$\sqrt{s}$$ = 13 TeV, corresponding to an integrated luminosity of 2.6 fb$$^{-1}$$. The analysis considers the single-lepton (e or $$\mu$$) + jets and the opposite-sign dilepton ($$\mu^{+}\mu^{-}$$, $$\mu^{\pm}e^{\mp}$$, or $$e^{+}e^{-}$$) + jets channels. It uses boosted decision trees that combine information on global event and jet properties to distinguish between $$t\bar{t}t\bar{t}$$ and $$t\bar{t}$$ production. The number of events observed after all selection requirements is consistent with expectations from background and standard model signal predictions, and an upper limit of 94 fb is set at 95% confidence level on the cross section for $$t\bar{t}t\bar{t}$$ production in the standard model (10.2 times the prediction), with an expected limit of 118 fb. Combining this result with the published CMS search in the same-sign dilepton channel improves the limit to 69 fb at 95% confidence level (7.4 times the prediction), with an expected limit of 71 fb. These are the strongest constraints on the rate of $$t\bar{t}t\bar{t}$$ production to date.
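For readers unfamiliar with the classification step, the following is a minimal, hypothetical sketch of training a boosted decision tree on a handful of global event and jet variables to separate a rare signal from a dominant background. The feature names (HT-like scale, jet multiplicity, b-tag multiplicity, an event-shape variable) and the toy data are illustrative assumptions, not the CMS analysis inputs.

```python
# Minimal sketch: a boosted decision tree separating "signal" from "background"
# events using a few global event/jet variables. Variables and toy data are
# illustrative placeholders only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000

def toy_events(is_signal, n):
    # Signal events are drawn with somewhat harder kinematics than background.
    shift = 1.0 if is_signal else 0.0
    return np.column_stack([
        rng.normal(3.0 + shift, 1.0, n),          # HT-like variable (arbitrary units)
        rng.poisson(7 + 2 * is_signal, n),        # jet multiplicity
        rng.poisson(1 + is_signal, n),            # b-tag multiplicity
        rng.uniform(0, 1, n) ** (1 + is_signal),  # event-shape-like variable
    ])

X = np.vstack([toy_events(1, n), toy_events(0, n)])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
bdt.fit(X_tr, y_tr)
print("toy ROC AUC:", roc_auc_score(y_te, bdt.predict_proba(X_te)[:, 1]))
```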
Wieringa, Gijsbert; Zerah, Simone; Jansen, Rob; Simundic, Ana-Maria; Queralto, José; Solnica, Bogdan; Gruson, Damien; Tomberg, Karel; Riittinen, Leena; Baum, Hannsjörg; Brochet, Jean-Philippe; Buhagiar, Gerald; Charilaou, Charis; Grigore, Camelia; Johnsen, Anders H; Kappelmayer, Janos; Majkic-Singh, Nada; Nubile, Giuseppe; O'Mullane, John; Opp, Matthias; Pupure, Silvija; Racek, Jaroslav; Reguengo, Henrique; Rizos, Demetrios; Rogic, Dunja; Špaňár, Július; Štrakl, Greta; Szekeres, Thomas; Tzatchev, Kamen; Vitkus, Dalius; Wallemacq, Pierre; Wallinder, Hans
2012-08-01
Laboratory medicine's practitioners across the European community include medical, scientific and pharmacy trained specialists whose contributions to health and healthcare are in the application of diagnostic tests for screening and early detection of disease, differential diagnosis, monitoring, management and treatment of patients, and their prognostic assessment. In submitting a revised common syllabus for post-graduate education and training across the 27 member states, an expectation is set for harmonised, high quality, safe practice. In this regard, an extended 'Core knowledge, skills and competencies' division embracing all laboratory medicine disciplines is described. For the first time, the syllabus identifies the competencies required to meet clinical leadership demands for defining, directing and assuring the efficiency and effectiveness of laboratory services, as well as expectations in translating knowledge and skills into the ability to practice. In a 'Specialist knowledge' division, the expectations from the individual disciplines of Clinical Chemistry/Immunology, Haematology/Blood Transfusion, Microbiology/Virology, Genetics and In Vitro Fertilisation are described. Beyond providing a common platform of knowledge, skills and competency, the syllabus supports the aims of the European Commission in providing safeguards for increasing professional mobility across European borders at a time when demand for highly qualified professionals is increasing and the labour force is declining. It continues to act as a guide for the formulation of national programmes supplemented by the needs of individual country priorities.
Control and data acquisition upgrades for NSTX-U
Davis, W. M.; Tchilinguirian, G. J.; Carroll, T.; ...
2016-06-06
The extensive NSTX Upgrade (NSTX-U) Project includes major components which allow a doubling of the toroidal field strength to 1 T, of the Neutral Beam heating power to 12 MW, and of the plasma current to 2 MA, as well as substantial structural enhancements to withstand the increased electromagnetic loads. The maximum pulse length will go from 1.5 to 5 s. The coils will be protected against the larger and more complex forces by a Digital Coil Protection System, which requires demanding real-time data input rates, calculations and responses. The amount of conventional digitized data for a given pulse is expected to increase from 2.5 to 5 GB per second of pulse. 2-D Fast Camera data is expected to go from 2.5 to 10 GB/pulse, and another 2 GB/pulse is expected from new IR cameras. Network capacity will be increased by a factor of 10, with 10 Gb/s fibers used for the major trunks. 32-core Linux systems will be used for several functions, including between-shot data processing, MDSplus data serving, between-shot EFIT analysis, real-time processing, and, for a new capability, between-shot TRANSP. Improvements to the MDSplus events subsystem will be made through the use of both UDP and TCP/IP based methods and the addition of a dedicated "event server".
Collisions in primordial star clusters. Formation pathway for intermediate mass black holes
NASA Astrophysics Data System (ADS)
Reinoso, B.; Schleicher, D. R. G.; Fellhauer, M.; Klessen, R. S.; Boekholt, T. C. N.
2018-06-01
Collisions were suggested to potentially play a role in the formation of massive stars in present day clusters, and have likely been relevant during the formation of massive stars and intermediate mass black holes within the first star clusters. In the early Universe, the first stellar clusters were particularly dense, as fragmentation typically only occurred at densities above 10⁹ cm⁻³, and the radii of the protostars were enhanced as a result of larger accretion rates, suggesting a potentially more relevant role of stellar collisions. We present here a detailed parameter study to assess how the number of collisions and the mass growth of the most massive object depend on the properties of the cluster. We also characterize the time evolution with three effective parameters: the time when most collisions occur, the duration of the collisions period, and the normalization required to obtain the total number of collisions. We apply our results to typical Population III (Pop. III) clusters of about 1000 M⊙, finding that a moderate enhancement of the mass of the most massive star by a factor of a few can be expected. For more massive Pop. III clusters as expected in the first atomic cooling halos, we expect a more significant enhancement by a factor of 15-32. We therefore conclude that collisions in massive Pop. III clusters were likely relevant to form the first intermediate mass black holes.
Educational inequalities in mortality by cause of death: first national data for the Netherlands.
Kulhánová, Ivana; Hoffmann, Rasmus; Eikemo, Terje A; Menvielle, Gwenn; Mackenbach, Johan P
2014-10-01
Using new facilities for linking large databases, we aimed to evaluate for the first time the magnitude of relative and absolute educational inequalities in mortality by sex and cause of death in the Netherlands. We analyzed data from Dutch Labour Force Surveys (1998-2002) with mortality follow-up 1998-2007 among people aged 30-79 years. We calculated hazard ratios using a Cox proportional hazards model, age-standardized mortality rates and partial life expectancy by education. We compared results for the Netherlands with those for other European countries. The relative risk of dying was about two times higher among primary educated men and women as compared to their tertiary educated counterparts, leading to a gap in partial life expectancy of 3.4 years (men) and 2.4 years (women). Inequalities in mortality are similar to those in other countries in North-Western Europe, but inequalities in lung cancer mortality are substantially larger in the Netherlands, particularly among men. The Netherlands has large inequalities in mortality, especially for smoking-related causes of death. These large inequalities require the urgent attention of policy makers.
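As a rough illustration of the hazard-ratio step described above, the sketch below fits a Cox proportional hazards model to a simulated table of follow-up times, death indicators, and education levels. The column names, the simulated hazards, and the covariate coding are assumptions made for illustration; they do not reproduce the Dutch record linkage or its results.

```python
# Hedged sketch: hazard ratios by education level from a Cox proportional
# hazards model (lifelines). All data are simulated placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
education = rng.integers(0, 3, n)              # 0 = tertiary, 1 = secondary, 2 = primary
age = rng.uniform(30, 79, n)
# Hypothetical hazard that rises with age and with lower education.
hazard = 0.01 * np.exp(0.35 * education + 0.04 * (age - 55))
time_to_death = rng.exponential(1.0 / hazard)
followup = np.minimum(time_to_death, 10.0)     # administrative censoring at 10 years
died = (time_to_death <= 10.0).astype(int)

df = pd.DataFrame({"followup_years": followup, "died": died,
                   "education": education, "age": age})
cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()   # exp(coef) of "education" ~ hazard ratio per education step
```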
Plasma diffusion at the magnetopause? The case of lower hybrid drift waves
NASA Technical Reports Server (NTRS)
Treumann, R. A.; Labelle, J.; Pottelette, R.; Gary, S. P.
1990-01-01
The diffusion expected from the quasilinear theory of the lower hybrid drift instability at the Earth's magnetopause is recalculated. The resulting diffusion coefficient is in principle just marginally large enough to explain the thickness of the boundary layer under quiet conditions, based on observational upper limits for the wave intensities. Thus, one possible model for the boundary layer could involve equilibrium between the diffusion arising from lower hybrid waves and various low processes. However, some recent data and simulations seem to indicate that the magnetopause is not consistent with such a soft diffusive equilibrium model. Furthermore, investigation of the nonlinear equations for the lower hybrid waves for magnetopause parameters indicates that the quasilinear state may never arise because coalescence to large wavelengths, followed by collapse once a critical wavelength is reached, occurs on a time scale faster than the quasilinear diffusion. In this case, an inhomogeneous boundary layer is to be expected. More simulations are required over longer time periods to explore whether this nonlinear evolution really takes place at the magnetopause.
Hogg, William; Kendall, Claire; Muggah, Elizabeth; Mayo-Bruinsma, Liesha; Ziebell, Laura
2014-01-01
Abstract Problem addressed A key priority in primary health care research is determining how to ensure the advancement of new family physician clinician investigators (FP-CIs). However, there is little consensus on what expectations should be implemented for new investigators to ensure the successful and timely acquisition of independent salary support. Objective of program Support new FP-CIs to maximize early career research success. Program description This program description aims to summarize the administrative and financial support provided by the C.T. Lamont Primary Health Care Research Centre in Ottawa, Ont, to early career FP-CIs; delineate career expectations; and describe the results in terms of research productivity on the part of new FP-CIs. Conclusion Family physician CI’s achieved a high level of research productivity during their first 5 years, but most did not secure external salary support. It might be unrealistic to expect new FP-CIs to be self-financing by the end of 5 years. This is a career-development program, and supporting new career FP-CIs requires a long-term investment. This understanding is critical to fostering and strengthening sustainable primary care research programs. PMID:24522688
Evaluation of GOES encoder lamps
NASA Technical Reports Server (NTRS)
Viehmann, W.; Helmold, N.
1983-01-01
Aging characteristics and life expectancies of flight quality, tungsten filament, encoder lamps are similar to those of 'commercial' grade gas filled lamps of similar construction, filament material and filament temperature. The aging and final failure by filament burnout are caused by single crystal growth over large portions of the filament with the concomitant development of facets and notches resulting in reduction of cross section and mechanical weakening of the filament. The life expectancy of presently produced lamps is about one year at their nominal operating voltage of five volts dc. At 4.5 volts, it is about two years. These lifetimes are considerably shorter, and the degradation rates of lamp current and light flux are considerably higher, than were observed in the laboratory and in orbit on lamps of the same type manufactured more than a decade ago. It is speculated that the filaments of these earlier lamps contained a crystallization retarding dopant, possibly thorium oxide. To obtain the desired life expectancy of four years or more in present lamps, operating voltages of four volts dc or less would be required.
Rabl, Ari
2006-02-01
Information on life expectancy change is of great concern for policy makers, as evidenced by the discussions of the so-called "harvesting" issue (i.e., the question of how large a loss each death in the mortality results of time series studies corresponds to). Whereas most epidemiological studies of air pollution mortality have been formulated in terms of mortality risk, this paper shows that a formulation in terms of life expectancy change is mathematically equivalent, but offers several advantages: it automatically takes into account the constraint that everybody dies exactly once, regardless of pollution; it provides a unified framework for time series, intervention studies and cohort studies; and in time series and intervention studies, it yields the life expectancy change directly as a time integral of the observed mortality rate. Results are presented for life expectancy change in time series studies. Determination of the corresponding total number of attributable deaths (as opposed to the number of observed deaths) is shown to be problematic. The time variation of mortality after a change in exposure is shown to depend on the processes by which the body can repair air pollution damage, in particular on their time constants. Hypothetical results are presented for repair models that are plausible in view of the available intervention studies of air pollution and of smoking cessation. If these repair models can also be assumed for acute effects, the results of cohort studies are compatible with those of time series. The proposed life expectancy framework provides information on the life expectancy change in time series studies, and it clarifies the relation between the results of time series, intervention, and cohort studies.
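A compact way to see why a life-expectancy formulation and a mortality-risk formulation carry the same information is the standard survival-analysis identity below. This is a generic textbook relation offered as context and assumes a hazard (mortality rate) perturbation Δμ(t); it is not the paper's specific derivation.

```latex
% Life expectancy as the integral of the survival function, and the first-order
% effect of a perturbation \Delta\mu(t) in the mortality rate (hazard) \mu(t).
\begin{align}
  LE &= \int_0^{\infty} S(t)\,\mathrm{d}t,
  \qquad S(t) = \exp\!\left(-\int_0^{t} \mu(u)\,\mathrm{d}u\right) \\
  \Delta LE &\approx -\int_0^{\infty} S(t)\left(\int_0^{t} \Delta\mu(u)\,\mathrm{d}u\right)\mathrm{d}t
\end{align}
```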
Luo, Qunying; O'Leary, Garry; Cleverly, James; Eamus, Derek
2018-06-01
Climate change (CC) presents a challenge for the sustainable development of wheat production systems in Australia. This study aimed to (1) quantify the impact of future CC on wheat grain yield for the period centred on 2030 from the perspectives of wheat phenology, water use and water use efficiency (WUE) and (2) evaluate the effectiveness of changing sowing times and cultivars in response to the expected impacts of future CC on wheat grain yield. The daily outputs of CSIRO Conformal-Cubic Atmospheric Model for baseline and future periods were used by a stochastic weather generator to derive changes in mean climate and in climate variability and to construct local climate scenarios, which were then coupled with a wheat crop model to achieve the two research aims. We considered three locations in New South Wales, Australia, six times of sowing (TOS) and three bread wheat (Triticum aestivum L.) cultivars in this study. Simulation results show that in 2030 (1) for impact analysis, wheat phenological events are expected to occur earlier and crop water use is expected to decrease across all cases (the combination of three locations, six TOS and three cultivars), wheat grain yield would increase or decrease depending on locations and TOS; and WUE would increase in most of the cases; (2) for adaptation considerations, the combination of TOS and cultivars with the highest yield varied across locations. Wheat growers at different locations will require different strategies in managing the negative impacts or taking the opportunities of future CC.
47 CFR 101.1011 - Construction requirements and criteria for renewal expectancy.
Code of Federal Regulations, 2010 CFR
2010-10-01
... applicant involved in a comparative renewal proceeding shall receive a preference, commonly referred to as a renewal expectancy, that is the most important comparative factor to be considered in the proceeding as... comparative renewal proceeding must submit a showing explaining why it should receive a renewal expectancy. At...
47 CFR 101.1011 - Construction requirements and criteria for renewal expectancy.
Code of Federal Regulations, 2014 CFR
2014-10-01
... applicant involved in a comparative renewal proceeding shall receive a preference, commonly referred to as a renewal expectancy, that is the most important comparative factor to be considered in the proceeding as... comparative renewal proceeding must submit a showing explaining why it should receive a renewal expectancy. At...
47 CFR 101.1011 - Construction requirements and criteria for renewal expectancy.
Code of Federal Regulations, 2012 CFR
2012-10-01
... applicant involved in a comparative renewal proceeding shall receive a preference, commonly referred to as a renewal expectancy, that is the most important comparative factor to be considered in the proceeding as... comparative renewal proceeding must submit a showing explaining why it should receive a renewal expectancy. At...
47 CFR 101.1011 - Construction requirements and criteria for renewal expectancy.
Code of Federal Regulations, 2011 CFR
2011-10-01
... applicant involved in a comparative renewal proceeding shall receive a preference, commonly referred to as a renewal expectancy, that is the most important comparative factor to be considered in the proceeding as... comparative renewal proceeding must submit a showing explaining why it should receive a renewal expectancy. At...
47 CFR 101.1011 - Construction requirements and criteria for renewal expectancy.
Code of Federal Regulations, 2013 CFR
2013-10-01
... applicant involved in a comparative renewal proceeding shall receive a preference, commonly referred to as a renewal expectancy, that is the most important comparative factor to be considered in the proceeding as... comparative renewal proceeding must submit a showing explaining why it should receive a renewal expectancy. At...
Patients' expectations of asthma treatment.
Mancuso, Carol A; Rincon, Melina; Robbins, Laura; Charlson, Mary E
2003-12-01
A multicomponent model has been developed to explain patients' unmet expectations of medical care. The model proposes that expectations are related to patients' personal experiences with illness, perceived vulnerability to disease, transmitted knowledge, and perceived severity of disease. The objective of this cross-sectional study was to determine whether this model can be applied to patients' unrealistic expectations of treatment outcomes, specifically expecting to be cured of asthma. In total, 230 patients observed in a primary care practice in New York City were interviewed in person with open-ended questions about their expectations of asthma treatment. Responses were analyzed with qualitative techniques to generate categories of expectations. Patients had a mean age of 41 +/- 11 years, 21% were white, 30% African American, 42% Latino, and 7% other groups. Major categories of expectations were generated from patients' responses and included symptom relief (expected by 52%), cure (36%), improved physical function (21%), and improved psychological well-being (15%). The category of expecting a cure was assessed with patients' responses to the following items representing components of the model: 1) resource utilization and medication requirements for asthma (representing severity of disease); 2) perceived quality of asthma care and satisfaction with care (representing past asthma experiences); 3) the Asthma Self-Efficacy Scale (representing perceived vulnerability to exacerbations); and 4) experiences of social network contacts with asthma and the Check Your Asthma IQ survey (representing transmitted knowledge). In bivariate analysis, patients who expected a cure were more likely to be Latino or Native American or Asian (p = 0.02), to have never required oral corticosteroids (p = 0.004), to be dissatisfied with the status of their asthma (p = 0.008), to know others who were limited by asthma (p = 0.03), to have worse Asthma Self-Efficacy Scale scores (p = 0.002), to have worse Check Your Asthma IQ scores (p = 0.04), and to currently be taking inhaled corticosteroids (p = 0.03). In multivariate analysis, worse asthma self-efficacy (p = 0.008), never having required oral corticosteroids (p = 0.005), and currently taking inhaled corticosteroids (p = 0.05) remained associated with expecting a cure. As a result of this study, we found that patients have multiple expectations of asthma treatment, including realistic expectations such as symptom relief and improved function, as well as unrealistic expectations, specifically to be cured of asthma. A multicomponent model of patient and disease characteristics was associated with this unrealistic expectation. These findings indicate that clinicians can intervene in diverse areas to foster realistic expectations of treatment outcomes among asthma patients.
The use of AlloDerm in postmastectomy alloplastic breast reconstruction: part II. A cost analysis.
Jansen, Leigh A; Macadam, Sheina A
2011-06-01
Increasingly, AlloDerm is being used in alloplastic breast reconstruction, and has been the subject of a recent systematic review. The authors' objective was to perform a cost analysis comparing direct-to-implant with AlloDerm reconstruction to two-stage non-AlloDerm reconstruction. Seven clinically important health outcomes and their probabilities for both types of reconstruction were derived from the recent review. A decision analytic model from the Canadian provincial payer's perspective was constructed based on these health states. Direct medical costs were estimated from a university-based hospital, yielding expected costs for direct-to-implant reconstruction with AlloDerm and two-stage non-AlloDerm reconstruction. Sensitivity analyses were conducted. Baseline and expected costs were calculated for direct-to-implant AlloDerm and two-stage non-AlloDerm reconstruction. Direct-to-implant reconstruction with AlloDerm was found to be less expensive in the baseline ($10,240 versus $10,584) and expected cost ($10,734 versus $11,251) using a 6 × 16-cm AlloDerm sheet. With a 6 × 12-cm sheet, expected cost falls to $9673. By increasing direct-to-implant operative time from 2 hours to 2.5 hours, expected cost rises to $11,784. If capsular contracture rate requiring revision is set at 15 percent for both procedures, expected costs are $10,926 and $11,251 for direct-to-implant and two-stage procedures, respectively. If the capsular contracture rate is lowered for either procedure, this has minimal impact on expected cost. Although AlloDerm is expensive, it appears to be cost-effective if used for direct-to-implant breast reconstruction. The methods used here may be extrapolated to different centers incorporating local costs and complication rates. A formal randomized controlled trial, including costs, is recommended.
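The expected-cost calculation in such a decision-analytic model is a probability-weighted sum over the health-outcome states. The sketch below shows that arithmetic with purely illustrative probabilities and costs; the numbers and outcome labels are placeholders and are not the values reported in the study.

```python
# Hedged sketch: expected cost of a reconstruction strategy as a
# probability-weighted sum over health-outcome states. All numbers are
# illustrative placeholders, not the study's probabilities or costs.
def expected_cost(base_cost, outcomes):
    """outcomes: list of (probability, added_cost_if_outcome_occurs)."""
    return base_cost + sum(p * c for p, c in outcomes)

direct_to_implant = expected_cost(
    base_cost=9000.0,
    outcomes=[(0.05, 4000.0),   # e.g., capsular contracture requiring revision
              (0.03, 2500.0),   # e.g., infection requiring explantation
              (0.02, 1500.0)],  # e.g., seroma requiring drainage
)
two_stage = expected_cost(
    base_cost=9500.0,
    outcomes=[(0.05, 4000.0), (0.03, 2500.0), (0.02, 1500.0)],
)
print(f"direct-to-implant: ${direct_to_implant:,.0f}  two-stage: ${two_stage:,.0f}")
```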
Taksler, Glen B; Perzynski, Adam T; Kattan, Michael W
2017-04-01
Recommendations for colorectal cancer screening encourage patients to choose among various screening methods based on individual preferences for benefits, risks, screening frequency, and discomfort. We devised a model to illustrate how individuals with varying tolerance for screening complications risk might decide on their preferred screening strategy. We developed a discrete-time Markov mathematical model that allowed hypothetical individuals to maximize expected lifetime utility by selecting screening method, start age, stop age, and frequency. Individuals could choose from stool-based testing every 1 to 3 years, flexible sigmoidoscopy every 1 to 20 years with annual stool-based testing, colonoscopy every 1 to 20 years, or no screening. We compared the life expectancy gained from the chosen strategy with the life expectancy available from a benchmark strategy of decennial colonoscopy. For an individual at average risk of colorectal cancer who was risk neutral with respect to screening complications (and therefore was willing to undergo screening if it would actuarially increase life expectancy), the model predicted that he or she would choose colonoscopy every 10 years, from age 53 to 73 years, consistent with national guidelines. For a similar individual who was moderately averse to screening complications risk (and therefore required a greater increase in life expectancy to accept potential risks of colonoscopy), the model predicted that he or she would prefer flexible sigmoidoscopy every 12 years with annual stool-based testing, with 93% of the life expectancy benefit of decennial colonoscopy. For an individual with higher risk aversion, the model predicted that he or she would prefer 2 lifetime flexible sigmoidoscopies, 20 years apart, with 70% of the life expectancy benefit of decennial colonoscopy. Mathematical models may formalize how individuals with different risk attitudes choose between various guideline-recommended colorectal cancer screening strategies.
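To make "choosing the strategy that maximizes expected utility under a personal risk attitude" concrete, here is a heavily simplified, hypothetical sketch: a few candidate strategies are scored as expected life-years gained minus a risk-aversion penalty on expected complications, and the best one is selected. The strategy list, the numbers, and the linear penalty form are invented for illustration and are far simpler than the discrete-time Markov model described in the abstract.

```python
# Hedged sketch: choose a screening strategy by maximizing
#   expected life-years gained  -  risk_aversion * expected complications.
# Strategies and numbers are illustrative placeholders only.
strategies = {
    # name: (life_years_gained, expected_complications_per_person)
    "no screening":                    (0.00, 0.000),
    "colonoscopy every 10 years":      (0.25, 0.010),
    "sigmoidoscopy q12y + annual FIT": (0.23, 0.004),
    "two lifetime sigmoidoscopies":    (0.18, 0.002),
}

def best_strategy(risk_aversion):
    """risk_aversion: life-years given up to avoid one expected complication."""
    def utility(item):
        life_years, complications = item[1]
        return life_years - risk_aversion * complications
    return max(strategies.items(), key=utility)[0]

for ra in (0.0, 5.0, 50.0):   # risk neutral, moderately averse, highly averse
    print(f"risk aversion {ra:>4}: {best_strategy(ra)}")
```

With these placeholder numbers the choice shifts from decennial colonoscopy to sigmoidoscopy-based strategies as risk aversion grows, mirroring the qualitative pattern the abstract describes.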
TH-E-201-00: Teaching Radiology Residents: What, How, and Expectation
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The ABR Core Examination stresses integrating physics into real-world clinical practice and, accordingly, has shifted its focus from passive recall of facts to active application of physics principles. Physics education of radiology residents poses a challenge. The traditional method of didactic lectures alone is insufficient, yet it is difficult to incorporate physics teaching consistently into clinical rotations due to time constraints. Faced with this challenge, diagnostic medical physicists who teach radiology residents have been thinking about how to adapt their teaching to the new paradigm, what to teach, and how to meet the expectations of the radiology resident and the radiology residency program. The proposed lecture attempts to discuss the above questions. The newly developed diagnostic radiology resident physics curriculum by the AAPM Imaging Physics Curricula Subcommittee will be reviewed. Initial experience with hands-on physics teaching will be discussed. A radiology resident who will have taken the ABR Core Examination will share expectations of physics teaching from a resident perspective. The lecture will help develop robust educational approaches to prepare radiology residents for safer and more effective lifelong practice. Learning Objectives: (1) Learn updated physics requirements for radiology residents; (2) pursue effective approaches to teach physics to radiology residents; (3) learn expectations of physics teaching from the resident perspective. J. Zhang: This topic is partially supported by an RSNA Education Scholar Grant.
Murty, Vishnu P.; Adcock, R. Alison
2014-01-01
Learning how to obtain rewards requires learning about their contexts and likely causes. How do long-term memory mechanisms balance the need to represent potential determinants of reward outcomes with the computational burden of an over-inclusive memory? One solution would be to enhance memory for salient events that occur during reward anticipation, because all such events are potential determinants of reward. We tested whether reward motivation enhances encoding of salient events like expectancy violations. During functional magnetic resonance imaging, participants performed a reaction-time task in which goal-irrelevant expectancy violations were encountered during states of high- or low-reward motivation. Motivation amplified hippocampal activation to and declarative memory for expectancy violations. Connectivity of the ventral tegmental area (VTA) with medial prefrontal, ventrolateral prefrontal, and visual cortices preceded and predicted this increase in hippocampal sensitivity. These findings elucidate a novel mechanism whereby reward motivation can enhance hippocampus-dependent memory: anticipatory VTA-cortical–hippocampal interactions. Further, the findings integrate literatures on dopaminergic neuromodulation of prefrontal function and hippocampus-dependent memory. We conclude that during reward motivation, VTA modulation induces distributed neural changes that amplify hippocampal signals and records of expectancy violations to improve predictions—a potentially unique contribution of the hippocampus to reward learning. PMID:23529005
NASA Astrophysics Data System (ADS)
Reed, D. E.; Jones, G.; Heaney, A.
2013-12-01
Retention in the STEM fields is often a focus for higher education due to a shortage of trained workforce members. In particular, much effort has been spent on first year retention rates and introductory level courses under the assumption that students are more likely to drop out of STEM majors early in their higher education degree progress. While the retention rates of women, minorities, and low income students have been a priority by both the National Science Foundation and the private sector, we are interested in at-risk first year students for this study. The University of Wyoming Synergy Program's goal is to promote academic success and retention for underprepared and at-risk students by creating a series of first semester curricula as theme-based college transition skills courses that are paired with English courses. This creates a cohort group of courses for the students with increased communication between instructors at the same time allowing greater development of student social networks. In this study we are highlighting the results of the STEM students as compared with other at-risk participants in the program. The Synergy Program enrolls approximately 144 students each year with pre- and post-course surveys that directly measure which college skills students select as important as well as student expectations of the amount of time required for STEM courses. Follow-up surveys track the same queries for students who persist to their junior and senior year. In addition, instructors complete a summative survey about skills they find important to student success and individual student's challenges and successes with a variety of skills. Our results show a large gap in skills between those identified as important by students and those identified by their instructors. Expectations for the amount of time required to complete work for STEM courses and the reported time spent on course work are not constant when progressing throughout college. This analysis will show other higher education instructors both the course design and results from this study of at-risk students. Our results will include specific strategies for instructors or institutes to enhance STEM retention while increasing the overall college success of at-risk freshmen through this innovative course design.
Escudero-Carretero, María J; Prieto-Rodríguez, Mángeles; Fernández-Fernández, Isabel; March-Cerdá, Joan Carles
2007-12-01
Objective: to understand the expectations held by type 1 and 2 diabetes mellitus (DM 1 & 2) patients and their relatives regarding the health-care provided to them. Design: qualitative; focus groups. Setting: Andalusia. Participants: a theoretical sample that includes the most characteristic profiles; thirty-one subjects with DM. Segmentation characteristics: receiving health-care for DM in primary or specialized care, living in urban and rural areas, men and women, age, varying diagnosis times, DM course and consequences. Subjects were recruited by health-care professionals at reference care centres. Results: patients expect their health-care professionals to be understanding, to treat them with kindness and respect, to have good communication skills, and to provide information in a non-authoritarian manner while fully acknowledging patients' know-how. Regarding the health-care system, their expectations focus on the system's ability to respond when required to do so, through a relevant professional, along with readily available equipment for treatment. The expectations of people affected by DM1 focus on leading a normal life and not having their educational, labour, social and family opportunities limited by the disease. Expectations in people with DM2 tend towards avoiding what they know has happened to other patients. Conclusions: 'facilitating' is a key word. Both the health-care system and its professionals must pay keener attention to the emotional aspects of the disease and its process, adopting a comprehensive approach to care. It is vital that health-care professionals take an active interest in the course of their patient's disease, promoting accessibility and an atmosphere of trust and flexibility.
Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (Orion)
NASA Technical Reports Server (NTRS)
DeMott, Diana L.; Bigler, Mark A.
2017-01-01
NASA (National Aeronautics and Space Administration) Johnson Space Center (JSC) Safety and Mission Assurance (S&MA) uses two human reliability analysis (HRA) methodologies. The first is a simplified method which is based on how much time is available to complete the action, with consideration included for environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value or placeholder as a preliminary estimate. This preliminary estimate or screening value is used to determine which placeholder needs a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment on the performance of critical human actions. This assessment needs to consider more than the time available; it would include factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists and internal human stresses. The more detailed assessment is expected to be more realistic than that based primarily on time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on this history and experience. In the case of a Probabilistic Risk Assessment (PRA) that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more challenging. To determine what is expected of future operational parameters, the experience of individuals who were familiar with the systems and processes previously implemented by NASA was used to provide the "best" available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators, and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. Verification of the assumptions and expectations expressed in the assessments will be needed when the procedures, flight rules, and operational requirements are developed and then finalized.
Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (Orion)
NASA Technical Reports Server (NTRS)
DeMott, Diana; Bigler, Mark
2016-01-01
NASA (National Aeronautics and Space Administration) Johnson Space Center (JSC) Safety and Mission Assurance (S&MA) uses two human reliability analysis (HRA) methodologies. The first is a simplified method which is based on how much time is available to complete the action, with consideration included for environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value or placeholder as a preliminary estimate. This preliminary estimate or screening value is used to determine which placeholder needs a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment on the performance of critical human actions. This assessment needs to consider more than the time available; it would include factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists and internal human stresses. The more detailed assessment is expected to be more realistic than that based primarily on time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on this history and experience. In the case of a Probabilistic Risk Assessment (PRA) that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more challenging. In order to determine what is expected of future operational parameters, the experience of individuals who were familiar with the systems and processes previously implemented by NASA was used to provide the "best" available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. Verification of the assumptions and expectations expressed in the assessments will be needed when the procedures, flight rules and operational requirements are developed and then finalized.
Architectural requirements for the Red Storm computing system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camp, William J.; Tomkins, James Lee
This report is based on the Statement of Work (SOW) describing the various requirements for delivering a new supercomputer system to Sandia National Laboratories (Sandia) as part of the Department of Energy's (DOE) Accelerated Strategic Computing Initiative (ASCI) program. This system is named Red Storm and will be a distributed memory, massively parallel processor (MPP) machine built primarily out of commodity parts. The requirements presented here distill extensive architectural and design experience accumulated over a decade and a half of research, development and production operation of similar machines at Sandia. Red Storm will have an unusually high bandwidth, low latency interconnect, specially designed hardware and software reliability features, a lightweight kernel compute node operating system and the ability to rapidly switch major sections of the machine between classified and unclassified computing environments. Particular attention has been paid to architectural balance in the design of Red Storm, and it is therefore expected to achieve an atypically high fraction of its peak speed of 41 TeraOPS on real scientific computing applications. In addition, Red Storm is designed to be upgradeable to many times this initial peak capability while still retaining appropriate balance in key design dimensions. Installation of the Red Storm computer system at Sandia's New Mexico site is planned for 2004, and it is expected that the system will be operated for a minimum of five years following installation.
Characterizing deformability and surface friction of cancer cells
Byun, Sangwon; Son, Sungmin; Amodei, Dario; Cermak, Nathan; Shaw, Josephine; Kang, Joon Ho; Hecht, Vivian C.; Winslow, Monte M.; Jacks, Tyler; Mallick, Parag; Manalis, Scott R.
2013-01-01
Metastasis requires the penetration of cancer cells through tight spaces, which is mediated by the physical properties of the cells as well as their interactions with the confined environment. Various microfluidic approaches have been devised to mimic traversal in vitro by measuring the time required for cells to pass through a constriction. Although a cell’s passage time is expected to depend on its deformability, measurements from existing approaches are confounded by a cell's size and its frictional properties with the channel wall. Here, we introduce a device that enables the precise measurement of (i) the size of a single cell, given by its buoyant mass, (ii) the velocity of the cell entering a constricted microchannel (entry velocity), and (iii) the velocity of the cell as it transits through the constriction (transit velocity). Changing the deformability of the cell by perturbing its cytoskeleton primarily alters the entry velocity, whereas changing the surface friction by immobilizing positive charges on the constriction's walls primarily alters the transit velocity, indicating that these parameters can give insight into the factors affecting the passage of each cell. When accounting for cell buoyant mass, we find that cells possessing higher metastatic potential exhibit faster entry velocities than cells with lower metastatic potential. We additionally find that some cell types with higher metastatic potential exhibit greater than expected changes in transit velocities, suggesting that not only the increased deformability but reduced friction may be a factor in enabling invasive cancer cells to efficiently squeeze through tight spaces. PMID:23610435
Theoretical Tools and Software for Modeling, Simulation and Control Design of Rocket Test Facilities
NASA Technical Reports Server (NTRS)
Richter, Hanz
2004-01-01
A rocket test stand and associated subsystems are complex devices whose operation requires that certain preparatory calculations be carried out before a test. In addition, real-time control calculations must be performed during the test, and further calculations are carried out after a test is completed. The latter may be required in order to evaluate if a particular test conformed to specifications. These calculations are used to set valve positions, pressure setpoints, control gains and other operating parameters so that a desired system behavior is obtained and the test can be successfully carried out. Currently, calculations are made in an ad-hoc fashion and involve trial-and-error procedures that may involve activating the system with the sole purpose of finding the correct parameter settings. The goals of this project are to develop mathematical models, control methodologies and associated simulation environments to provide a systematic and comprehensive prediction and real-time control capability. The models and controller designs are expected to be useful in two respects: 1) As a design tool, a model is the only way to determine the effects of design choices without building a prototype, which is, in the context of rocket test stands, impracticable; 2) As a prediction and tuning tool, a good model allows system parameters to be set off-line, so that the expected system response conforms to specifications. This includes the setting of physical parameters, such as valve positions, and the configuration and tuning of any feedback controllers in the loop.
A Guide to Computed Tomography System Specifications
1990-08-01
particularly where anomalies are not known or expected, where nonimaging measurements of deviations from a norm defy experience or expectation, or...point. 2.9 Archival Requirements Archival requirements usually involve hardcopy, tape, and/or optical disk. These dictate a small subsystem choice, but...some kind of scintillating X-ray crystal, e.g., cadmium tungstate or bismuth germanate that is optically coupled to a photoconversion device like a
Flight test experience using advanced airborne equipment in a time-based metered traffic environment
NASA Technical Reports Server (NTRS)
Morello, S. A.
1980-01-01
A series of test flights have demonstrated that time-based metering guidance and control was acceptable to pilots and air traffic controllers. The descent algorithm of the technique, with good representation of aircraft performance and wind modeling, yielded arrival time accuracy within 12 sec. It is expected that this will represent significant fuel savings (1) through a reduction of the time error dispersions at the metering fix for the entire fleet, and (2) for individual aircraft as well, through the presentation of guidance for a fuel-efficient descent. Air traffic controller workloads were also reduced, in keeping with the reduction of required communications resulting from the transfer of navigation responsibilities to pilots. A second series of test flights demonstrated that an existing flight management system could be modified to operate in the new mode.
On modeling animal movements using Brownian motion with measurement error.
Pozdnyakov, Vladimir; Meyer, Thomas; Wang, Yu-Bo; Yan, Jun
2014-02-01
Modeling animal movements with Brownian motion (or more generally by a Gaussian process) has a long tradition in ecological studies. The recent Brownian bridge movement model (BBMM), which incorporates measurement errors, has been quickly adopted by ecologists because of its simplicity and tractability. We discuss some nontrivial properties of the discrete-time stochastic process that results from observing a Brownian motion with added normal noise at discrete times. In particular, we demonstrate that the observed sequence of random variables is not Markov. Consequently the expected occupation time between two successively observed locations does not depend on just those two observations; the whole path must be taken into account. Nonetheless, the exact likelihood function of the observed time series remains tractable; it requires only sparse matrix computations. The likelihood-based estimation procedure is described in detail and compared to the BBMM estimation.
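The tractable likelihood mentioned above follows because a Brownian motion observed with independent normal errors is jointly Gaussian. A minimal dense-covariance sketch is shown below; the paper's sparse-matrix formulation is more efficient, and the parameter names and example values here are illustrative assumptions.

```python
# Hedged sketch: exact log-likelihood of a 1-D Brownian motion observed with
# i.i.d. normal measurement error at irregular times. Uses the dense Gaussian
# covariance; the paper's approach exploits sparsity instead.
import numpy as np
from scipy.stats import multivariate_normal

def bm_noise_loglik(times, obs, sigma2_bm, sigma2_err, start=0.0):
    t = np.asarray(times, dtype=float)
    # Cov[B(t_i)+e_i, B(t_j)+e_j] = sigma2_bm * min(t_i, t_j) + sigma2_err * 1{i=j}
    cov = sigma2_bm * np.minimum.outer(t, t) + sigma2_err * np.eye(t.size)
    return multivariate_normal(mean=np.full(t.size, start), cov=cov).logpdf(obs)

times = [0.5, 1.0, 2.5, 4.0]     # observation times (hypothetical)
obs = [0.3, 0.1, -0.8, -0.5]     # observed locations (hypothetical)
print(bm_noise_loglik(times, obs, sigma2_bm=1.0, sigma2_err=0.2))
```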
Optimizing Utilization of Detectors
2016-03-01
provide a quantifiable process to determine how much time should be allocated to each task sharing the same asset. This optimized expected time... allocation is calculated by numerical analysis and Monte Carlo simulation. Numerical analysis determines the expectation by involving an integral and...determines the optimum time allocation of the asset by repeatedly running experiments to approximate the expectation of the random variables. This
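Since the excerpt above is only a fragment, the following is a purely hypothetical sketch of the kind of calculation it describes: estimating, by Monte Carlo simulation, the expected number of detections as a function of how a fixed block of sensor time is split between two tasks, and picking the split with the largest estimate. The detection model (one target per task with exponential detection times) and all rates are assumptions made for illustration only.

```python
# Hedged sketch: Monte Carlo estimate of expected detections versus the split
# of a fixed time budget between two tasks. Model and rates are assumptions.
import numpy as np

rng = np.random.default_rng(42)
BUDGET = 8.0            # hours of sensor time available (placeholder)
RATES = (0.6, 1.1)      # hypothetical detection rates (per hour) for tasks A and B

def expected_detections(frac_a, n_sims=20_000):
    """One target per task, detected with probability 1 - exp(-rate * time_on_task)."""
    t_a, t_b = frac_a * BUDGET, (1.0 - frac_a) * BUDGET
    hit_a = rng.random(n_sims) < 1.0 - np.exp(-RATES[0] * t_a)
    hit_b = rng.random(n_sims) < 1.0 - np.exp(-RATES[1] * t_b)
    return (hit_a.astype(float) + hit_b.astype(float)).mean()

fracs = np.linspace(0.0, 1.0, 21)
best = fracs[int(np.argmax([expected_detections(f) for f in fracs]))]
print(f"best estimated fraction of the budget for task A: {best:.2f}")
```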
Sheehan, Michael T.; Doi, Suhail A.R.
2016-01-01
Graves’ disease is the most common cause of hyperthyroidism and is often managed with radioactive iodine (RAI) therapy. With current dosing schemes, the vast majority of patients develop permanent post-RAI hypothyroidism and are placed on life-long levothyroxine therapy. This hypothyroidism typically occurs within the first 3 to 6 months after RAI therapy is administered. Indeed, patients are typically told to expect life-long thyroid hormone replacement therapy to be required within this timeframe and many providers expect this post-RAI hypothyroidism to be complete and permanent. There is, however, a small subset of patients in whom a transient post-RAI hypothyroidism develops which, initially, presents exactly as the typical permanent hypothyroidism. In some cases the transient hypothyroidism leads to a period of euthyroidism of variable duration eventually progressing to permanent hypothyroidism. In others, persistent hyperthyroidism requires a second dose of RAI. Failure to appreciate and recognize the possibility of transient post-RAI hypothyroidism can delay optimal and appropriate treatment of the patient. We herein describe five cases of transient post-RAI hypothyroidism which highlight this unusual sequence of events. Increased awareness of this possible outcome after RAI for Graves’ disease will help in the timely management of patients. PMID:26864507
NASA Astrophysics Data System (ADS)
Wibowo, Y. T.; Baskoro, S. Y.; Manurung, V. A. T.
2018-02-01
Plastic based products have spread all over the world into many aspects of life. Their ability to substitute for other materials is getting stronger and wider. The use of plastic materials is increasing and has become unavoidable. Plastic based mass production requires an injection process and, consequently, a mold. The milling of plastic mold steel was done using an HSS end mill cutting tool, which is widely used in small and medium enterprises because it can be re-sharpened and is relatively inexpensive. Studies of the effect of tool geometry state that it has an important effect on quality improvement. Cutting speed, feed rate, depth of cut and radii are input parameters, in addition to the tool path strategy. This paper aims to investigate input parameters and cutting tool behavior within several different tool path strategies. For efficiency of experiments, the Taguchi method and ANOVA were used. The responses studied are surface roughness and cutting behavior. By achieving the expected quality, no additional process is required. Finally, the optimal combination of machining parameters delivers the expected roughness and a substantially reduced cutting time. In practice, however, SMEs do not make optimal use of these data for cost reduction.
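As a sketch of the Taguchi analysis step, the code below computes smaller-the-better signal-to-noise ratios for surface roughness from a hypothetical L9-style experiment and averages them by factor level. The factor levels and roughness values are invented for illustration and are not the paper's data.

```python
# Hedged sketch: Taguchi smaller-the-better S/N analysis of surface roughness.
# Factor levels and Ra values are hypothetical, not the study's measurements.
import numpy as np
import pandas as pd

runs = pd.DataFrame({
    "cutting_speed": [1, 1, 1, 2, 2, 2, 3, 3, 3],   # coded levels
    "feed_rate":     [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "depth_of_cut":  [1, 2, 3, 2, 3, 1, 3, 1, 2],
    "Ra_um":         [0.82, 1.10, 1.45, 0.74, 1.02, 0.95, 0.88, 0.70, 1.20],
})

# Smaller-the-better: S/N = -10 * log10(mean(y^2)); one replicate per run here.
runs["sn"] = -10.0 * np.log10(runs["Ra_um"] ** 2)

for factor in ("cutting_speed", "feed_rate", "depth_of_cut"):
    print(factor, runs.groupby(factor)["sn"].mean().round(2).to_dict())
# The level with the highest mean S/N for each factor is the nominal optimum.
```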
Detection performance in clutter with variable resolution
NASA Astrophysics Data System (ADS)
Schmieder, D. E.; Weathersby, M. R.
1983-07-01
Experiments were conducted to determine the influence of background clutter on target detection criteria. The experiment consisted of placing observers in front of displayed images on a TV monitor. Observer ability to detect military targets embedded in simulated natural and manmade background clutter was measured when there was unlimited viewing time. Results were described in terms of detection probability versus target resolution for various signal to clutter ratios (SCR). The experiments were preceded by a search for a meaningful clutter definition. The selected definition was a statistical measure computed by averaging the standard deviation of contiguous scene cells over the whole scene. The cell size was comparable to the target size. Observer test results confirmed the expectation that the resolution required for a given detection probability was a continuum function of the clutter level. At the lower SCRs the resolution required for a high probability of detection was near 6 line pairs per target (LP/TGT), while at the higher SCRs it was found that a resolution of less than 0.25 LP/TGT would yield a high probability of detection. These results are expected to aid in target acquisition performance modeling and to lead to improved specifications for imaging automatic target screeners.
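A minimal sketch of the clutter definition used above (the average of the standard deviations of contiguous scene cells roughly the size of the target) and of the resulting signal-to-clutter ratio is given below; the scene contents, cell size, and target contrast are placeholders.

```python
# Hedged sketch: scene clutter as the mean standard deviation of contiguous
# square cells of roughly target size, and the signal-to-clutter ratio (SCR).
import numpy as np

def clutter(scene, cell):
    """Average the per-cell standard deviation over the whole scene."""
    h, w = scene.shape
    stds = [scene[r:r + cell, c:c + cell].std()
            for r in range(0, h - cell + 1, cell)
            for c in range(0, w - cell + 1, cell)]
    return float(np.mean(stds))

rng = np.random.default_rng(7)
scene = rng.normal(100.0, 8.0, size=(128, 128))   # placeholder background image
target_contrast = 20.0                            # placeholder target-to-background difference
scr = target_contrast / clutter(scene, cell=16)   # cell size ~ target size
print(f"SCR = {scr:.1f}")
```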
An Experiment Quantifying The Effect Of Clutter On Target Detection
NASA Astrophysics Data System (ADS)
Weathersby, Marshall R.; Schmieder, David E.
1985-01-01
Experiments were conducted to determine the influence of background clutter on target detection criteria. The experiment consisted of placing observers in front of displayed images on a TV monitor. Observer ability to detect military targets embedded in simulated natural and manmade background clutter was measured when there was unlimited viewing time. Results were described in terms of detection probability versus target resolution for various signal to clutter ratios (SCR). The experiments were preceded by a search for a meaningful clutter definition. The selected definition was a statistical measure computed by averaging the standard deviation of contiguous scene cells over the whole scene. The cell size was comparable to the target size. Observer test results confirmed the expectation that the resolution required for a given detection probability was a continuum function of the clutter level. At the lower SCRs the resolution required for a high probability of detection was near 6 line pairs per target (LP/TGT), while at the higher SCRs it was found that a resolution of less than 0.25 LP/TGT would yield a high probability of detection. These results are expected to aid in target acquisition performance modeling and to lead to improved specifications for imaging automatic target screeners.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thorpe, J. I.; Livas, J.; Maghami, P.
Arm locking is a proposed laser frequency stabilization technique for the Laser Interferometer Space Antenna (LISA), a gravitational-wave observatory sensitive in the milliHertz frequency band. Arm locking takes advantage of the geometric stability of the triangular constellation of three spacecraft that compose LISA to provide a frequency reference with a stability in the LISA measurement band that exceeds that available from a standard reference such as an optical cavity or molecular absorption line. We have implemented a time-domain simulation of a Kalman-filter-based arm-locking system that includes the expected limiting noise sources as well as the effects of imperfect a priori knowledge of the constellation geometry on which the design is based. We use the simulation to study aspects of the system performance that are difficult to capture in a steady-state frequency-domain analysis such as frequency pulling of the master laser due to errors in estimates of heterodyne frequency. We find that our implementation meets requirements on both the noise and dynamic range of the laser frequency with acceptable tolerances and that the design is sufficiently insensitive to errors in the estimated constellation geometry that the required performance can be maintained for the longest continuous measurement intervals expected for the LISA mission.
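As context for the Kalman-filter element of the simulation, here is a generic scalar Kalman filter tracking a slowly drifting quantity (for example, a frequency offset) from noisy measurements. It is a textbook random-walk filter offered only as an illustration of the technique; the process and measurement variances are assumptions and it is not the arm-locking design or its noise model.

```python
# Hedged sketch: a generic scalar Kalman filter tracking a slowly drifting
# quantity from noisy measurements. Not the LISA arm-locking controller.
import numpy as np

rng = np.random.default_rng(3)
n_steps = 500
q = 1e-4        # assumed process (drift) variance per step
r = 1e-1        # assumed measurement variance

truth = np.cumsum(rng.normal(0.0, np.sqrt(q), n_steps))   # random-walk drift
meas = truth + rng.normal(0.0, np.sqrt(r), n_steps)

x, p = 0.0, 1.0                       # state estimate and its variance
estimates = []
for z in meas:
    p += q                            # predict: random-walk process model
    k = p / (p + r)                   # Kalman gain
    x += k * (z - x)                  # update with the new measurement
    p *= (1.0 - k)
    estimates.append(x)

rms_err = np.sqrt(np.mean((np.array(estimates) - truth) ** 2))
print(f"RMS estimation error: {rms_err:.3f} (measurement noise sigma = {np.sqrt(r):.3f})")
```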
Machine vision for digital microfluidics
NASA Astrophysics Data System (ADS)
Shin, Yong-Jun; Lee, Jeong-Bong
2010-01-01
Machine vision is widely used in an industrial environment today. It can perform various tasks, such as inspecting and controlling production processes, that may require humanlike intelligence. The importance of imaging technology for biological research or medical diagnosis is greater than ever. For example, fluorescent reporter imaging enables scientists to study the dynamics of gene networks with high spatial and temporal resolution. Such high-throughput imaging is increasingly demanding the use of machine vision for real-time analysis and control. Digital microfluidics is a relatively new technology with expectations of becoming a true lab-on-a-chip platform. Utilizing digital microfluidics, only small amounts of biological samples are required and the experimental procedures can be automatically controlled. There is a strong need for the development of a digital microfluidics system integrated with machine vision for innovative biological research today. In this paper, we show how machine vision can be applied to digital microfluidics by demonstrating two applications: machine vision-based measurement of the kinetics of biomolecular interactions and machine vision-based droplet motion control. It is expected that digital microfluidics-based machine vision system will add intelligence and automation to high-throughput biological imaging in the future.
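To illustrate the kind of real-time analysis involved, the sketch below locates a dark droplet in a grayscale frame with OpenCV and compares its centroid with a target electrode position. The threshold value, the synthetic frame, and the target coordinates are placeholders, not the authors' implementation.

```python
# Hedged sketch: locate a droplet in a grayscale frame and compute the error
# between its centroid and a target position. Threshold, frame source, and
# target coordinates are illustrative placeholders.
import cv2
import numpy as np

def droplet_centroid(gray_frame, thresh=60):
    _, mask = cv2.threshold(gray_frame, thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    droplet = max(contours, key=cv2.contourArea)   # assume largest blob is the droplet
    m = cv2.moments(droplet)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x, y) centroid in pixels

frame = np.full((240, 320), 200, dtype=np.uint8)       # placeholder bright background
cv2.circle(frame, (150, 120), 20, 30, -1)              # placeholder dark droplet
target = (160.0, 120.0)                                # placeholder electrode centre
c = droplet_centroid(frame)
if c is not None:
    print("position error (px):", (target[0] - c[0], target[1] - c[1]))
```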
NASA Astrophysics Data System (ADS)
Smith, L. A.
2007-12-01
We question the relevance of climate-model based Bayesian (or other) probability statements for decision support and impact assessment on spatial scales less than continental and temporal averages less than seasonal. Scientific assessment of higher resolution space and time scale information is urgently needed, given the commercial availability of "products" at high spatiotemporal resolution, their provision by nationally funded agencies for use both in industry decision making and governmental policy support, and their presentation to the public as matters of fact. Specifically we seek to establish necessary conditions for probability forecasts (projections conditioned on a model structure and a forcing scenario) to be taken seriously as reflecting the probability of future real-world events. We illustrate how risk management can profitably employ imperfect models of complicated chaotic systems, following NASA's study of near-Earth PHOs (Potentially Hazardous Objects). Our climate models will never be perfect; nevertheless the space and time scales on which they provide decision-support-relevant information are expected to improve with the models themselves. Our aim is to establish a set of baselines of internal consistency; these are merely necessary conditions (not sufficient conditions) that physics-based state-of-the-art models are expected to pass if their output is to be judged decision-support relevant. Probabilistic Similarity is proposed as one goal which can be obtained even when our models are not empirically adequate. In short, probabilistic similarity requires that, given inputs similar to today's empirical observations and observational uncertainties, we expect future models to produce similar forecast distributions. Expert opinion on the space and time scales on which we might reasonably expect probabilistic similarity may prove of much greater utility than expert elicitation of uncertainty in parameter values in a model that is not empirically adequate; this may help to explain the reluctance of experts to provide information on "parameter uncertainty." Probability statements about the real world are always conditioned on some information set; they may well be conditioned on "False", making them of little value to a rational decision maker. In other instances, they may be conditioned on physical assumptions not held by any of the modellers whose model output is being cast as a probability distribution. Our models will improve a great deal in the next decades, and our insight into the likely climate fifty years hence will improve: maintaining the credibility of the science and the coherence of science-based decision support, as our models improve, requires a clear statement of our current limitations. What evidence do we have that today's state-of-the-art models provide decision-relevant probability forecasts? What space and time scales do we currently have quantitative, decision-relevant information on for 2050? 2080?
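One possible internal-consistency check in the spirit of probabilistic similarity is sketched below: two forecast ensembles, generated from similar inputs, are binned on a common grid and compared with a distribution distance. The choice of Hellinger distance and the binning scheme are illustrative assumptions, not prescriptions from the abstract.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability vectors (0 = identical)."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

def forecast_similarity(ensemble_a, ensemble_b, bins=20):
    """Histogram two forecast ensembles on a common grid and compare them."""
    lo = min(np.min(ensemble_a), np.min(ensemble_b))
    hi = max(np.max(ensemble_a), np.max(ensemble_b))
    edges = np.linspace(lo, hi, bins + 1)
    pa, _ = np.histogram(ensemble_a, bins=edges)
    pb, _ = np.histogram(ensemble_b, bins=edges)
    return hellinger(pa + 1e-12, pb + 1e-12)  # small floor avoids empty bins

# Illustrative use with two synthetic "model version" ensembles.
rng = np.random.default_rng(0)
print(forecast_similarity(rng.normal(2.0, 0.5, 500), rng.normal(2.1, 0.5, 500)))
```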
NASA Astrophysics Data System (ADS)
Stockert, Sven; Wehr, Matthias; Lohmar, Johannes; Abel, Dirk; Hirt, Gerhard
2017-10-01
In the electrical and medical industries the trend towards further miniaturization of devices is accompanied by the demand for smaller manufacturing tolerances. Such industries use a plenitude of small and narrow cold rolled metal strips with high thickness accuracy. Conventional rolling mills can hardly achieve further improvement of these tolerances. However, a model-based controller in combination with an additional piezoelectric actuator for high dynamic roll adjustment is expected to enable the production of the required metal strips with a thickness tolerance of +/-1 µm. The model-based controller has to be based on a rolling theory which can describe the rolling process very accurately. Additionally, the required computing time has to be low in order to predict the rolling process in real-time. In this work, four rolling theories from literature with different levels of complexity are tested for their suitability for the predictive controller. Rolling theories of von Kármán, Siebel, Bland & Ford and Alexander are implemented in Matlab and afterwards transferred to the real-time computer used for the controller. The prediction accuracy of these theories is validated using rolling trials with different thickness reductions and a comparison to the calculated results. Furthermore, the required computing time on the real-time computer is measured. Adequate prediction accuracy can be achieved with the rolling theories developed by Bland & Ford and Alexander. A comparison of the computing time of those two theories reveals that the computing time of Alexander's theory exceeds the 1 ms cycle set by the 1 kHz sample rate of the real-time computer.
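The real-time constraint above amounts to requiring that one model evaluation fit inside the 1 ms cycle implied by the 1 kHz sample rate. A minimal sketch of such a timing check is shown below; the rolling_model function is an empty placeholder, not any of the four theories.

```python
import time

SAMPLE_RATE_HZ = 1000.0
BUDGET_S = 1.0 / SAMPLE_RATE_HZ   # 1 ms per control cycle

def rolling_model(entry_thickness, exit_thickness):
    """Placeholder for a rolling force/torque calculation."""
    # ... model equations would go here ...
    return 0.0

def worst_case_runtime(n_trials=1000):
    """Measure the slowest single evaluation over repeated calls."""
    worst = 0.0
    for _ in range(n_trials):
        t0 = time.perf_counter()
        rolling_model(2.0, 1.8)
        worst = max(worst, time.perf_counter() - t0)
    return worst

if __name__ == "__main__":
    t = worst_case_runtime()
    print("worst-case runtime %.3f ms, budget %.3f ms, feasible: %s"
          % (t * 1e3, BUDGET_S * 1e3, t < BUDGET_S))
```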
Det Norske Veritas rule philosophy with regard to gas turbines for marine propulsion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, P.
1999-04-01
This paper is mainly based on Det Norske Veritas (DNV) Rules of January 1996, Part 4, Chapter 2, Section 4 -- Gas Turbines, and is intended to at least open the dialogue between the gas turbine industry and DNV. There is a need for systematic design approval, manufacturing inspection, and testing procedures that match the standards of the industry. The role and expectations imposed by owners, the authorities, insurance agencies, etc. need to be understood. These expectations often have technical implications that may go against the normal procedures and practices of the gas turbine industry, and could have cost impacts. The question of DNV acceptance criteria has been asked many times, with respect to gas turbines. DNV relies a great deal on the manufacturer to provide the basis for the design criteria, manufacturing, and testing criteria of the gas turbine. However, DNV adds its knowledge and experience to this, and checks that the documentation presented by the manufacturer is technically acceptable. Generally, a high level of state-of-the-art theoretical documentation is required to support the design of modern gas turbines. A proper understanding of the rule philosophy of DNV could prove to be useful in developing better gas turbine systems, which fulfill the rule requirements, and at the same time save resources such as money and time. It is important for gas turbine manufacturers to understand the intent of the rules since it is the intent that needs to be fulfilled. Further, the rules do have the principle of equivalence, which means that there is full freedom in how one fulfills the intent of the rules, as long as DNV accepts the solution.
Hérivaux, Cécile; Orban, Philippe; Brouyère, Serge
2013-10-15
In Europe, 30% of groundwater bodies are considered to be at risk of not achieving the Water Framework Directive (WFD) 'good status' objective by 2015, and 45% are in doubt of doing so. Diffuse agricultural pollution is one of the main pressures affecting groundwater bodies. To tackle this problem, the WFD requires Member States to design and implement cost-effective programs of measures to achieve the 'good status' objective by 2027 at the latest. Hitherto, action plans have mainly consisted of promoting the adoption of Agri-Environmental Schemes (AES). This raises a number of questions concerning the effectiveness of such schemes for improving groundwater status, and the economic implications of their implementation. We propose a hydro-economic model that combines a hydrogeological model to simulate groundwater quality evolution with agronomic and economic components to assess the expected costs, effectiveness, and benefits of AES implementation. This hydro-economic model can be used to identify cost-effective AES combinations at groundwater-body scale and to show the benefits to be expected from the resulting improvement in groundwater quality. The model is applied here to a rural area encompassing the Hesbaye aquifer, a large chalk aquifer which supplies about 230,000 inhabitants in the city of Liege (Belgium) and is severely contaminated by agricultural nitrates. We show that the time frame within which improvements in the Hesbaye groundwater quality can be expected may be much longer than that required by the WFD. Current WFD programs based on AES may be inappropriate for achieving the 'good status' objective in the most productive agricultural areas, in particular because these schemes are insufficiently attractive. Achieving 'good status' by 2027 would demand a substantial change in the design of AES, involving costs that may not be offset by benefits in the case of chalk aquifers with long renewal times. Copyright © 2013 Elsevier Ltd. All rights reserved.
Proxy-equation paradigm: A strategy for massively parallel asynchronous computations
NASA Astrophysics Data System (ADS)
Mittal, Ankita; Girimaji, Sharath
2017-09-01
Massively parallel simulations of transport equation systems call for a paradigm change in algorithm development to achieve efficient scalability. Traditional approaches require time synchronization of processing elements (PEs), which severely restricts scalability. Relaxing the synchronization requirement introduces error and slows down convergence. In this paper, we propose and develop a novel "proxy equation" concept for a general transport equation that (i) tolerates asynchrony with minimal added error, (ii) preserves convergence order, and thus (iii) is expected to scale efficiently on massively parallel machines. The central idea is to modify a priori the transport equation at the PE boundaries to offset asynchrony errors. Proof-of-concept computations are performed using a one-dimensional advection (convection) diffusion equation. The results demonstrate the promise and advantages of the present strategy.
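For orientation, the proof-of-concept equation mentioned above can be advanced with a simple explicit scheme (upwind advection plus central diffusion). The sketch below shows only the synchronous serial baseline, not the proxy-equation modification at PE boundaries, and all parameter values are illustrative.

```python
import numpy as np

def advect_diffuse(u, c=1.0, nu=0.01, dx=0.01, dt=1e-4, steps=1000):
    """Explicit time stepping of du/dt + c du/dx = nu d2u/dx2 on a periodic domain
    (first-order upwind advection, second-order central diffusion)."""
    assert c * dt / dx <= 1.0 and nu * dt / dx**2 <= 0.5, "stability limits violated"
    for _ in range(steps):
        up = np.roll(u, 1)    # u[i-1]
        un = np.roll(u, -1)   # u[i+1]
        u = u - c * dt / dx * (u - up) + nu * dt / dx**2 * (un - 2.0 * u + up)
    return u

x = np.linspace(0.0, 1.0, 100, endpoint=False)
u0 = np.exp(-100.0 * (x - 0.5) ** 2)      # initial Gaussian pulse
u_final = advect_diffuse(u0.copy())
print(u_final.max())                      # pulse advects and diffuses
```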
Repeat-until-success cubic phase gate for universal continuous-variable quantum computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, Kevin; Pooser, Raphael; Siopsis, George
2015-03-24
We report that to achieve universal quantum computation using continuous variables, one needs to jump out of the set of Gaussian operations and have a non-Gaussian element, such as the cubic phase gate. However, such a gate is currently very difficult to implement in practice. Here we introduce an experimentally viable “repeat-until-success” approach to generating the cubic phase gate, which is achieved using sequential photon subtractions and Gaussian operations. Ultimately, we find that our scheme offers benefits in terms of the expected time until success, as well as the fact that we do not require any complex off-line resource state, although we require a primitive quantum memory.
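Repeat-until-success schemes have a simple expected-cost structure: if one attempt succeeds independently with probability p, the number of attempts is geometric with mean 1/p. A small sketch, where the success probability is a made-up placeholder rather than the value from the paper:

```python
import random

def expected_attempts(p_success):
    """Mean number of independent attempts until the first success (geometric)."""
    return 1.0 / p_success

def simulate_attempts(p_success, trials=100000, rng=random.Random(0)):
    """Monte Carlo estimate of the mean number of attempts until success."""
    total = 0
    for _ in range(trials):
        n = 1
        while rng.random() >= p_success:
            n += 1
        total += n
    return total / trials

p = 0.25  # illustrative single-shot success probability
print(expected_attempts(p), simulate_attempts(p))  # both near 4
```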
TOPEX/POSEIDON orbit maintenance maneuver design
NASA Technical Reports Server (NTRS)
Bhat, R. S.; Frauenholz, R. B.; Cannell, Patrick E.
1990-01-01
The Ocean Topography Experiment (TOPEX/POSEIDON) mission orbit requirements are outlined, as well as its control and maneuver spacing requirements including longitude and time targeting. A ground-track prediction model dealing with geopotential, luni-solar gravity, and atmospheric-drag perturbations is considered. Targeting with all modeled perturbations is discussed, and such ground-track prediction errors as initial semimajor axis, orbit-determination, maneuver-execution, and atmospheric-density modeling errors are assessed. A longitude targeting strategy for two extreme situations is investigated employing all modeled perturbations and prediction errors. It is concluded that atmospheric-drag modeling errors are the prevailing ground-track prediction error source early in the mission during high solar flux, and that low solar-flux levels expected late in the experiment stipulate smaller maneuver magnitudes.
[Optimization of the pseudorandom input signals used for the forced oscillation technique].
Liu, Xiaoli; Zhang, Nan; Liang, Hong; Zhang, Zhengbo; Li, Deyu; Wang, Weidong
2017-10-01
The forced oscillation technique (FOT) is an active pulmonary function measurement technique that was applied to identify the mechanical properties of the respiratory system using external excitation signals. FOT commonly includes single frequency sine, pseudorandom and periodic impulse excitation signals. Aiming at preventing the time-domain amplitude overshoot that might exist in the acquisition of combined multi-sinusoidal pseudorandom signals, this paper studied the phase optimization of pseudorandom signals. We tried two methods, random phase combination and the time-frequency domain swapping algorithm, to solve this problem, and used the crest factor to estimate the effect of optimization. Furthermore, in order to make the pseudorandom signals meet the requirement of respiratory system identification in 4-40 Hz, we compensated the input signals' amplitudes at the low frequency band (4-18 Hz) according to the frequency-response curve of the oscillation unit. Results showed that the time-frequency domain swapping algorithm could effectively optimize the phase combination of pseudorandom signals. Moreover, when the amplitudes at low frequencies were compensated, the expected stimulus signals which met the performance requirements were obtained eventually.
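A minimal sketch of a time-frequency domain swapping (clipping) iteration for multisine crest-factor reduction, as commonly described in the system-identification literature, is given below; the frequency grid, flat amplitude spectrum, clipping fraction, and iteration count are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def crest_factor(x):
    """Peak amplitude divided by RMS value of the signal."""
    return np.max(np.abs(x)) / np.sqrt(np.mean(x ** 2))

def optimize_multisine(n=4096, excited_bins=range(4, 41), iters=200, clip=0.85, seed=0):
    """Time-frequency domain swapping: clip the peaks in the time domain, then
    restore the target magnitude spectrum (keeping the new phases) in the
    frequency domain, and repeat to lower the crest factor."""
    rng = np.random.default_rng(seed)
    half = n // 2 + 1
    target_mag = np.zeros(half)
    target_mag[list(excited_bins)] = 1.0              # flat excitation amplitudes
    spec = target_mag * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, half))
    for _ in range(iters):
        x = np.fft.irfft(spec, n)                     # frequency -> time
        limit = clip * np.max(np.abs(x))
        x = np.clip(x, -limit, limit)                 # soft-limit the peaks
        spec = target_mag * np.exp(1j * np.angle(np.fft.rfft(x)))  # time -> frequency
    x = np.fft.irfft(spec, n)
    return x, crest_factor(x)

signal, cf = optimize_multisine()
print("crest factor after optimization: %.2f" % cf)
```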
Psychology Students' Expectations Regarding Educational Requirements and Salary for Desired Careers
ERIC Educational Resources Information Center
Strapp, Chehalis M.; Drapela, Danica J.; Henderson, Cierra I.; Nasciemento, Emily; Roscoe, Lauren J.
2018-01-01
This study investigated the accuracy of psychology majors' expectations regarding careers. Psychology majors, including 101 women and 35 men (mean age = 23 years, SD = 6.25), indicated a desired career and estimated the level of education needed and the expected annual salary for the career. Students'…
Custom large scale integrated circuits for spaceborne SAR processors
NASA Technical Reports Server (NTRS)
Tyree, V. C.
1978-01-01
The application of modern LSI technology to the development of a time-domain azimuth correlator for SAR processing is discussed. General design requirements for azimuth correlators for missions such as SEASAT-A, Venus orbital imaging radar (VOIR), and shuttle imaging radar (SIR) are summarized. Several azimuth correlator architectures that are suitable for implementation using custom LSI devices are described. Technical factors pertaining to selection of appropriate LSI technologies are discussed, and the maturity of alternative technologies for spacecraft applications is reported in the context of expected space mission launch dates. The preliminary design of a custom LSI time-domain azimuth correlator device (ACD) being developed for use in future SAR processors is detailed.
A Methodology for Successful MIS Projects
Jacobs, Patt
1988-01-01
St. Vincent Hospital and Medical Center (SVHMC) was one of the Pacific Northwest's first hospitals to install a Medical Information System (MIS). In mid-1985 the hospital was confronted with the fact that vendor support of its MIS would be withdrawn in the near future. In 21 months a complete, fully operational MIS had to be selected, installed, and implemented. MIS projects are large, complicated sets of interrelated tasks organized to achieve a specific goal for an organization. Project management requires a sound methodology to deliver the proper and necessary functions on time, within budget, and in a way that meets user expectations. This paper focuses on the process used to deliver the SIMON System on time and 15% under budget.
Sanford, Joseph A; Kadry, Bassam; Oakes, Daryl; Macario, Alex; Schmiesing, Cliff
2016-04-15
Although transesophageal echocardiography is routinely performed at our institution, there is no easy way to document the procedure in the electronic medical record and generate a bill compliant with reimbursement requirements. We present the results of a quality improvement project that used agile development methodology to incorporate intraoperative transesophageal echocardiography into the electronic medical record. We discuss improvements in the quality of clinical documentation, technical workflow challenges overcome, and cost and time to return on investment. Billing was increased from an average of 36% to 84.6% when compared with the same time period in the previous year. The expected recoupment of investment for this project is just 18 weeks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voyles, Jimmy
Individual datastreams from instrumentation at the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility fixed and mobile research observatories (sites) are collected and routed to the ARM Data Center (ADC). The Data Management Facility (DMF), a component of the ADC, executes datastream processing in near-real time. Processed data are then delivered approximately daily to the ARM Data Archive, also a component of the ADC, where they are made freely available to the research community. For each instrument, ARM calculates the ratio of the actual number of processed data records received daily at the ARM Data Archive to the expected number of data records. DOE requires national user facilities to report time-based operating data.
NASA Technical Reports Server (NTRS)
Flock, W. L.
1981-01-01
When high precision is required for range measurement on Earth space paths, it is necessary to correct as accurately as possible for excess range delays due to the dry air, water vapor, and liquid water content of the atmosphere. Calculations based on representative values of atmospheric parameters are useful for illustrating the order of magnitude of the expected delays. Range delay, time delay, and phase delay are simply and directly related. Doppler frequency variations or noise are proportional to the time rate of change of excess range delay. Tropospheric effects were examined as part of an overall consideration of the capability of precision two way ranging and Doppler systems.
Climate change impact on growing degree day accumulation values
NASA Astrophysics Data System (ADS)
Bekere, Liga; Sile, Tija; Bethers, Uldis; Sennikovs, Juris
2015-04-01
A well-known and often used method to assess and forecast the plant growth cycle is the growing degree day (GDD) method, with different formulas used for accumulation calculations. With this method the only factor that affects plant development is temperature, so with climate change, and therefore a change in temperature, the typical times of plant blooming or harvest can be expected to change. The goal of this study is to assess this change in the Northern Europe region. As an example, strawberry bloom and harvest times are used. The first part of this study was to define the current GDD amounts required for strawberry bloom and harvest. This was done using temperature data from the Danish Meteorological Institute's (DMI) NWP model HIRLAM for the years 2010-2012 and general strawberry growth observations in Latvia. This way we acquired an example amount of GDD required for strawberry blooming and harvest. To assess change in the plant growth cycle we used regional climate models (RCM) - Euro-CORDEX. RCM temperature data for both past and future periods was analyzed and bias correction was carried out. Then the GDD calculation methodology was applied to the corrected temperature data and results showing the change in the strawberry growth cycle - bloom and harvest times - in Northern Europe were visualized.
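A minimal sketch of the standard GDD accumulation (daily mean temperature above a base temperature, accumulated until a crop-specific threshold is reached) is shown below; the base temperature and bloom threshold are placeholders, not the values derived in the study.

```python
def daily_gdd(t_max, t_min, t_base=5.0):
    """Growing degree days contributed by one day (simple averaging method)."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

def day_threshold_reached(t_max_series, t_min_series, threshold):
    """Return the (1-based) day on which accumulated GDD first reaches the
    threshold, or None if it never does."""
    total = 0.0
    for day, (tx, tn) in enumerate(zip(t_max_series, t_min_series), start=1):
        total += daily_gdd(tx, tn)
        if total >= threshold:
            return day
    return None

# Illustrative use: a warmer scenario reaches the same bloom threshold earlier.
baseline = day_threshold_reached([15] * 120, [5] * 120, threshold=300.0)  # day 60
warmer   = day_threshold_reached([17] * 120, [7] * 120, threshold=300.0)  # day 43
print(baseline, warmer)
```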
Pressurization and expulsion of a flightweight liquid hydrogen tank
NASA Technical Reports Server (NTRS)
Vandresar, N. T.; Stochl, R. J.
1993-01-01
Experimental results are presented for pressurization and expulsion of a flight-weight 4.89 cu m liquid hydrogen storage tank under normal gravity conditions. Pressurization and expulsion times are parametrically varied to study the effects of longer transfer times expected in future space flight applications. It is found that the increase in pressurant consumption with increased operational time is significant at shorter pressurization or expulsion durations and diminishes as the duration lengthens. Gas-to-wall heat transfer in the ullage is the dominant mode of energy exchange, with more than 50 percent of the pressurant energy being lost to tank wall heating in expulsions and the long duration pressurizations. Advanced data analysis will require a multidimensional approach combined with improved measurement capabilities of liquid-vapor interfacial transport phenomena.
Return to normal activities and work after living donor laparoscopic nephrectomy.
Larson, Dawn B; Jacobs, Cheryl; Berglund, Danielle; Wiseman, Jennifer; Garvey, Catherine; Gillingham, Kristen; Ibrahim, Hassan N; Matas, Arthur J
2017-01-01
Transplant programs inform potential donors that they should be able to return to normal activities within ~2 weeks and to work by 6 weeks after laparoscopic nephrectomy. We studied actual recovery times. Between 10/2004 and 9/2014, 911 donors having laparoscopic nephrectomy were surveyed 6 months post-donation. Surveys asked questions specific to their recovery experience, including time to return to normal activities and work and a description of their recovery time relative to pre-donation expectations. Of the 911, 646 (71%) responded: mean age at donation was 43.5±10.6 years; 65% were female, 95% were white, 51% were biologically related to their recipient, and 83% reported education beyond high school. Of the 646 respondents, a total of 35% returned to normal activities by 2 weeks post-donation; 79% by 4 weeks post-donation; 94% by 5-6 weeks; however, 6% took >6 weeks. Of the 646, 551 (85%) were working for pay; of these, mean time to return to work was 5.3±2.8 weeks; median, 5 weeks. Of the 551, a total of 14% returned to work in 1-2 weeks, 46% by 3-4 weeks, and 76% by 5-6 weeks. Importantly, 24% required >6 weeks before returning to work, with the highest rates for donors in manual labor or a skilled trade. Significantly longer return to work was reported by females (vs males; P=.01), those without (vs those with) post-high school education (P=.01), those with longer hospital stay (P=.01), and those with a postoperative complication (P=.02). Of respondents, 37% described their recovery time as longer than expected. During the donor informed consent process, additional emphasis on realistic expectations around recovery to baseline activities and return to work is warranted. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Diamanti, Vassiliki; Mouzaki, Angeliki; Ralli, Asimina; Antoniou, Faye; Papaioannou, Sofia; Protopapas, Athanassios
2017-01-01
Different language skills are considered fundamental for successful reading and spelling acquisition. Extensive evidence has highlighted the central role of phonological awareness in early literacy experiences. However, many orthographic systems also require the contribution of morphological awareness. The goal of this study was to examine the morphological and phonological awareness skills of preschool children as longitudinal predictors of reading and spelling ability by the end of first grade, controlling for the effects of receptive and expressive vocabulary skills. At Time 1 preschool children from kindergartens in the Greek regions of Attika, Crete, Macedonia, and Thessaly were assessed on tasks tapping receptive and expressive vocabulary, phonological awareness (syllable and phoneme), and morphological awareness (inflectional and derivational). Tasks were administered through an Android application for mobile devices (tablets) featuring automatic application of ceiling rules. At Time 2 one year later the same children attending first grade were assessed on measures of word and pseudoword reading, text reading fluency, text reading comprehension, and spelling. Complete data from 104 children are available. Hierarchical linear regression and commonality analyses were conducted for each outcome variable. Reading accuracy for both words and pseudowords was predicted not only by phonological awareness, as expected, but also by morphological awareness, suggesting that understanding the functional role of word parts supports the developing phonology-orthography mappings. However, only phonological awareness predicted text reading fluency at this age. Longitudinal prediction of reading comprehension by both receptive vocabulary and morphological awareness was already evident at this age, as expected. Finally, spelling was predicted by preschool phonological awareness, as expected, as well as by morphological awareness, the contribution of which is expected to increase due to the spelling demands of Greek inflectional and derivational suffixes introduced at later grades.
Synchronization of spontaneous eyeblinks while viewing video stories
Nakano, Tamami; Yamamoto, Yoshiharu; Kitajo, Keiichi; Takahashi, Toshimitsu; Kitazawa, Shigeru
2009-01-01
Blinks are generally suppressed during a task that requires visual attention and tend to occur immediately before or after the task when the timing of its onset and offset are explicitly given. During the viewing of video stories, blinks are expected to occur at explicit breaks such as scene changes. However, given that the scene length is unpredictable, there should also be appropriate timing for blinking within a scene to prevent temporal loss of critical visual information. Here, we show that spontaneous blinks were highly synchronized between and within subjects when they viewed the same short video stories, but were not explicitly tied to the scene breaks. Synchronized blinks occurred during scenes that required less attention such as at the conclusion of an action, during the absence of the main character, during a long shot and during repeated presentations of a similar scene. In contrast, blink synchronization was not observed when subjects viewed a background video or when they listened to a story read aloud. The results suggest that humans share a mechanism for controlling the timing of blinks that searches for an implicit timing that is appropriate to minimize the chance of losing critical information while viewing a stream of visual events. PMID:19640888
Modelling approaches: the case of schizophrenia.
Heeg, Bart M S; Damen, Joep; Buskens, Erik; Caleo, Sue; de Charro, Frank; van Hout, Ben A
2008-01-01
Schizophrenia is a chronic disease characterized by periods of relative stability interrupted by acute episodes (or relapses). The course of the disease may vary considerably between patients. Patient histories show considerable inter- and even intra-individual variability. We provide a critical assessment of the advantages and disadvantages of three modelling techniques that have been used in schizophrenia: decision trees, (cohort and micro-simulation) Markov models and discrete event simulation models. These modelling techniques are compared in terms of building time, data requirements, medico-scientific experience, simulation time, clinical representation, and their ability to deal with patient heterogeneity, the timing of events, prior events, patient interaction, interaction between co-variates and variability (first-order uncertainty). We note that, depending on the research question, the optimal modelling approach should be selected based on the expected differences between the comparators, the number of co-variates, the number of patient subgroups, the interactions between co-variates, and simulation time. Finally, it is argued that in case micro-simulation is required for the cost-effectiveness analysis of schizophrenia treatments, a discrete event simulation model is best suited to accurately capture all of the relevant interdependencies in this chronic, highly heterogeneous disease with limited long-term follow-up data.
Vignac, Élie; Lebihain, Pascal; Soulé, Bastien
2017-09-01
In France, to prevent drowning accidents in public swimming pools (PSPs), bathing must be constantly supervised by qualified staff. However, fatal drowning regularly occurs in supervised aquatic facilities. A review of the literature shows that human supervision is a complex task. The aim of this research is to fully assess the periods during which supervision is not carried out, or carried out in an inadequate manner. The observations made in 108 French PSPs show that supervision is not carried out 18% of the time and that it is carried out inadequately 33% of the time. The medical literature shows that, in order to expect to survive without after-effects, an immersed victim requires intervention within a time limit of not more than three minutes; however, we noted, over a total observation time of 54 hours, 147 periods (29.8%) during which the supervision system was degraded for three minutes or more. This quantification research on the periods of degraded supervision is complemented by an identification of the causes leading to these degradations, from which we can draw interesting areas for improvement, particularly from an organizational point of view, in order to improve safety management in French PSPs.
Frontend electronics for high-precision single photo-electron timing using FPGA-TDCs
NASA Astrophysics Data System (ADS)
Cardinali, M.; Dzyhgadlo, R.; Gerhardt, A.; Götzen, K.; Hohler, R.; Kalicy, G.; Kumawat, H.; Lehmann, D.; Lewandowski, B.; Patsyuk, M.; Peters, K.; Schepers, G.; Schmitt, L.; Schwarz, C.; Schwiening, J.; Traxler, M.; Ugur, C.; Zühlsdorf, M.; Dodokhov, V. Kh.; Britting, A.; Eyrich, W.; Lehmann, A.; Uhlig, F.; Düren, M.; Föhl, K.; Hayrapetyan, A.; Kröck, B.; Merle, O.; Rieke, J.; Cowie, E.; Keri, T.; Montgomery, R.; Rosner, G.; Achenbach, P.; Corell, O.; Ferretti Bondy, M. I.; Hoek, M.; Lauth, W.; Rosner, C.; Sfienti, C.; Thiel, M.; Bühler, P.; Gruber, L.; Marton, J.; Suzuki, K.
2014-12-01
The next generation of high-luminosity experiments requires excellent particle identification detectors which calls for Imaging Cherenkov counters with fast electronics to cope with the expected hit rates. A Barrel DIRC will be used in the central region of the Target Spectrometer of the planned PANDA experiment at FAIR. A single photo-electron timing resolution of better than 100 ps is required by the Barrel DIRC to disentangle the complicated patterns created on the image plane. R&D studies have been performed to provide a design based on the TRB3 readout using FPGA-TDCs with a precision better than 20 ps RMS and custom frontend electronics with high-bandwidth pre-amplifiers and fast discriminators. The discriminators also provide time-over-threshold information thus enabling walk corrections to improve the timing resolution. Two types of frontend electronics cards optimised for reading out 64-channel PHOTONIS Planacon MCP-PMTs were tested: one based on the NINO ASIC and the other, called PADIWA, on FPGA discriminators. Promising results were obtained in a full characterisation using a fast laser setup and in a test experiment at MAMI, Mainz, with a small scale DIRC prototype.
Development of a 13 kW Hall Thruster Propulsion System Performance Model for AEPS
NASA Technical Reports Server (NTRS)
Stanley, Steven; Allen, May; Goodfellow, Keith; Chew, Gilbert; Rapetti, Ryan; Tofil, Todd; Herman, Dan; Jackson, Jerry; Myers, Roger
2017-01-01
The Advanced Electric Propulsion System (AEPS) program will develop a flight 13kW Hall thruster propulsion system based on NASA's HERMeS thruster. The AEPS system includes the Hall Thruster, the Power Processing Unit (PPU) and the Xenon Flow Controller (XFC). These three primary components must operate together to ensure that the system generates the required combinations of thrust and specific impulse at the required system efficiencies for the desired system lifetime. At the highest level, the AEPS system will be integrated into the spacecraft and will receive power, propellant, and commands from the spacecraft. Power and propellant flow rates will be determined by the throttle set points commanded by the spacecraft. Within the system, the major control loop is between the mass flow rate and thruster current, with time-dependencies required to handle all expected transients, and additional, much slower interactions between the thruster and cathode temperatures, flow controller and PPU. The internal system interactions generally occur on shorter timescales than the spacecraft interactions, though certain failure modes may require rapid responses from the spacecraft. The AEPS system performance model is designed to account for all these interactions in a way that allows evaluation of the sensitivity of the system to expected changes over the planned mission as well as to assess the impacts of normal component and assembly variability during the production phase of the program. This effort describes the plan for the system performance model development, correlation to NASA test data, and how the model will be used to evaluate the critical internal and external interactions. The results will ensure the component requirements do not unnecessarily drive the system cost or overly constrain the development program. Finally, the model will be available to quickly troubleshoot any future unforeseen development challenges.
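As a purely generic illustration of the kind of major control loop described above, the sketch below trims an anode flow command with a PI law to hold discharge current at a throttle set point against a fictitious plant; the gains, limits, set point, and plant response are invented placeholders and do not represent the AEPS design.

```python
def pi_flow_trim(i_set, i_meas, flow_cmd, state, kp=0.2, ki=0.05, dt=0.1,
                 flow_min=5.0, flow_max=30.0):
    """One step of a PI loop that trims the anode flow command (mg/s) to drive
    the measured discharge current (A) toward its throttle set point."""
    error = i_set - i_meas
    state["integral"] += error * dt
    new_cmd = flow_cmd + kp * error + ki * state["integral"]
    return min(max(new_cmd, flow_min), flow_max)   # respect flow-controller limits

# Illustrative closed loop against a toy plant where current scales with flow.
state = {"integral": 0.0}
flow = 10.0
for _ in range(200):
    current = 1.4 * flow            # fictitious plant gain (A per mg/s)
    flow = pi_flow_trim(i_set=20.8, i_meas=current, flow_cmd=flow, state=state)
print(round(flow, 2), round(1.4 * flow, 2))   # converges toward the set point
```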
NASA Technical Reports Server (NTRS)
Lemoine, F. G.; Zelensky, N. P.; Luthcke, S. B.; Rowlands, D. D.; Beckley, B. D.; Klosko, S. M.
2006-01-01
Launched in the summer of 1992, TOPEX/POSEIDON (T/P) was a joint mission between NASA and the Centre National d'Etudes Spatiales (CNES), the French Space Agency, to make precise radar altimeter measurements of the ocean surface. After the remarkably successful 13 years of mapping the ocean surface, T/P lost its ability to maneuver and was de-commissioned January 2006. T/P revolutionized the study of the Earth's oceans by vastly exceeding pre-launch estimates of surface height accuracy recoverable from radar altimeter measurements. The precision orbit lies at the heart of the altimeter measurement, providing the reference frame from which the radar altimeter measurements are made. The expected quality of orbit knowledge had limited the measurement accuracy expectations of past altimeter missions, and still remains a major component in the error budget of all altimeter missions. This paper describes critical improvements made to the T/P orbit time series over the 13 years of precise orbit determination (POD) provided by the GSFC Space Geodesy Laboratory. The POD improvements from the pre-launch T/P expectation of radial orbit accuracy and Mission requirement of 13 cm to an expected accuracy of about 1.5 cm with today's latest orbits will be discussed. The latest orbits with 1.5 cm RMS radial accuracy represent a significant improvement to the 2.0-cm accuracy orbits currently available on the T/P Geophysical Data Record (GDR) altimeter product.
A New Proof of the Expected Frequency Spectrum under the Standard Neutral Model.
Hudson, Richard R
2015-01-01
The sample frequency spectrum is an informative and frequently employed approach for summarizing DNA variation data. Under the standard neutral model the expectation of the sample frequency spectrum has been derived by at least two distinct approaches. One relies on using results from diffusion approximations to the Wright-Fisher Model. The other is based on Pólya urn models that correspond to the standard coalescent model. A new proof of the expected frequency spectrum is presented here. It is a proof by induction and does not require diffusion results and does not require the somewhat complex sums and combinatorics of the derivations based on urn models.
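For reference, the result the abstract refers to is the standard one: under the standard neutral (infinite-sites) model, the expected number of segregating sites at which i of the n sampled sequences carry the derived allele is θ/i. A tiny sketch of that expectation:

```python
def expected_sfs(theta, n):
    """Expected unfolded site frequency spectrum under the standard neutral
    model: E[xi_i] = theta / i for i = 1 .. n-1."""
    return [theta / i for i in range(1, n)]

spectrum = expected_sfs(theta=10.0, n=10)
print(spectrum)        # [10.0, 5.0, 3.33..., 2.5, ...]
print(sum(spectrum))   # expected total number of segregating sites, theta * sum(1/i)
```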
Mapping emotions through time: how affective trajectories inform the language of emotion.
Kirkland, Tabitha; Cunningham, William A
2012-04-01
The words used to describe emotions can provide insight into the basic processes that contribute to emotional experience. We propose that emotions arise partly from interacting evaluations of one's current affective state, previous affective state, predictions for how these may change in the future, and the experienced outcomes following these predictions. These states can be represented and inferred from neural systems that encode shifts in outcomes and make predictions. In two studies, we demonstrate that emotion labels are reliably differentiated from one another using only simple cues about these affective trajectories through time. For example, when a worse-than-expected outcome follows the prediction that something good will happen, that situation is labeled as causing anger, whereas when a worse-than-expected outcome follows the prediction that something bad will happen, that situation is labeled as causing sadness. Emotion categories are more differentiated when participants are required to think categorically than when participants have the option to consider multiple emotions and degrees of emotions. This work indicates that information about affective movement through time and changes in affective trajectory may be a fundamental aspect of emotion categories. Future studies of emotion must account for the dynamic way that we absorb and process information. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
The calorimeter of the Mu2e experiment at Fermilab
Atanov, N.; Baranov, V.; Budagov, J.; ...
2017-01-23
Here, the Mu2e experiment at Fermilab looks for Charged Lepton Flavor Violation (CLFV) improving by 4 orders of magnitude the current experimental sensitivity for the muon to electron conversion in a muonic atom. A positive signal could not be explained in the framework of the current Standard Model of particle interactions and therefore would be a clear indication of new physics. In 3 years of data taking, Mu2e is expected to observe less than one background event mimicking the electron coming from muon conversion. Achieving such a level of background suppression requires a deep knowledge of the experimental apparatus: amore » straw tube tracker, measuring the electron momentum and time, a cosmic ray veto system rejecting most of cosmic ray background and a pure CsI crystal calorimeter, that will measure time of flight, energy and impact position of the converted electron. The calorimeter has to operate in a harsh radiation environment, in a 10 -4 Torr vacuum and inside a 1 T magnetic field. The results of the first qualification tests of the calorimeter components are reported together with the energy and time performances expected from the simulation and measured in beam tests of a small scale prototype.« less
Healthy life expectancy of oral squamous cell carcinoma patients aged 75years and older.
Yamada, Shin-Ichi; Kurita, Hiroshi; Tomioka, Takahiro; Ohta, Ryousuke; Yoshimura, Nobuhiko; Nishimaki, Fumihiro; Koyama, Yoshihito; Kondo, Eiji; Kamata, Takahiro
2017-01-01
Healthy life expectancy, an extension of the concept of life expectancy, is a summary measure of population health that takes into account the mortality and morbidity of a population. The aim of the present study was to retrospectively analyze the self-reliance survival times of oral squamous cell carcinoma (OSCC) patients. One hundred and twelve patients aged 75 years or older with primary OSCC were included and examined at Shinshu University Hospital. To investigate healthy life expectancy, OSCC patients older than 75 years were divided into 3 groups: 75-79, 80-84, and older than 85 years. The Kaplan-Meier method was used to estimate the median times of healthy life expectancy. The Log-rank test was used to test significant differences between actual curves. The median self-reliance survival times of patients aged 75-79, 80-84, and older than 85 years were 5.7, 1.6, and 1.4 years, respectively. Most patients with early stage cancers underwent curative treatments and showed a health expectancy of more than 5 years. In patients with advanced cancers, health expectancy was poor (less than one year), except among patients aged 75-79 years who underwent standard treatments. In elderly patients, healthy life expectancy (self-reliance survival time) may be one of the measures of patient prognosis as well as overall survival times. Copyright © 2016 Elsevier Ltd. All rights reserved.
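A minimal sketch of how median self-reliance survival times could be estimated per age group with the Kaplan-Meier method is shown below, using the lifelines package; the data are invented placeholders, and the median_survival_time_ attribute is assumed from recent lifelines versions.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Placeholder data: self-reliance survival time in years and an event flag
# (1 = loss of self-reliance or death observed, 0 = censored).
df = pd.DataFrame({
    "years": [5.7, 2.1, 0.8, 6.2, 1.4, 3.3],
    "event": [1,   1,   1,   0,   1,   0],
    "group": ["75-79", "80-84", "85+", "75-79", "85+", "80-84"],
})

for name, grp in df.groupby("group"):
    kmf = KaplanMeierFitter()
    kmf.fit(grp["years"], event_observed=grp["event"], label=name)
    print(name, "median self-reliance survival:", kmf.median_survival_time_)
```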
Functional relationship-based alarm processing
Corsberg, D.R.
1987-04-13
A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). 11 figs.
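A minimal sketch of how such relationships could be encoded and used to re-rank currently active alarms is given below; the relationship table, scoring, and alarm names are illustrative assumptions, not the patented method's actual rules.

```python
# Relationship types from the methodology: level precursor, direct precursor,
# required action, and blocking condition.
RELATIONS = {
    ("LEVEL_HI", "LEVEL_HIHI"):   "level_precursor",
    ("PUMP_TRIP", "FLOW_LOW"):    "direct_precursor",
    ("FLOW_LOW", "OPEN_BYPASS"):  "required_action",
}
BLOCKED = {"FLOW_LOW_EXPECTED"}   # alarms normally expected in the current state

def rank_alarms(active):
    """Demote alarms that are precursors of other active alarms or are blocked;
    re-evaluate every time the set of active alarms changes."""
    importance = {a: 1.0 for a in active}
    for (pre, post), rel in RELATIONS.items():
        if pre in active and post in active and rel in ("level_precursor",
                                                        "direct_precursor"):
            importance[pre] *= 0.5          # the consequence alarm dominates
    for a in active:
        if a in BLOCKED:
            importance[a] = 0.1
    return sorted(active, key=lambda a: -importance[a])

print(rank_alarms({"LEVEL_HI", "LEVEL_HIHI", "FLOW_LOW_EXPECTED"}))
```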
The American nuclear construction craftsmen: Will we be ready to build again
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravo, R.
1990-01-01
The present state of nuclear plant maintenance and operations support reflects sexual, ethnic, and racial integration; continued educational advances; some computer literacy; mixed trades in maintenance; detailed training for maintenance and operations work in the operating plant; plant safety awareness and respect; need for a top-quality, take-the-time-to-do-it-right mentality; and planning. With no new nuclear construction, what will be the specific talents, focus, and contributions that the craftsmen can be expected to bring to the project? To be prepared to successfully manage the next generation of nuclear plant construction, the industry must be acutely aware of the needs of the labor pool. To be aware of the needs requires an intimate knowledge of the present state of the craft talent, the changed expectations of their contributions, and the effects of new technologies, materials, methods, and individuals that will be used to design and build.
The BetaCage: Ultrasensitive Screener for Radioactive Backgrounds
NASA Astrophysics Data System (ADS)
Thompson, Michael; BetaCage Collaboration
2017-09-01
Rare event searches, such as dark matter detection and neutrinoless double beta decay, require screening of materials for backgrounds such as beta emission and alpha decaying isotopes. The BetaCage is a proposed ultra-sensitive time-projection chamber to screen for alpha-emitting and low energy beta-emitting (10-200 keV) contaminants. The expected sensitivity is 0.1 beta particles (per keV·m²·day) and 0.1 alpha particles (per m²·day), where the former will be limited by Compton scattering of external photons in the screening samples and the latter is expected to be signal-limited. The prototype BetaCage under commissioning at South Dakota School of Mines & Technology is filled with P10 gas (10% methane, 90% argon) in place of neon and is 40×40×20 cm in size. Details on design, construction and characterization will be presented.
Performance analysis of a generalized upset detection procedure
NASA Technical Reports Server (NTRS)
Blough, Douglas M.; Masson, Gerald M.
1987-01-01
A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a given length data block to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.
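Under the simplifying assumption that each captured block is analyzed independently with a fixed per-block detection probability and false-alarm probability, the three performance measures have obvious closed forms, sketched below; these independent-block approximations are an illustration, not the analysis from the paper.

```python
def detection_within_k_blocks(p_detect, k):
    """Probability that an upset is caught within k analyzed blocks."""
    return 1.0 - (1.0 - p_detect) ** k

def expected_false_alarms(p_false_alarm, n_blocks):
    """Expected number of false alarms over n upset-free blocks."""
    return p_false_alarm * n_blocks

def expected_latency_blocks(p_detect):
    """Expected number of blocks (geometric mean) until an upset is detected."""
    return 1.0 / p_detect

print(detection_within_k_blocks(0.6, 3),    # ~0.936
      expected_false_alarms(0.01, 1000),    # 10 false alarms
      expected_latency_blocks(0.6))         # ~1.67 blocks
```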
NASA Technical Reports Server (NTRS)
Thomson, J. A. L.; Meng, J. C. S.
1975-01-01
A possible measurement program designed to obtain the information requisite to determining the feasibility of airborne and/or satellite-borne LDV (Laser Doppler Velocimeter) systems is discussed. Measurements made from the ground are favored over airborne measurements for the purpose of determining feasibility. The expected signal strengths for scattering at various altitudes and elevation angles are examined; it appears that both molecular absorption and ambient turbulence degrade the signal at low elevation angles and effectively constrain ground-based measurements to elevation angles exceeding a critical value. The nature of the wind shear and turbulence to be expected is treated with a linear hydrodynamic model - a mountain lee wave model. The spatial and temporal correlation distances establish requirements on the range resolution, the maximum detectable range and the allowable integration time.
Li, Su-Ting T; Tancredi, Daniel J; Schwartz, Alan; Guillot, Ann; Burke, Ann E; Trimm, R Franklin; Guralnick, Susan; Mahan, John D; Gifford, Kimberly
2018-04-25
The Accreditation Council for Graduate Medical Education requires semiannual Milestone reporting on all residents. Milestone expectations of performance are unknown. Determine pediatric program director (PD) minimum Milestone expectations for residents prior to being ready to supervise and prior to being ready to graduate. Mixed methods survey of pediatric PDs on their programs' Milestone expectations before residents are ready to supervise and before they are ready to graduate, and in what ways PDs use Milestones to make supervision and graduation decisions. If programs had no established Milestone expectations, PDs indicated expectations they considered for use in their program. Mean minimum Milestone level expectations, adjusted for program size, region, and clustering of Milestone expectations by program, were calculated for readiness to supervise and readiness to graduate. Free-text questions were analyzed using thematic analysis. The response rate was 56.8% (113/199). Most programs had no required minimum Milestone level before residents are ready to supervise (80%; 76/95) or ready to graduate (84%; 80/95). For readiness to supervise, minimum Milestone expectations PDs considered establishing for their program were highest for humanism (2.46, 95% CI: 2.21-2.71) and professionalization (2.37, 2.15-2.60). Minimum Milestone expectations for graduates were highest for help-seeking (3.14, 2.83-3.46). Main themes included the use of Milestones in combination with other information to assess learner performance and that Milestones are not equally weighted when making advancement decisions. Most PDs have not established program minimum Milestones, but would vary such expectations by competency. Copyright © 2018. Published by Elsevier Inc.
Results From The Salt Disposition Project Next Generation Solvent Demonstration Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, T. B.; Fondeur, F. F.; Taylor-Pashow, K. M.L.
2014-04-02
Strip Effluent Hold Tank (SEHT), Decontaminated Salt Solution Hold Tank (DSSHT), Caustic Wash Tank (CWT) and Solvent Hold Tank (SHT) samples were taken throughout the Next Generation Solvent (NGS) Demonstration Plan. These samples were analyzed and the results are reported. SHT: The solvent behaved as expected, with no bulk changes in the composition over time, with the exception of the TOA and TiDG. The TiDG depletion is higher than expected, and consideration must be taken on the required rate of replenishment. Monthly sampling of the SHT is warranted. If possible, additional SHT samples for TiDG analysis (only) would help SRNL refine the TiDG degradation model. CWT: The CWT samples show the expected behavior in terms of bulk chemistry. The 137Cs deposited into the CWT varies somewhat, but generally appears to be lower than during operations with the BOBCalix solvent. While a few minor organic components were noted to be present in the Preliminary sample, at this time these are thought to be artifacts of the sample preparation or may be due to the preceding solvent superwash. DSSHT: The DSSHT samples show the predicted bulk chemistry, although they point towards significant dilution at the front end of the Demonstration. The 137Cs levels in the DSSHT are much lower than during the BOBCalix operations, which is the expected observation. SEHT: The SEHT samples represent the most different output of all four of the outputs from MCU. While the bulk chemistry is as expected, something is causing the pH of the SEHT to be higher than what would be predicted from a pure stream of 0.01 M boric acid. There are several possible different reasons for this, and SRNL is in the process of investigating. Other than the pH issue, the SEHT is as predicted. In summary, the NGS Demonstration Plan samples indicate that the MCU system, with the Blend Solvent, is operating as expected. The only issue of concern regards the pH of the SEHT, and SRNL is in the process of investigating this. SRNL results support the transition to routine operations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dustin, R.
Modernization and renovation of sports facilities challenge the design team to balance a number of requirements: spectator and owner expectations, existing building and site conditions, architectural layouts, code and legislation issues, time constraints and budget issues. System alternatives are evaluated and selected based on the relative priorities of these requirements. These priorities are unique to each project. At Alexander Memorial Coliseum, project schedules, construction funds and facility usage became the priorities. The ACC basketball schedule and arrival of the Centennial Olympics dictated the construction schedule. Initiation and success of the project depended on the commitment of the design team to meet coliseum funding levels established three years ago. Analysis of facility usage and system alternative capabilities drove the design team to select a system that met the project requirements and will maximize the benefits to the owner and spectators for many years to come.
Advanced Turbo-Charging Research and Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2008-02-27
The objective of this project is to conduct analysis, design, procurement and test of a high pressure ratio, wide flow range, and high EGR system with two stages of turbocharging. The system needs to meet the stringent 2010MY emissions regulations at 20%+ better fuel economy than its nearest gasoline competitor while allowing equivalent vehicle launch characteristics and higher torque capability than its nearest gasoline competitor. The system will also need to meet light truck/SUV life requirements, which will require validation or development of components traditionally used only in passenger car applications. The conceived system is termed 'series-sequential turbocharger' because the turbocharger system operates in series at appropriate times and also sequentially when required. This is accomplished using intelligent design and control of flow passages and valves. Components of the series-sequential system will also be applicable to parallel-sequential systems which are also expected to be in use for future light truck/SUV applications.
Pazart, Lionel; Godard-Marceau, Aurélie; Chassagne, Aline; Vivot-Pugin, Aurore; Cretin, Elodie; Amzallag, Edouard; Aubry, Regis
2017-01-01
Background: Ensuring adequate end-of-life care for prisoners is a critical issue. In France, data investigating the impact of laws allowing release of seriously ill prisoners are lacking. Aim: To assess the number and characteristics of prisoners requiring palliative care in French prisons. Design: A prospective, national survey collecting data over a 3-month period. Setting/participants: All healthcare units (n = 190) providing care for prisoners in France. The prison population was 66,698 during the study period. Data collection concerned prisoners requiring end-of-life care, that is, with serious, advanced, progressive, or terminal illness and life expectancy <1 year. Results: Estimated annual prevalence of ill prisoners requiring end-of-life care was 15.2 (confidence interval: 12.5–18.3) per 10,000 prisoners. The observed number of prisoners requiring palliative care (n = 50) was twice as high as the expected age- and sex-standardized number based on the general population and similar to the expected number among persons 10 years older in the free community. In all, 41 of 44 (93%) of identified ill prisoners were eligible for temporary or permanent compassionate release, according to their practitioner. Only 33 of 48 (68%) of ill prisoners requested suspension or reduction in their sentence on medical grounds; half (16/33) received a positive answer. Conclusion: The proportion of prisoners requiring palliative care is higher than expected in the general population. The general frailty and co-existing conditions of prisoners before incarceration and the acceleration of these phenomena in prison could explain this increase in end-of-life situations among prisoners. PMID:28786339
ERIC Educational Resources Information Center
Green, Bill; Bigum, Chris
1993-01-01
It is proposed that evolving technologies and the prevalent media culture are creating a generation of students with very different attitudes, expectations, and capacities. These changes require that society, and education in particular, shift its expectations. (MSE)
Great expectations: teaching ethics to medical students in South Africa.
Behrens, Kevin Gary; Fellingham, Robyn
2014-12-01
Many academic philosophers and ethicists are appointed to teach ethics to medical students. We explore exactly what this task entails. In South Africa the Health Professions Council's curriculum for training medical practitioners requires not only that students be taught to apply ethical theory to issues and be made aware of the legal and regulatory requirements of their profession, it also expects moral formation and the inculcation of professional virtue in students. We explore whether such expectations are reasonable. We defend the claim that physicians ought to be persons of virtuous character, on the grounds of the social contract between society and the profession. We further argue that since the expectations of virtue of health care professionals are reasonable, it is also sound reasoning to expect ethics teachers to try to inculcate such virtues in their students, so far as this is possible. Furthermore, this requires of such teachers that they be suitable role models of ethical practice and virtue, themselves. We claim that this applies to ethics teachers who are themselves not members of the medical profession, too, even though they are not bound by the same social contract as doctors. We conclude that those who accept employment as teachers of ethics to medical students, where as part of their contractual obligation they are expected to inculcate moral values in their students, ought to be prepared to accept their responsibility to be professionally ethical, themselves. © 2013 John Wiley & Sons Ltd.
Adolescents’ Changing Future Expectations Predict the Timing of Adult Role Transitions
Beal, Sarah J.; Crockett, Lisa J.; Peugh, James
2016-01-01
Individual differences in the transition to adulthood are well established. This study examines the extent to which heterogeneity in pathways to adulthood that have been observed in the broader U.S. population are mirrored in adolescents’ expectations regarding when they will experience key adult role transitions (e.g., marriage). Patterns of change in adolescents’ expectations and the relations between their expectations and subsequent role transitions are also explored. Data from 626 youth in Grade 11 (M age = 16), Grade 12, and early adulthood (M age = 23) are analyzed using mover-stayer latent transition analysis. Results indicate three profiles of expected timing, corresponding to youth who anticipate early role entry (i.e., early starters), youth who anticipate earlier entry into employment but no other roles (i.e., employment-focused), and youth who anticipate delays in role transitions favoring increased education (i.e., education-focused). Two-thirds of youths changed their expectations from Grade 11 to 12. Grade 11 and 12 profile membership predicted role transitions in early adulthood. These findings highlight the importance of adolescents’ expectations and changes in expectations across time in shaping entry into adulthood. PMID:27548390
Microwave components for cellular portable radiotelephone
NASA Astrophysics Data System (ADS)
Muraguchi, Masahiro; Aikawa, Masayoshi
1995-09-01
Mobile and personal communication systems are expected to represent a huge market for microwave components in the coming years. A number of components in silicon bipolar, silicon Bi-CMOS, GaAs MESFET, HBT and HEMT technologies are now becoming available for system application. There are tradeoffs among the competing technologies with regard to performance, cost, reliability and time-to-market. This paper describes process selection and the cost and RF performance requirements for microwave semiconductor components in digital cellular and cordless telephones. Furthermore, new circuit techniques developed by NTT are presented.
Fuselage disbond inspection procedure using pulsed thermography
NASA Astrophysics Data System (ADS)
Ashbaugh, Mike; Thompson, Jeffrey G.
2002-05-01
One use of pulsed thermography that has shown promise in aircraft inspection for some time is an inspection for disbonds in metallic structures. The FAA has funded research at Wayne State University in this area and Boeing identified a specific inspection requirement for disbonds on Boeing 747 aircraft. Laboratory and subsequent field testing monitored by the AANC has demonstrated the reliability of this type of inspection. As a result Boeing expects to approve a general fuselage disbond inspection procedure using pulsed thermography in the 2nd Quarter of 2001.
Optimal reconfiguration strategy for a degradable multimodule computing system
NASA Technical Reports Server (NTRS)
Lee, Yann-Hang; Shin, Kang G.
1987-01-01
The present quantitative approach to the problem of reconfiguring a degradable multimodule system assigns some modules to computation and arranges others for reliability. By using expected total reward as the optimality criterion, an active reconfiguration strategy emerges that is based not only on the occurrence of failures but also on the progression of the given mission. This reconfiguration strategy requires specification of the times at which the system should undergo reconfiguration, and the configurations to which the system should change. The optimal reconfiguration problem is converted to integer nonlinear knapsack and fractional programming problems.
ERIC Educational Resources Information Center
Vermont Department of Education, 2004
2004-01-01
Educators from around the state, with the help of The Vermont Institutes, developed Vermont Physical Education Grade Cluster Expectations (GCEs) as a means to identify the physical education content knowledge and skills expected of all students for local assessment required under Act 68. This work was accomplished using the "Vermont's…
40 CFR 80.1449 - What are the Production Outlook Report requirements?
Code of Federal Regulations, 2011 CFR
2011-07-01
... (September 1 for the report due in 2010): (1) The type, or types, of renewable fuel expected to be produced... type of renewable fuel expected to be produced or imported at each facility. (3) The number of RINs expected to be generated by the renewable fuel producer or importer for each type of renewable fuel. (4...
ERIC Educational Resources Information Center
Shockey, Tod L.; Snyder, Karen
2007-01-01
The Maine Learning Results (MLR) expects the state's students in prekindergarten through grade 2 to describe two-dimensional shapes as well as use positional language. Requiring translations of two-dimensional shapes supports this expectation. Students in grades 3-4 are expected to "use transformations," while students in grade 5-8 are…
Effects of Test Expectation on Multiple-Choice Performance and Subjective Ratings
ERIC Educational Resources Information Center
Balch, William R.
2007-01-01
Undergraduates studied the definitions of 16 psychology terms, expecting either a multiple-choice (n = 132) or short-answer (n = 122) test. All students then received the same multiple-choice test, requiring them to recognize the definitions as well as novel examples of the terms. Compared to students expecting a multiple-choice test, those…
Matsubara, Takashi
2017-01-01
Precise spike timing is considered to play a fundamental role in communications and signal processing in biological neural networks. Understanding the mechanism of spike timing adjustment would deepen our understanding of biological systems and enable advanced engineering applications such as efficient computational architectures. However, the biological mechanisms that adjust and maintain spike timing remain unclear. Existing algorithms adopt a supervised approach, which adjusts the axonal conduction delay and synaptic efficacy until the spike timings approximate the desired timings. This study proposes a spike timing-dependent learning model that adjusts the axonal conduction delay and synaptic efficacy in both unsupervised and supervised manners. The proposed learning algorithm approximates the Expectation-Maximization algorithm, and classifies the input data encoded into spatio-temporal spike patterns. Even in the supervised classification, the algorithm requires no external spikes indicating the desired spike timings unlike existing algorithms. Furthermore, because the algorithm is consistent with biological models and hypotheses found in existing biological studies, it could capture the mechanism underlying biological delay learning. PMID:29209191
Approximation algorithms for planning and control
NASA Technical Reports Server (NTRS)
Boddy, Mark; Dean, Thomas
1989-01-01
A control system operating in a complex environment will encounter a variety of different situations, with varying amounts of time available to respond to critical events. Ideally, such a control system will do the best possible with the time available. In other words, its responses should approximate those that would result from having unlimited time for computation, where the degree of the approximation depends on the amount of time it actually has. There exist approximation algorithms for a wide variety of problems. Unfortunately, the solution to any reasonably complex control problem will require solving several computationally intensive problems. Algorithms for successive approximation are a subclass of the class of anytime algorithms, algorithms that return answers for any amount of computation time, where the answers improve as more time is allotted. An architecture is described for allocating computation time to a set of anytime algorithms, based on expectations regarding the value of the answers they return. The architecture described is quite general, producing optimal schedules for a set of algorithms under widely varying conditions.
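The deliberation-scheduling idea described above lends itself to a compact illustration. The sketch below is not the architecture from the abstract; it simply splits a fixed computation budget across a set of anytime algorithms greedily, by marginal expected gain, using hypothetical diminishing-returns performance profiles (all names and profile shapes are assumptions).

```python
# Minimal sketch of deliberation scheduling over anytime algorithms.
# Performance profiles are hypothetical; a real system would estimate them
# from past runs (expected answer value as a function of computation time).
import math

def greedy_allocation(profiles, budget, step=0.1):
    """Split `budget` seconds across algorithms by marginal expected gain.

    profiles: dict name -> callable f(t) giving expected value after t seconds.
    Returns dict name -> allocated seconds.
    """
    alloc = {name: 0.0 for name in profiles}
    remaining = budget
    while remaining > 1e-9:
        # Pick the algorithm whose next time slice buys the largest expected gain.
        best = max(
            profiles,
            key=lambda n: profiles[n](alloc[n] + step) - profiles[n](alloc[n]),
        )
        alloc[best] += step
        remaining -= step
    return alloc

if __name__ == "__main__":
    # Hypothetical diminishing-returns profiles for two planning subproblems.
    profiles = {
        "path_planner": lambda t: 1.0 - math.exp(-2.0 * t),
        "grasp_planner": lambda t: 0.6 * (1.0 - math.exp(-5.0 * t)),
    }
    print(greedy_allocation(profiles, budget=1.0))
```

Greedy allocation of this kind is only sensible for concave (diminishing-returns) profiles, which is what the example assumes.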
Solar power satellite life-cycle energy recovery consideration
NASA Astrophysics Data System (ADS)
Weingartner, S.; Blumenberg, J.
The construction, in-orbit installation, and maintenance of a solar power satellite (SPS) will demand large amounts of energy. As a minimum requirement for an energy-effective power satellite, this amount of energy must be recovered. Energy effectiveness in this sense, resulting in a positive net energy balance, is a prerequisite for a cost-effective power satellite. This paper concentrates on life-cycle energy recovery rather than on monetary aspects. The trade-offs between various power generation systems (different types of solar cells, solar dynamic), various construction and installation strategies (using terrestrial or extra-terrestrial resources), and the expected/required lifetime of the SPS are reviewed. The presented work is based on a 2-year study performed at the Technical University of Munich. The study showed that most of the energy needed to realize a solar power satellite goes into the production of the solar power components (up to 65%), especially solar cell production, whereas transport into orbit accounts for on the order of 20% and the receiving station on Earth (rectenna) requires about 15% of the total energy investment. The energetic amortization time, i.e. the time the SPS has to be operational to pay back the energy needed for its production, installation, and operation, is about two years.
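The quoted energy breakdown and the roughly two-year amortization figure can be illustrated with simple bookkeeping. In the sketch below, the 65%/20%/15% split follows the abstract, but the absolute invested energy and the delivered power are placeholder values chosen only to show the calculation, not results from the Munich study.

```python
# Back-of-the-envelope energetic amortization time for a solar power satellite.
# The 65% / 20% / 15% split follows the abstract; the total energy investment
# and delivered power below are illustrative placeholders, not study data.

def amortization_time_years(total_invested_energy_gwh, delivered_power_gw,
                            availability=1.0):
    """Years of operation needed to pay back the invested energy."""
    hours_per_year = 8766.0
    energy_per_year_gwh = delivered_power_gw * hours_per_year * availability
    return total_invested_energy_gwh / energy_per_year_gwh

if __name__ == "__main__":
    invested = 100_000.0  # GWh, hypothetical total energy investment
    breakdown = {"solar power components": 0.65,
                 "transport to orbit": 0.20,
                 "rectenna": 0.15}
    for part, share in breakdown.items():
        print(f"{part}: {share * invested:,.0f} GWh")
    # A hypothetical 5 GW satellite paying back 100,000 GWh needs a bit over 2 years.
    print(f"amortization: {amortization_time_years(invested, 5.0):.1f} years")
```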
Collaborative Manufacturing for Small-Medium Enterprises
NASA Astrophysics Data System (ADS)
Irianto, D.
2016-02-01
Manufacturing systems involve decisions concerning production processes, capacity, planning, and control. In MTO manufacturing systems, strategic decisions concerning fulfilment of customer requirements, manufacturing cost, and delivery due date are the most important. In order to accelerate the decision-making process, research is required on the decision-making structure used when receiving orders and sequencing activities under limited capacity. An effective decision-making process is typically required by small-medium component and tool makers acting as supporting industries to large industries. On one side, metal small-medium enterprises are expected to produce parts, components, or tools (i.e. jigs, fixtures, molds, and dies) with high precision, low cost, and exact delivery time. On the other side, a metal small-medium enterprise may have a weak bargaining position due to aspects such as low production capacity, a limited budget for material procurement, and limited high-precision machines and equipment. Instead of receiving orders exclusively, a small-medium enterprise can collaborate with other small-medium enterprises in order to fulfill requirements for high quality, low manufacturing cost, and just-in-time delivery. Small-medium enterprises can share their best capabilities to form effective supporting industries. An independent body, such as a university community service unit, can take the role of collaboration manager. The Laboratory of Production Systems at Bandung Institute of Technology has implemented shared manufacturing systems for small-medium enterprise collaboration.
A scalable quantum computer with ions in an array of microtraps
Cirac; Zoller
2000-04-06
Quantum computers require the storage of quantum information in a set of two-level systems (called qubits), the processing of this information using quantum gates and a means of final readout. So far, only a few systems have been identified as potentially viable quantum computer models--accurate quantum control of the coherent evolution is required in order to realize gate operations, while at the same time decoherence must be avoided. Examples include quantum optical systems (such as those utilizing trapped ions or neutral atoms, cavity quantum electrodynamics and nuclear magnetic resonance) and solid state systems (using nuclear spins, quantum dots and Josephson junctions). The most advanced candidates are the quantum optical and nuclear magnetic resonance systems, and we expect that they will allow quantum computing with about ten qubits within the next few years. This is still far from the numbers required for useful applications: for example, the factorization of a 200-digit number requires about 3,500 qubits, rising to 100,000 if error correction is implemented. Scalability of proposed quantum computer architectures to many qubits is thus of central importance. Here we propose a model for an ion trap quantum computer that combines scalability (a feature usually associated with solid state proposals) with the advantages of quantum optical systems (in particular, quantum control and long decoherence times).
Scientific Application Requirements for Leadership Computing at the Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahern, Sean; Alam, Sadaf R; Fahey, Mark R
2007-12-01
The Department of Energy's Leadership Computing Facility, located at Oak Ridge National Laboratory's National Center for Computational Sciences, recently polled scientific teams that had large allocations at the center in 2007, asking them to identify computational science requirements for future exascale systems (capable of an exaflop, or 10^18 floating point operations per second). These requirements are necessarily speculative, since an exascale system will not be realized until the 2015-2020 timeframe, and are expressed where possible relative to a recent petascale requirements analysis of similar science applications [1]. Our initial findings, which beg further data collection, validation, and analysis, did in fact align with many of our expectations and existing petascale requirements, yet they also contained some surprises, complete with new challenges and opportunities. First and foremost, the breadth and depth of science prospects and benefits on an exascale computing system are striking. Without a doubt, they justify a large investment, even with its inherent risks. The possibilities for return on investment (by any measure) are too large to let us ignore this opportunity. The software opportunities and challenges are enormous. In fact, as one notable computational scientist put it, the scale of questions being asked at the exascale is tremendous and the hardware has gotten way ahead of the software. We are in grave danger of failing because of a software crisis unless concerted investments and coordinating activities are undertaken to reduce and close this hardware-software gap over the next decade. Key to success will be a rigorous requirement for natural mapping of algorithms to hardware in a way that complements (rather than competes with) compilers and runtime systems. The level of abstraction must be raised, and more attention must be paid to functionalities and capabilities that incorporate intent into data structures, are aware of memory hierarchy, possess fault tolerance, exploit asynchronism, and are power-consumption aware. On the other hand, we must also provide application scientists with the ability to develop software without having to become experts in the computer science components. Numerical algorithms are scattered broadly across science domains, with no one particular algorithm being ubiquitous and no one algorithm going unused. Structured grids and dense linear algebra continue to dominate, but other algorithm categories will become more common. A significant increase is projected for Monte Carlo algorithms, unstructured grids, sparse linear algebra, and particle methods, and a relative decrease is foreseen in fast Fourier transforms. These projections reflect the expectation of much higher architecture concurrency and the resulting need for very high scalability. The new algorithm categories that application scientists expect to be increasingly important in the next decade include adaptive mesh refinement, implicit nonlinear systems, data assimilation, agent-based methods, parameter continuation, and optimization. The attributes of leadership computing systems expected to increase most in priority over the next decade are (in order of importance) interconnect bandwidth, memory bandwidth, mean time to interrupt, memory latency, and interconnect latency. The attributes expected to decrease most in relative priority are disk latency, archival storage capacity, disk bandwidth, wide area network bandwidth, and local storage capacity.
These choices by application developers reflect the expected needs of applications or the expected reality of available hardware. One interpretation is that the increasing priorities reflect the desire to increase computational efficiency to take advantage of increasing peak flops [floating point operations per second], while the decreasing priorities reflect the expectation that computational efficiency will not increase. Per-core requirements appear to be relatively static, while aggregate requirements will grow with the system. This projection is consistent with a relatively small increase in performance per core with a dramatic increase in the number of cores. Leadership system software must face and overcome issues that will undoubtedly be exacerbated at the exascale. The operating system (OS) must be as unobtrusive as possible and possess more stability, reliability, and fault tolerance during application execution. As applications will be more likely at the exascale to experience loss of resources during an execution, the OS must mitigate such a loss with a range of responses. New fault tolerance paradigms must be developed and integrated into applications. Just as application input and output must not be an afterthought in hardware design, job management, too, must not be an afterthought in system software design. Efficient scheduling of those resources will be a major obstacle faced by leadership computing centers at the exascale.
Towards lexicographic multi-objective linear programming using grossone methodology
NASA Astrophysics Data System (ADS)
Cococcioni, Marco; Pappalardo, Massimo; Sergeyev, Yaroslav D.
2016-10-01
Lexicographic Multi-Objective Linear Programming (LMOLP) problems can be solved in two ways: preemptive and nonpreemptive. The preemptive approach requires the solution of a series of LP problems, with changing constraints (each time the next objective is added, a new constraint appears). The nonpreemptive approach is based on a scalarization of the multiple objectives into a single-objective linear function by a weighted combination of the given objectives. It requires the specification of a set of weights, which is not straightforward and can be time consuming. In this work we present both mathematical and software ingredients necessary to solve LMOLP problems using a recently introduced computational methodology (allowing one to work numerically with infinities and infinitesimals) based on the concept of grossone. The ultimate goal of such an attempt is an implementation of a simplex-like algorithm, able to solve the original LMOLP problem by solving only one single-objective problem and without the need to specify finite weights. The expected advantages are therefore obvious.
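For contrast with the grossone-based approach, the classical preemptive scheme the abstract refers to can be sketched directly: solve one LP per objective in priority order, each time adding a constraint that pins the previous objective at (approximately) its optimum. The toy problem data below are illustrative, and scipy is assumed to be available; this is the standard preemptive method, not the simplex-like grossone algorithm developed in the paper.

```python
# Sketch of the preemptive lexicographic LP scheme: one LP per objective,
# each time adding a constraint that fixes the previous objective at its
# optimal value. Toy data; linprog minimizes, so objectives are negated to maximize.
import numpy as np
from scipy.optimize import linprog

def lexicographic_lp(objectives, A_ub, b_ub, bounds, tol=1e-8):
    A_ub = [list(row) for row in A_ub]
    b_ub = list(b_ub)
    x = None
    for c in objectives:                      # objectives in priority order
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        if not res.success:
            raise RuntimeError(res.message)
        x = res.x
        # Freeze this objective at (almost) its optimum: c @ x <= opt + tol.
        A_ub.append(list(c))
        b_ub.append(res.fun + tol)
    return x

if __name__ == "__main__":
    # Maximize x1 first, then x2, subject to x1 + x2 <= 4 and x1 <= 3.
    objs = [np.array([-1.0, 0.0]), np.array([0.0, -1.0])]
    print(lexicographic_lp(objs, [[1, 1], [1, 0]], [4, 3], [(0, None)] * 2))
    # Expected result close to [3, 1].
```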
Quality assurance: Importance of systems and standard operating procedures
Manghani, Kishu
2011-01-01
It is mandatory for sponsors of clinical trials and contract research organizations alike to establish, manage and monitor their quality control and quality assurance systems and their integral standard operating procedures and other quality documents to provide high-quality products and services to fully satisfy customer needs and expectations. Quality control and quality assurance systems together constitute the key quality systems. Quality control and quality assurance are parts of quality management. Quality control is focused on fulfilling quality requirements, whereas quality assurance is focused on providing confidence that quality requirements are fulfilled. The quality systems must be commensurate with the Company business objectives and business model. Top management commitment and its active involvement are critical in order to ensure at all times the adequacy, suitability, effectiveness and efficiency of the quality systems. Effective and efficient quality systems can promote timely registration of drugs by eliminating waste and the need for rework with overall financial and social benefits to the Company. PMID:21584180
The MST radar technique: Requirements for operational weather forecasting
NASA Technical Reports Server (NTRS)
Larsen, M. F.
1983-01-01
There is a feeling that the accuracy of mesoscale forecasts for spatial scales of less than 1000 km and time scales of less than 12 hours can be improved significantly if resources are applied to the problem in an intensive effort over the next decade. Since the most dangerous and damaging types of weather occur at these scales, there are major advantages to be gained if such a program is successful. The interest in improving short term forecasting is evident. The technology at the present time is sufficiently developed, both in terms of new observing systems and the computing power to handle the observations, to warrant an intensive effort to improve stormscale forecasting. An assessment is given of the extent to which the so-called MST radar technique fulfills the requirements for an operational mesoscale observing network, and the improvements in various types of forecasting that could be expected if such a network were put into operation are delineated.
Dai, Wei; Fu, Caroline; Khant, Htet A.; Ludtke, Steven J.; Schmid, Michael F.; Chiu, Wah
2015-01-01
Advances in electron cryo-tomography have provided a new opportunity to visualize the internal 3D structures of a bacterium. An electron microscope equipped with Zernike phase contrast optics produces images with dramatically increased contrast compared to images obtained by conventional electron microscopy. Here we describe a protocol to apply Zernike phase plate technology for acquiring electron tomographic tilt series of cyanophage-infected cyanobacterial cells embedded in ice, without staining or chemical fixation. We detail the procedures for aligning and assessing phase plates for data collection, and methods to obtain 3D structures of cyanophage assembly intermediates in the host, by subtomogram alignment, classification and averaging. Acquiring three to four tomographic tilt series takes approximately 12 h on a JEM2200FS electron microscope. We expect this time requirement to decrease substantially as the technique matures. Time required for annotation and subtomogram averaging varies widely depending on the project goals and data volume. PMID:25321408
Indium antimonide large-format detector arrays
NASA Astrophysics Data System (ADS)
Davis, Mike; Greiner, Mark
2011-06-01
Large-format infrared imaging sensors are required to achieve simultaneously high-resolution and wide field-of-view image data. Infrared sensors generally must be cooled from room temperature to cryogenic temperatures in less than 10 min, thousands of times during their lifetime. The challenge is to remove mechanical stress, which is due to different materials with different coefficients of expansion, over a very wide temperature range and, at the same time, provide high-sensitivity and high-resolution image data. These challenges are met by developing a hybrid in which the indium antimonide detector elements (pixels) are unconnected islands that essentially float on a silicon substrate and form a near-perfect match to the silicon read-out circuit. Since the pixels are unconnected and isolated from each other, the array is reticulated. This paper shows that the front-side-illuminated, reticulated-element indium antimonide focal planes developed at L-3 Cincinnati Electronics are robust, approach the background-limited sensitivity limit, and provide the resolution expected of the reticulated pixel array.
Basch, Ethan; Pugh, Stephanie L; Dueck, Amylou C; Mitchell, Sandra A; Berk, Lawrence; Fogh, Shannon; Rogak, Lauren J; Gatewood, Marcha; Reeve, Bryce B; Mendoza, Tito R; O’Mara, Ann; Denicoff, Andrea; Minasian, Lori; Bennett, Antonia V; Setser, Ann; Schrag, Deborah; Roof, Kevin; Moore, Joan K; Gergel, Thomas; Stephans, Kevin; Rimner, Andreas; DeNittis, Albert; Bruner, Deborah Watkins
2017-01-01
Purpose To assess the feasibility of measuring symptomatic adverse events (AEs) in a multicenter clinical trial using the National Cancer Institute’s Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE). Methods and Materials Patients enrolled in Trial XXXX (XXXX) were asked to self-report 53 PRO-CTCAE items representing 30 symptomatic AEs at 6 time points (baseline; weekly x4 during treatment; 12-weeks post-treatment). Reporting was conducted via wireless tablet computers in clinic waiting areas. Compliance was defined as the proportion of visits when an expected PRO-CTCAE assessment was completed. Results Among 226 study sites participating in Trial XXXX, 100% completed 35-minute PRO-CTCAE training for clinical research associates (CRAs); 80 sites enrolled patients of which 34 (43%) required tablet computers to be provided. All 152 patients in Trial XXXX agreed to self-report using the PRO-CTCAE (median age 66; 47% female; 84% white). Median time for CRAs to learn the system was 60 minutes (range 30–240), and median time for CRAs to teach a patient to self-report was 10 minutes (range 2–60). Compliance was high, particularly during active treatment when patients self-reported at 86% of expected time points, although compliance was lower post-treatment (72%). Common reasons for non-compliance were institutional errors such as forgetting to provide computers to participants; patients missing clinic visits; internet connectivity; and patients feeling “too sick”. Conclusions Most patients enrolled in a multicenter chemoradiotherapy trial were willing and able to self-report symptomatic adverse events at visits using tablet computers. Minimal effort was required by local site staff to support this system. The observed causes of missing data may be obviated by allowing patients to self-report electronically between-visits, and by employing central compliance monitoring. These approaches are being incorporated into ongoing studies. PMID:28463161
Time and outcome framing in intertemporal tradeoffs.
Scholten, Marc; Read, Daniel
2013-07-01
A robust anomaly in intertemporal choice is the delay-speedup asymmetry: Receipts are discounted more, and payments are discounted less, when delayed than when expedited over the same interval. We developed 2 versions of the tradeoff model (Scholten & Read, 2010) to address such situations, in which an outcome is expected at a given time but then its timing is changed. The outcome framing model generalizes the approach taken by the hyperbolic discounting model (Loewenstein & Prelec, 1992): Not obtaining a positive outcome when expected is a worse than expected state, to which people are over-responsive, or hypersensitive, and not incurring a negative outcome when expected is a better than expected state, to which people are under-responsive, or hyposensitive. The time framing model takes a new approach: Delaying a positive outcome or speeding up a negative one involves a loss of time to which people are hypersensitive, and speeding up a positive outcome or delaying a negative one involves a gain of time to which people are hyposensitive. We compare the models on their quantitative predictions of indifference data from matching and preference data from choice. The time framing model systematically outperforms the outcome framing model. PsycINFO Database Record (c) 2013 APA, all rights reserved.
5 CFR 470.301 - Program expectations.
Code of Federal Regulations, 2010 CFR
2010-01-01
....301 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PERSONNEL MANAGEMENT RESEARCH PROGRAMS AND DEMONSTRATIONS PROJECTS Regulatory Requirements Pertaining to Demonstration Projects § 470.301 Program expectations. (a) Demonstration projects permit the Office of Personnel...
Providing support to nursing students in the clinical environment: a nursing standard requirement.
Anderson, Carina; Moxham, Lorna; Broadbent, Marc
2016-10-01
This discussion paper poses the question 'What enables or deters Registered Nurses to take up their professional responsibility to support undergraduate nursing students through the provision of clinical education?'. Embedded within many nursing standards are expectations that Registered Nurses provide support and professional development to undergraduate nursing students undertaking clinical placements. Expectations within nursing standards that Registered Nurses provide support and professional development to nursing students are important because nursing students depend on Registered Nurses to help them become competent practitioners. Contributing factors that enable or deter Registered Nurses from fulfilling this expectation to support nursing students in their clinical learning include workloads, preparedness for the teaching role, confidence in teaching, and awareness of the competency requirement to support students. Factors exist which can enable or deter Registered Nurses from carrying out the licence requirement to provide clinical education and support to nursing students.
Possible Improvements to MCNP6 and its CEM/LAQGSM Event-Generators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mashnik, Stepan Georgievich
2015-08-04
This report is intended for the MCNP6 developers and sponsors of MCNP6. It presents a set of suggested possible future improvements to MCNP6 and to its CEM03.03 and LAQGSM03.03 event-generators. A few suggested modifications of MCNP6 are quite simple, aimed at avoiding possible problems with running MCNP6 on various computers; these changes are not expected to change or improve any results, but should make the use of MCNP6 easier, and they are expected to require limited man-power resources. On the other hand, several other suggested improvements require serious further development of nuclear reaction models and are expected to improve significantly the predictive power of MCNP6 for a number of nuclear reactions; such developments, however, require several years of work by real experts on nuclear reactions.
A forward view on reliable computers for flight control
NASA Technical Reports Server (NTRS)
Goldberg, J.; Wensley, J. H.
1976-01-01
The requirements for fault-tolerant computers for flight control of commercial aircraft are examined; it is concluded that the reliability requirements far exceed those typically quoted for space missions. Examination of circuit technology and alternative computer architectures indicates that the desired reliability can be achieved with several different computer structures, though there are obvious advantages to those that are more economic, more reliable, and, very importantly, more certifiable as to fault tolerance. Progress in this field is expected to bring about better computer systems that are more rigorously designed and analyzed even though computational requirements are expected to increase significantly.
Separation of Time-Based and Trial-Based Accounts of the Partial Reinforcement Extinction Effect
Bouton, Mark E.; Woods, Amanda M.; Todd, Travis P.
2013-01-01
Two appetitive conditioning experiments with rats examined time-based and trial-based accounts of the partial reinforcement extinction effect (PREE). In the PREE, the loss of responding that occurs in extinction is slower when the conditioned stimulus (CS) has been paired with a reinforcer on some of its presentations (partially reinforced) instead of every presentation (continuously reinforced). According to a time-based or “time-accumulation” view (e.g., Gallistel & Gibbon, 2000), the PREE occurs because the organism has learned in partial reinforcement to expect the reinforcer after a larger amount of time has accumulated in the CS over trials. In contrast, according to a trial-based view (e.g., Capaldi, 1967), the PREE occurs because the organism has learned in partial reinforcement to expect the reinforcer after a larger number of CS presentations. Experiment 1 used a procedure that equated partially- and continuously-reinforced groups on their expected times to reinforcement during conditioning. A PREE was still observed. Experiment 2 then used an extinction procedure that allowed time in the CS and the number of trials to accumulate differentially through extinction. The PREE was still evident when responding was examined as a function of expected time units to the reinforcer, but was eliminated when responding was examined as a function of expected trial units to the reinforcer. There was no evidence that the animal responded according to the ratio of time accumulated during the CS in extinction over the time in the CS expected before the reinforcer. The results thus favor a trial-based account over a time-based account of extinction and the PREE. PMID:23962669
Separation of time-based and trial-based accounts of the partial reinforcement extinction effect.
Bouton, Mark E; Woods, Amanda M; Todd, Travis P
2014-01-01
Two appetitive conditioning experiments with rats examined time-based and trial-based accounts of the partial reinforcement extinction effect (PREE). In the PREE, the loss of responding that occurs in extinction is slower when the conditioned stimulus (CS) has been paired with a reinforcer on some of its presentations (partially reinforced) instead of every presentation (continuously reinforced). According to a time-based or "time-accumulation" view (e.g., Gallistel and Gibbon, 2000), the PREE occurs because the organism has learned in partial reinforcement to expect the reinforcer after a larger amount of time has accumulated in the CS over trials. In contrast, according to a trial-based view (e.g., Capaldi, 1967), the PREE occurs because the organism has learned in partial reinforcement to expect the reinforcer after a larger number of CS presentations. Experiment 1 used a procedure that equated partially and continuously reinforced groups on their expected times to reinforcement during conditioning. A PREE was still observed. Experiment 2 then used an extinction procedure that allowed time in the CS and the number of trials to accumulate differentially through extinction. The PREE was still evident when responding was examined as a function of expected time units to the reinforcer, but was eliminated when responding was examined as a function of expected trial units to the reinforcer. There was no evidence that the animal responded according to the ratio of time accumulated during the CS in extinction over the time in the CS expected before the reinforcer. The results thus favor a trial-based account over a time-based account of extinction and the PREE. This article is part of a Special Issue entitled: Associative and Temporal Learning. Copyright © 2013 Elsevier B.V. All rights reserved.
A Potential Operational CryoSat Follow-on Mission Concept and Design
NASA Astrophysics Data System (ADS)
Cullen, R.
2015-12-01
CryoSat was planned as a 3-year mission with clear objectives: to allow assessment of rates of change of thickness in the land and marine ice fields with reduced uncertainties relative to other, non-dedicated missions. Although CryoSat suffered a launch failure in Oct 2005, the mission was recovered with the launch of CryoSat-2 in April 2010. The nominal mission has now been completed, all mission requirements have been fulfilled, and CryoSat has been shown to be highly successful as a dedicated polar ice sheet measurement system, as demonstrated by nearly 200 peer-reviewed publications within the first four years after launch. Following the completion of the nominal mission in Oct 2013, the platform was shown to be in good health and, with scientific backing provided by the ESA Earth Science Advisory Committee (ESAC), the mission was extended until Feb 2017 by the ESA Programme Board for Earth Observation. Though not designed to provide data for science and operational services beyond its original mission requirements, a number of services have been developed for exploitation, and these are expected to increase over the next few years. Services cover a number of aspects of the land and marine ice fields, in addition to complementary activities covering glacial monitoring, inland water, and coastal and open-ocean surface topography science, in which CryoSat has demonstrated world-leading advances. This paper will present the overall concept for a potential low-cost follow-on to the CryoSat mission with the objective of providing continuity of the existing CryoSat-based data sets, i.e., longer-term science and operational services that cannot be provided by the existing Copernicus complement of satellites. This is, in part, due to CryoSat's high-inclination (92°) drifting orbit and state-of-the-art Synthetic Aperture Interferometer Radar Altimeter (SIRAL). In addition, further improvements in performance are expected through use of the instrument timing and digital hardware developments applied in the Sentinel-6/Jason-CS Poseidon-4 design. It is expected that the mission will also provide data for global ocean services complementary to those of the Sentinel-3 and Sentinel-6 missions. Under current planning, development of the potential follow-on is expected to commence during 2016, with launch in the 2021 time frame.
Study of Allocation Guaranteed Time Slot Wireless Body Area Networks Based on IEEE 802.15.4
NASA Astrophysics Data System (ADS)
Yundra, E.; Harsono, G. D.
2018-04-01
This paper aims to determine the size of the Guaranteed Time Slot (GTS) in the superframe structure required for each sensor, as well as to assess the performance of the resized-GTS system compared with the standard GTS of IEEE 802.15.4. The article proposes a scheme to improve IEEE 802.15.4 medium access control, called allocation Guaranteed Time Slot (ALGATIS). ALGATIS is expected to allocate guaranteed time slots effectively to the requesting sensors; it adjusts the length of each slot within the superframe duration based on the length of the packet data. The article presents a simulation experiment of IEEE 802.15.4, in particular for a star network, to predict network throughput and average energy consumption. The simulation experiments show that the performance of ALGATIS is better than that of the IEEE 802.15.4 standard in terms of network throughput and average energy consumption.
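A toy sketch of the slot-resizing idea attributed to ALGATIS follows: the contention-free period is divided among sensors in proportion to their requested payload sizes, contrasted with a standard-like allocation that rounds every request up to whole base slots. All numeric parameters (payload sizes, slot length, bytes per slot) are illustrative assumptions, not values from the paper or the IEEE 802.15.4 specification.

```python
# Toy comparison: payload-proportional GTS allocation vs. whole-slot allocation.
# All parameters are illustrative placeholders.

def proportional_gts(requests_bytes, cfp_duration_ms):
    """Divide a contention-free period in proportion to requested payload sizes."""
    total = sum(requests_bytes.values())
    return {node: cfp_duration_ms * size / total
            for node, size in requests_bytes.items()}

def whole_slot_gts(requests_bytes, slot_ms, bytes_per_slot):
    """Standard-like allocation: round each request up to whole base slots."""
    return {node: slot_ms * -(-size // bytes_per_slot)   # ceiling division
            for node, size in requests_bytes.items()}

if __name__ == "__main__":
    reqs = {"ecg": 60, "temp": 8, "accel": 30}   # hypothetical payloads in bytes
    print(proportional_gts(reqs, cfp_duration_ms=10.0))
    print(whole_slot_gts(reqs, slot_ms=2.0, bytes_per_slot=32))
```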
Holbein, Christina E; Zebracki, Kathy; Bechtel, Colleen F; Papadakis, Jaclyn Lennon; Bruno, Elizabeth Franks; Holmbeck, Grayson N
2016-01-01
Aim To assess changes over time in parents' expectations of adult milestone achievement (college attendance, full-time job attainment, independent living, marriage, parenthood) for young people with spina bifida, to examine how expectancies relate to actual milestone achievement, and to compare milestone achievement in emerging adults with spina bifida with that of peers with typical development. Method Sixty-eight families of children with spina bifida (mean=8.34y, 37 male, 31 female) and 68 families of children with typical development (mean=8.49y, 37 male, 31 female) participated at Time 1. At all subsequent timepoints, parents of young people with spina bifida were asked to rate their expectations of emerging adulthood milestone achievement. At Time 7, when participants were 22 to 23 years old, milestone achievement was assessed. Results Parents of young people with spina bifida lowered their expectations over time for most milestones; parents of children with higher cognitive ability reported decreases of lower magnitude. Parent expectancies were optimistic and unrelated to actual milestone achievement. Emerging adults with spina bifida were less likely than individuals with typical development to achieve all milestones. Interpretation Optimistic parental expectations may be adaptive for children with spina bifida and their families, although it is important for families to set realistic goals. Healthcare providers serve a key role in helping families of young people with spina bifida prepare for emerging adulthood. PMID:27651215
An Anticipatory Model of Cavitation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allgood, G.O.; Dress, W.B., Jr.; Hylton, J.O.
1999-04-05
The Anticipatory System (AS) formalism developed by Robert Rosen provides some insight into the problem of embedding intelligent behavior in machines. AS emulates the anticipatory behavior of biological systems. AS bases its behavior on its expectations about the near future, and those expectations are modified as the system gains experience. The expectation is based on an internal model that is drawn from an appeal to physical reality. To be adaptive, the model must be able to update itself. To be practical, the model must run faster than real-time. The need for a physical model and the requirement that the model execute at extreme speeds have held back the application of AS to practical problems. Two recent advances make it possible to consider the use of AS for practical intelligent sensors. First, advances in transducer technology make it possible to obtain previously unavailable data from which a model can be derived. For example, acoustic emissions (AE) can be fed into a Bayesian system identifier that enables the separation of a weak characterizing signal, such as the signature of pump cavitation precursors, from a strong masking signal, such as a pump vibration feature. The second advance is the development of extremely fast, but inexpensive, digital signal processing hardware on which it is possible to run an adaptive Bayesian-derived model faster than real-time. This paper reports the investigation of an AS using a model of cavitation based on hydrodynamic principles and Bayesian analysis of data from high-performance AE sensors.
Management strategies of mothers of school-age children with autism: implications for practice.
Joosten, Annette V; Safe, Anneleise P
2014-08-01
Mothering children with autism results in mothers spending more time on daily tasks as well as managing the disorder. The need for mothers to self-manage often increases when the child is school aged. Mothers develop strategies, and occupational therapists and other health professionals rely on or expect mothers to be involved in meeting the extra needs of their children with autism and other family members. Little is known about the strategies adopted by the mothers. The aim of this study was to explore the strategies mothers used to manage their roles and emotions, and their child's behaviours. In-depth individual interviews were conducted with seven mothers, and data were analysed in this qualitative study using phenomenological analysis. Findings revealed that the mothers had adopted strategies to manage their roles, their emotions and their child's behaviour. However, the strategies were often shaped by the expectations of others or circumstances beyond their control, and at times added further to their stress. Mothers of children with autism developed strategies to self-manage their lives and their child's disorder. However, even when these strategies were effective, they sometimes placed further stress on the mothers. The mothers provided insights into how they coped but need help to consider the support they require, and therapists need to consider the pressures of expecting mothers to self-manage their child's disorder, their own lives and their family. Family-centred practice emphasising collaboration with mothers needs to be maintained with school-aged children. © 2014 Occupational Therapy Australia.
Critical care trainees' career goals and needs: A Canadian survey.
St-Onge, Maude; Mandelzweig, Keren; Marshall, John C; Scales, Damon C; Granton, John
2014-01-01
For training programs to meet the needs of trainees, an understanding of their career goals and expectations is required. Canadian critical care medicine (CCM) trainees were surveyed to understand their career goals in terms of clinical work, research, teaching, administration and management; and to identify their perceptions regarding the support they need to achieve their goals. The online survey was sent to all trainees registered in a Canadian adult or pediatric CCM program. It documented the participants' demographics; their career expectations; the perceived barriers and enablers to achieve their career goals; and their perceptions relating to their chances of developing a career in different areas. A response rate of 85% (66 of 78) was obtained. The majority expected to work in an academic centre. Only approximately one-third (31%) estimated their chances of obtaining a position in CCM as >75%. The majority planned to devote 25% to 75% of their time performing clinical work and <25% in education, research or administration. The trainees perceived that there were limited employment opportunities. Networking and having specialized expertise were mentioned as being facilitators for obtaining employment. They expressed a need for more protected time, resources and mentorship for nonclinical tasks during training. CCM trainees perceived having only limited support to help them to achieve their career goals and anticipate difficulties in obtaining successful employment. They identified several gaps that could be addressed by training programs, including more mentoring in the areas of research, education and administration.
NASA Astrophysics Data System (ADS)
Aviles-Espinosa, Rodrigo; Santos, Susana I. C. O.; Brodschelm, Andreas; Kaenders, Wilhelm G.; Alonso-Ortega, Cesar; Artigas, David; Loza-Alvarez, Pablo
2011-03-01
In-vivo microscopic long-term time-lapse studies require controlled imaging conditions to preserve sample viability. It is therefore crucial to meet specific exposure conditions, as these may limit the applicability of established techniques. In this work we demonstrate the use of third harmonic generation (THG) microscopy for long-term time-lapse three-dimensional (4D) studies in living Caenorhabditis elegans embryos employing a 1550 nm femtosecond fiber laser. We take advantage of the fact that THG requires only the existence of interfaces (a change in the refractive index or in the χ(3) nonlinear coefficient) to generate signal; therefore, no markers are required. In addition, by using this wavelength the emitted THG signal is generated at visible wavelengths (516 nm), enabling the use of standard collection optics and detectors operating near their maximum efficiency. This enables a reduction of the incident light intensity at the sample plane, allowing the sample to be imaged for several hours. THG signal is obtained through all embryo development stages, providing different tissue/structure information. By means of control samples, we demonstrate that the expected water absorption at this wavelength does not severely compromise sample viability. Certainly, this technique reduces the complexity of sample preparation (i.e. genetic modification) required by established linear and nonlinear fluorescence-based techniques. We demonstrate the non-invasiveness, reduced specimen interference, and strong potential of this particular wavelength for performing long-term 4D recordings.
Sorg, Heiko; Knobloch, Karsten
2012-01-01
First quantitative evaluation of the requirements for promotion to associate professor (AP) at German medical faculties. Analysis of the AP regulations of German medical faculties according to a validated scoring system, which was adapted for this study. The overall score for the AP requirements at 35 German medical faculties was 13.5±0.6 of 20 possible points (95% confidence interval 12.2-14.7). More than 88% of the AP regulations demand sufficient performance in teaching and research with adequate scientific publication. Furthermore, 83% of the faculties expect an expert review of the candidate's performance. Conference presentations required as an assistant professor, as well as a reduction of the minimum time as an assistant professor, play only minor roles. The requirements for assistant professors to be nominated as associate professor at German medical faculties are high, with only a small range. In detail, however, large heterogeneity still exists, which hinders equal opportunities and career possibilities. These data might be used for the ongoing objective discussion.
Parametric Evaluation of Interstellar Exploration Mission Concepts
NASA Technical Reports Server (NTRS)
Adams, Robert B.
2017-01-01
One persistent difficulty in evaluating the myriad advanced propulsion concepts proposed over the last 60 years is making a true apples-to-apples comparison of the expected gain in performance. This analysis is complicated by numerous factors, including multiple missions of interest to the advanced propulsion community, the lack of a credible closed-form solution for 'medium thrust' trajectories, and the lack of detailed design data, for most proposed concepts, that would lend credibility to engine performance estimates. This paper describes a process for making fair comparisons of different propulsion concepts for multiple missions over a wide range of performance values; the figure below illustrates this process. The paper describes the process in detail and outlines the status so far in compiling the required data. Parametric data for several missions are calculated and plotted against specific power-specific impulse scatter plots of expected propulsion system performance. The overlay between required performance, as defined by the trajectory parametrics, and expected performance, as defined in the literature for major categories of propulsion systems, clearly defines which propulsion systems are the most apt for a given mission. The application of the Buckingham Pi theorem to general parameters for interstellar exploration (mission time, mass, specific impulse, specific power, distance, propulsion source energy/mass, etc.) yields a number of dimensionless variables. The relationships of these variables can then be explored before application to a particular mission. As in the fields of fluid mechanics and heat transfer, the use of the Buckingham Pi theorem results in new variables that allow apples-to-apples comparisons.
Global Tropospheric Noise Maps for InSAR Observations
NASA Astrophysics Data System (ADS)
Yun, S. H.; Hensley, S.; Agram, P. S.; Chaubell, M.; Fielding, E. J.; Pan, L.
2014-12-01
Radio waves' differential phase delay variation through the troposphere is the largest error source in Interferometric Synthetic Aperture Radar (InSAR) measurements, and water vapor variability in the troposphere is known to be the dominant factor. We use the precipitable water vapor (PWV) products from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) sensors mounted on the Terra and Aqua satellites to produce tropospheric noise maps for InSAR. We estimate the slope and y-intercept of the power spectral density curve of MODIS PWV and calculate the structure function to estimate the expected tropospheric noise level as a function of distance. The results serve two purposes: 1) to provide guidance on the expected covariance matrix for geophysical modeling, and 2) to provide a quantitative basis for the science Level-1 requirements of the planned NASA-ISRO L-band SAR mission (NISAR). We populate lookup tables of such power spectrum parameters derived from each 1-by-1 degree tile of global coverage. The MODIS data were retrieved from the OSCAR (Online Services for Correcting Atmosphere in Radar) server. Users will be able to use the lookup tables to calculate the expected tropospheric noise level of any date of MODIS data at any distance scale. Such calculations can be used for constructing a covariance matrix for geophysical modeling, or for building statistics to support InSAR missions' requirements. For example, about 74% of the world had an InSAR tropospheric noise level (along a radar line of sight for an incidence angle of 40 degrees) of 2 cm or less at the 50 km distance scale during the time period 2010/01/01 - 2010/01/09.
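The structure-function calculation described above can be sketched in a few lines. The snippet below estimates an empirical one-dimensional structure function D(r) = <[d(x+r) - d(x)]^2> along one grid axis of a delay field; the synthetic PWV field, the pixel size, the PWV-to-wet-delay factor (~6.2), and the 40-degree line-of-sight projection are all stated assumptions for illustration, not the OSCAR/MODIS processing chain.

```python
# Sketch: empirical structure function of a gridded water-vapor delay field,
# D(r) = <[d(x+r) - d(x)]^2>, evaluated along one grid axis only.
# The synthetic field, pixel size, and PWV-to-delay scaling are placeholders.
import numpy as np

def structure_function(field, pixel_km, max_lag_px):
    lags_km, d_of_r = [], []
    for lag in range(1, max_lag_px + 1):
        diff = field[:, lag:] - field[:, :-lag]     # differences along x axis
        lags_km.append(lag * pixel_km)
        d_of_r.append(np.mean(diff ** 2))
    return np.array(lags_km), np.array(d_of_r)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pwv_cm = rng.normal(2.0, 0.3, size=(200, 200))   # synthetic PWV map (cm)
    # Rough mapping PWV -> zenith wet delay (~6.2x), then projection to a
    # 40-degree line of sight; both factors are assumptions for this sketch.
    delay_cm = 6.2 * pwv_cm / np.cos(np.radians(40.0))
    r, d = structure_function(delay_cm, pixel_km=1.0, max_lag_px=50)
    print(np.sqrt(d[-1]))   # RMS delay difference at 50 km separation, in cm
```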
Escudero‐Carretero, María J.; Prieto‐Rodríguez, MaÁngeles; Fernández‐Fernández, Isabel; March‐Cerdá, Joan Carles
2007-01-01
Abstract Aim To understand the expectations held by type 1 and 2 diabetes mellitus (DM 1 & 2) patients and their relatives regarding the health‐care provided to them. Design Qualitative. Focus groups. Setting and participants Andalusia. A theoretical sample that includes the most characteristic profiles. Thirty‐one subjects with DM. Segmentation characteristics: receiving health‐care for DM in Primary or Specialized care, living in urban and rural areas, men and women, age, varying diagnosis times, DM course and consequences. Subjects were recruited by health‐care professionals at reference care centres. Results Patients expect their health‐care professionals to be understanding, to treat them with kindness and respect, to have good communication skills, to provide information in a non‐authoritarian manner while fully acknowledging patients’ know‐how. Regarding the health‐care system, their expectations focus on the system’s ability to respond when required to do so, through a relevant professional, along with readily available equipment for treatment. The expectations of people affected by DM1 focus on leading a normal life and not having their educational, labour, social and family opportunities limited by the disease. Expectations in people with DM2 tend towards avoiding what they know has happened to other patients. Conclusions ‘Facilitating’, is a key word. Both the health‐care system and its professionals must pay keener attention to the emotional aspects of the disease and its process, adopting a comprehensive approach to care. It is vital that health‐care professionals take an active interest in the course of their patient’s disease, promoting accessibility and an atmosphere of trust and flexibility. PMID:17986070
Left behind: widening disparities for males and females in US county life expectancy, 1985–2010
2013-01-01
Background The United States spends more than any other country on health care. The poor relative performance of the US compared to other high-income countries has attracted attention and raised questions about the performance of the US health system. An important dimension to poor national performance is the large disparities in life expectancy. Methods We applied a mixed effects Poisson statistical model and Gaussian Process Regression to estimate age-specific mortality rates for US counties from 1985 to 2010. We generated uncertainty distributions for life expectancy at each age using standard simulation methods. Results Female life expectancy in the United States increased from 78.0 years in 1985 to 80.9 years in 2010, while male life expectancy increased from 71.0 years in 1985 to 76.3 years in 2010. The gap between female and male life expectancy in the United States was 7.0 years in 1985, narrowing to 4.6 years in 2010. For males at the county level, the highest life expectancy steadily increased from 75.5 in 1985 to 81.7 in 2010, while the lowest life expectancy remained under 65. For females at the county level, the highest life expectancy increased from 81.1 to 85.0, and the lowest life expectancy remained around 73. For male life expectancy at the county level, there have been three phases in the evolution of inequality: a period of rising inequality from 1985 to 1993, a period of stable inequality from 1993 to 2002, and rising inequality from 2002 to 2010. For females, in contrast, inequality has steadily increased during the 25-year period. Compared to only 154 counties where male life expectancy remained stagnant or declined, 1,405 out of 3,143 counties (45%) have seen no significant change or a significant decline in female life expectancy from 1985 to 2010. In all time periods, the lowest county-level life expectancies are seen in the South, the Mississippi basin, West Virginia, Kentucky, and selected counties with large Native American populations. Conclusions The reduction in the number of counties where female life expectancy at birth is declining in the most recent period is welcome news. However, the widening disparities between counties and the slow rate of increase compared to other countries should be viewed as a call for action. An increased focus on factors affecting health outcomes, morbidity, and mortality such as socioeconomic factors, difficulty of access to and poor quality of health care, and behavioral, environmental, and metabolic risk factors is urgently required. PMID:23842281
Code of Federal Regulations, 2011 CFR
2011-07-01
... facilities that could reasonably be expected to cause substantial harm to the environment. 154.1040 Section... to the environment. (a) The owner or operator of a facility that, under § 154.1015, could reasonably be expected to cause substantial harm to the environment, shall submit a response plan that meets the...
ERIC Educational Resources Information Center
Winstone, Naomi; Bretton, Hannah
2013-01-01
In negotiating the transition to Higher Education, students bring core expectations from their A-level study that are likely to be different to the lived reality of university study. Bridging the transition to university requires an in-depth understanding of the differences between the imagined and the reality; the expectations and the experience.…
Meeting the Institute of Medicine’s 2030 US Life Expectancy Target
Kindig, David; Nobles, Jenna; Zidan, Moheb
2018-01-01
Objectives To quantify the improvement in US life expectancy required to reach parity with high-resource nations by 2030, to document historical precedent of this rate, and to discuss the plausibility of achieving this rate in the United States. Methods We performed a demographic analysis of secondary data in 5-year periods from 1985 to 2015. Results To achieve the United Nations projected mortality estimates for Western Europe in 2030, the US life expectancy must grow at 0.32% a year between 2016 and 2030. This rate has precedent, even in low-mortality populations. Over 204 country-periods examined, nearly half exhibited life-expectancy growth greater than 0.32%. Of the 51 US states observed, 8.2% of state-periods demonstrated life-expectancy growth that exceeded the 0.32% target. Conclusions Achieving necessary growth in life expectancy over the next 15 years despite historical precedent will be challenging. Much all-cause mortality is structured decades earlier and, at present, older-age mortality reductions in the United States are decelerating. Addressing mortality decline at all ages will require enhanced political will and a strong commitment to equity improvement in the US population. PMID:29161064
A Framework for Automating Cost Estimates in Assembly Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calton, T.L.; Peters, R.R.
1998-12-09
When a product concept emerges, the manufacturing engineer is asked to sketch out a production strategy and estimate its cost. The engineer is given an initial product design, along with a schedule of expected production volumes. The engineer then determines the best approach to manufacturing the product, comparing a variety of alternative production strategies. The engineer must consider capital cost, operating cost, lead-time, and other issues in an attempt to maximize profits. After making these basic choices and sketching the design of overall production, the engineer produces estimates of the required capital, operating costs, and production capacity. This process may iterate as the product design is refined in order to improve its performance or manufacturability. The focus of this paper is on the development of computer tools to aid manufacturing engineers in their decision-making processes. This computer software tool provides a framework in which accurate cost estimates can be seamlessly derived from design requirements at the start of any engineering project. The result is faster cycle times through first-pass success; lower life cycle cost due to requirements-driven design and accurate cost estimates derived early in the process.
Murty, Vishnu P; Adcock, R Alison
2014-08-01
Learning how to obtain rewards requires learning about their contexts and likely causes. How do long-term memory mechanisms balance the need to represent potential determinants of reward outcomes with the computational burden of an over-inclusive memory? One solution would be to enhance memory for salient events that occur during reward anticipation, because all such events are potential determinants of reward. We tested whether reward motivation enhances encoding of salient events like expectancy violations. During functional magnetic resonance imaging, participants performed a reaction-time task in which goal-irrelevant expectancy violations were encountered during states of high- or low-reward motivation. Motivation amplified hippocampal activation to and declarative memory for expectancy violations. Connectivity of the ventral tegmental area (VTA) with medial prefrontal, ventrolateral prefrontal, and visual cortices preceded and predicted this increase in hippocampal sensitivity. These findings elucidate a novel mechanism whereby reward motivation can enhance hippocampus-dependent memory: anticipatory VTA-cortical-hippocampal interactions. Further, the findings integrate literatures on dopaminergic neuromodulation of prefrontal function and hippocampus-dependent memory. We conclude that during reward motivation, VTA modulation induces distributed neural changes that amplify hippocampal signals and records of expectancy violations to improve predictions-a potentially unique contribution of the hippocampus to reward learning. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Kyaddondo, David; Mugerwa, Kidza; Byamugisha, Josaphat; Oladapo, Olufemi T; Bohren, Meghan A
2017-12-01
To describe the experiences, expectations, and needs of urban Ugandan women in relation to good-quality facility childbirth. Women who had given birth in the 12 months prior to the study were purposively sampled and interviewed, or included in focus groups. Thematic analysis was used, and the data were interpreted within the context of an existing quality of care framework. Forty-five in-depth interviews and six focus group discussions were conducted. Respect and dignity, timely communication, competent skilled staff, and availability of medical supplies were central to women's accounts of quality care, or a lack of it. The hope for a live baby motivated women to seek facility-based childbirth. They expected to encounter competent, respectful, and caring staff with appropriate skills. In some cases, they could only fulfill these expectations through additional personal financial payments to staff, for clinical supplies, or to guarantee that they would be attended by someone with suitable skills. Long-term improvement in quality of maternity care in Uganda requires enhancement of the interaction between women and health staff in facilities, and investment in staff and resources to ensure that safe, respectful care is not dependent on willingness and/or capacity to pay. © 2017 International Federation of Gynecology and Obstetrics. The World Health Organization retains copyright and all other rights in the manuscript of this article as submitted for publication.
Verification of a Remaining Flying Time Prediction System for Small Electric Aircraft
NASA Technical Reports Server (NTRS)
Hogge, Edward F.; Bole, Brian M.; Vazquez, Sixto L.; Celaya, Jose R.; Strom, Thomas H.; Hill, Boyd L.; Smalling, Kyle M.; Quach, Cuong C.
2015-01-01
This paper addresses the problem of building trust in online predictions of a battery powered aircraft's remaining available flying time. A set of ground tests is described that make use of a small unmanned aerial vehicle to verify the performance of remaining flying time predictions. The algorithm verification procedure described here uses a fully functional vehicle that is restrained to a platform for repeated run-to-functional-failure experiments. The vehicle under test is commanded to follow a predefined propeller RPM profile in order to create battery demand profiles similar to those expected in flight. The fully integrated aircraft is repeatedly operated until the charge stored in powertrain batteries falls below a specified lower-limit. The time at which the lower-limit on battery charge is crossed is then used to measure the accuracy of remaining flying time predictions. Accuracy requirements are considered in this paper for an alarm that warns operators when remaining flying time is estimated to fall below a specified threshold.
ATLAS software configuration and build tool optimisation
NASA Astrophysics Data System (ADS)
Rybkin, Grigory; Atlas Collaboration
2014-06-01
The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT commands optimisation in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, increased the efficiency of multi-core computing resources utilisation, and considerably improved software developer and user experience.
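The package-level build parallelism described above amounts to scheduling independent packages concurrently while respecting the dependency graph. The sketch below illustrates that scheduling idea only; it is not CMT's actual implementation, and the package names and the build_package stand-in are hypothetical.

```python
import time
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

# Hypothetical dependency graph: package -> set of packages it depends on.
DEPS = {
    "Core": set(),
    "Event": {"Core"},
    "Tracking": {"Core"},
    "Reco": {"Event", "Tracking"},
}

def build_package(pkg):
    # Stand-in for running the real build command inside the package directory.
    time.sleep(0.1)
    print(f"built {pkg}")
    return pkg

def parallel_build(deps, workers=4):
    remaining = {p: set(d) for p, d in deps.items()}
    done, running = set(), {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while len(done) < len(deps):
            # Launch every package whose dependencies have all been built.
            for pkg, d in remaining.items():
                if pkg not in done and pkg not in running and d <= done:
                    running[pkg] = pool.submit(build_package, pkg)
            finished, _ = wait(running.values(), return_when=FIRST_COMPLETED)
            for future in finished:
                done.add(future.result())
            running = {p: f for p, f in running.items() if p not in done}

parallel_build(DEPS)
```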
Transport of environmental tracers through a karst system with a thick unsaturated zone
NASA Astrophysics Data System (ADS)
Geyer, Tobias; Sültenfuss, Jürgen; Eichinger, Florian; Sauter, Martin
2010-05-01
The transport of the environmental tracers tritium (3H), krypton-85 (85Kr) and helium (3He) in a karst system is investigated. Differences between mean tracer ages determined in spring water are explained by slow percolation of water through the thick unsaturated zone reflecting the importance of slow and diffuse unsaturated flow processes in these systems. Mean tracer ages on the Gallusquelle spring (Swabian Alb) were determined with lumped parameter modeling and decrease in the following order: 3H >> 85Kr > 3He. Since 3H is part of the water molecule it enters a karst system via precipitation, i.e. the mean 3H age is a measure of water flow through the whole karst system, including the unsaturated and saturated zone. The mean 85Kr age and 3H/3He age are measures of time since groundwater recharge arrived at the water table. Therefore our results indicate a long travel time of 3H through the unsaturated zone of the karst system. The interpretation is supported by a two-dimensional numerical simulation of flow and transport in a fissured matrix block that contains a thick unsaturated zone (ca. 100 m) and is drained by a conduit. Transport simulation is performed in the sense of backtracking, i.e. the flow field is reversed, and the boundary conditions are adapted accordingly. At any position in the model domain, the time required for a water molecule to reach the outlet is estimated corresponding to the "life expectancy" (Cornaton and Perrochet 2006), i.e. the life expectancy on the outlet is zero. The simulation of life expectancy of water in the matrix block shows (1) the importance of heterogeneities for interpretation of groundwater ages, (2) the location of stagnant zones in areas of low hydraulic permeability and/or low hydraulic gradient and (3) that flow through unsaturated fissured matrix blocks may cause a considerable travel time of water through a karst system. The travel time of water from the recharge area to the discharge point for the shown example is about 15 years with a travel time of water through the unsaturated zone of 10 years (Geyer 2008). This result reflects the variation of estimated ages for different tracers sampled at the Gallusquelle spring. Additionally, we demonstrate that depending on boundary conditions, the unsaturated zone of a karst system may provide a large water storage since the porous matrix can be expected to be close to saturation and the volume fraction of fissures and conduits is small. Literature Cornaton, F., Perrochet, P. (2006): Ground-water age, life expectancy and transit time distributions in advective-dispersive systems: 1. Generalized reservoir theory. - Advances in Water Resources 29 (9): 1267-1291. Geyer, T. (2008): Characterisation of flow and transport in karst aquifers at catchment scale, Ph.D. diss., Georg-August-Universität Göttingen, 103 pp.
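For context, the lumped-parameter approach mentioned above convolves the atmospheric input of a tracer with an assumed transit-time distribution; written with the exponential model as one common choice (shown here only as a generic illustration of the method, not as the specific model fitted in this study):

\[ c_{\mathrm{out}}(t) = \int_0^{\infty} c_{\mathrm{in}}(t-\tau)\, g(\tau)\, e^{-\lambda \tau}\, d\tau, \qquad g(\tau) = \frac{1}{T_m}\, e^{-\tau/T_m}, \]

where \(c_{\mathrm{in}}\) and \(c_{\mathrm{out}}\) are the tracer concentrations in recharge and at the spring, \(\lambda\) is the radioactive decay constant (relevant for 3H and 85Kr), \(g(\tau)\) is the transit-time distribution, and the mean tracer age \(T_m\) is the parameter fitted to the observations.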
Requirements based system level risk modeling
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Cornford, Steven; Feather, Martin
2004-01-01
The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements.
Development of an Ultra-Low Background Liquid Scintillation Counter for Trace Level Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erchinger, Jennifer L.; Orrell, John L.; Aalseth, Craig E.
2015-09-01
Low-level liquid scintillation counting (LSC) has been established as one of the radiation detection techniques useful in elucidating environmental processes and environmental monitoring around nuclear facilities. The Ultra-Low Background Liquid Scintillation Counter (ULB-LSC) under construction in the Shallow Underground Laboratory at Pacific Northwest National Laboratory aims to further reduce the MDAs and/or required sample processing. Through layers of passive shielding in conjunction with an active veto and 30 meters water equivalent overburden, the background reduction is expected to be 10 to 100 times below typical analytic low-background liquid scintillation systems. Simulations have shown an expected background of around 14 counts per day. A novel approach to the light collection will use a coated hollow light guide cut into the inner copper shielding. Demonstration LSC measurements will show low-energy detection, spectral deconvolution, and alpha/beta discrimination capabilities, from trials with standards of tritium, strontium-90, and actinium-227, respectively. An overview of the system design and expected demonstration measurements will emphasize the potential applications of the ULB-LSC in environmental monitoring for treaty verification, reach-back sample analysis, and facility inspections.
Weiner, Michael; Schadow, Gunther; Lindbergh, Donald; Warvel, Jill; Abernathy, Greg; Perkins, Susan M.; Dexter, Paul R.; McDonald, Clement J.
2002-01-01
We expect the use of real-time, interactive video conferencing to grow, due to more affordable technology and new health policies. Building and implementing portable systems to enable conferencing between physicians and patients requires durable equipment, committed staff, reliable service, and adequate protection and capture of data. We are studying the use of Internet-based conferencing between on-call physicians and patients residing in a nursing facility. We describe the challenges we experienced in constructing the study. Initiating and orchestrating unscheduled conferences needs to be easy, and requirements for training staff in using equipment should be minimal. Studies of health outcomes should include identification of medical conditions most amenable to benefit from conferencing, and outcomes should include positive as well as negative effects. PMID:12463950
NASA Technical Reports Server (NTRS)
Schroeder, Lyle C.; Bailey, M. C.; Harrington, Richard F.; Kendall, Bruce M.; Campbell, Thomas G.
1994-01-01
High-spatial-resolution microwave radiometer sensing from space with reasonable swath widths and revisit times favors large aperture systems. However, with traditional precision antenna design, the size and weight requirements for such systems are in conflict with the need to emphasize small launch vehicles. This paper describes tradeoffs between the science requirements, basic operational parameters, and expected sensor performance for selected satellite radiometer concepts utilizing novel lightweight compactly packaged real apertures. Antenna, feed, and radiometer subsystem design and calibration are presented. Preliminary results show that novel lightweight real apertures coupled with state-of-the-art radiometer designs are compatible with small launch systems, and hold promise for high-resolution earth science measurements of sea ice, precipitation, soil moisture, sea surface temperature, and ocean wind speeds.
Approaches, field considerations and problems associated with radio tracking carnivores
Sargeant, A.B.; Amlaner, C. J.; MacDonald, D.W.
1979-01-01
The adaptation of radio tracking to ecological studies was a major technological advance affecting field investigations of animal movements and behavior. Carnivores have been the recipients of much attention with this new technology and study approaches have varied from simple to complex. Equipment performance has much improved over the years, but users still face many difficulties. The beginning of all radio tracking studies should be a precise definition of objectives. Study objectives dictate type of gear required and field procedures. Field conditions affect equipment performance and investigator ability to gather data. Radio tracking carnivores is demanding and generally requires greater time than anticipated. Problems should be expected and planned for in study design. Radio tracking can be an asset in carnivore studies but caution is needed in its application.
Model Checking Abstract PLEXIL Programs with SMART
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.
2007-01-01
We describe a method to automatically generate discrete-state models of abstract Plan Execution Interchange Language (PLEXIL) programs that can be analyzed using model checking tools. Starting from a high-level description of a PLEXIL program or a family of programs with common characteristics, the generator lays the framework that models the principles of program execution. The concrete parts of the program are not automatically generated, but require the modeler to introduce them by hand. As a case study, we generate models to verify properties of the PLEXIL macro constructs that are introduced as shorthand notation. After an exhaustive analysis, we conclude that the macro definitions obey the intended semantics and behave as expected, but contingently on a few specific requirements on the timing semantics of micro-steps in the concrete executive implementation.
Solvent Hold Tank Sample Results for MCU-16-1247-1248-1249: August 2016 Monthly Sample
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fondeur, F. F.; Jones, D. H.
Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-1247-1248-1249), pulled on 08/22/2016 for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-1247-1248-1249 indicated the Isopar™L concentration is above its nominal level (101%). The extractant (MaxCalix) and the modifier (CS-7SB) are 7% and 9% below their nominal concentrations. The suppressor (TiDG) is 63% below its nominal concentration. This analysis confirms the solvent may require the addition of TiDG, and possibly of modifier and MaxCalix, to restore them to nominal levels. Based on the current monthly sample, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended. At the time of writing this report, a solvent trim batch containing TiDG, modifier and MaxCalix had been added to the SHT (October 2016), and the concentrations of these components are expected to be at their nominal values.
Time-of-flights and traps: from the Histone Code to Mars.
Cotter, Robert J; Swatkoski, Stephen; Becker, Luann; Evans-Nguyen, Theresa
2010-01-01
Two very different analytical instruments are featured in this perspective paper on mass spectrometer design and development. The first instrument, based upon the curved-field reflectron developed in the Johns Hopkins Middle Atlantic Mass Spectrometry Laboratory, is a tandem time-of-flight mass spectrometer whose performance and practicality are illustrated by applications to a series of research projects addressing the acetylation, deacetylation and ADP-ribosylation of histone proteins. The chemical derivatization of lysine-rich, hyperacetylated histones as their deuteroacetylated analogs enables one to obtain an accurate quantitative assessment of the extent of acetylation at each site. Chemical acetylation of histone mixtures is also used to determine the lysine targets of sirtuins, an important class of histone deacetylases (HDACs), by replacing the deacetylated residues with biotin. Histone deacetylation by sirtuins requires the co-factor NAD+, as does the attachment of ADP-ribose. The second instrument, a low voltage and low power ion trap mass spectrometer known as the Mars Organic Mass Analyzer (MOMA), is a prototype for an instrument expected to be launched in 2018. Like the tandem mass spectrometer, it is also expected to have applicability to environmental and biological analyses and, ultimately, to clinical care.
Dynamic situation assessment and prediction (DSAP)
NASA Astrophysics Data System (ADS)
Sisti, Alex F.
2003-09-01
The face of war has changed. We no longer have the luxury of planning campaigns against a known enemy operating under a well-understood doctrine, using conventional weapons and rules of engagement, all in a well-charted region. Instead, today's Air Force faces new, unforeseen enemies, asymmetric combat situations and unconventional warfare (Chem/Bio, co-location of military assets near civilian facilities, etc.). At the same time, the emergence of new Air Force doctrinal notions (e.g., Global Strike Task Force, Effects-Based Operations, the desire to minimize or eliminate any collateral damage, etc.), while propounding the benefits that can be expected with the adoption of such concepts, also imposes many new technical and operational challenges. Furthermore, future mission/battle commanders will need to assimilate a tremendous glut of available information, and still be expected to make quick-response decisions, and to quantify the effects of those decisions, all in the face of uncertainty. All these factors translate to the need for dramatic improvements in the way we plan, rehearse, execute and dynamically assess the status of military campaigns. This paper addresses these crucial and revolutionary requirements through the pursuit of a new simulation paradigm that allows a user to perform real-time dynamic situation assessment and prediction.
Regional action plan handling of social welfare problem in nganjuk regency
NASA Astrophysics Data System (ADS)
Zain, IM; Utami, WS; Setyawan, KG
2018-01-01
Local action plans are expected to ensure social protection for vulnerable and disadvantaged groups (PMKS). The research method combined a primary survey with a secondary survey. The condition of people who still belong to PMKS requires the state to reach out to the community to solve the problems they face. Stakeholders should be involved in handling PMKS. The activities presented should also receive periodic monitoring and evaluation so that progress can be reported at any time. Implementable poverty reduction strategies and policies include social protection, opportunity expansion, resource capacity building, community empowerment and partnership strategies. The PMKS workflow covers the validation and updating of data, the fulfillment of the basic needs of PMKS families, the development of PMKS human resources, the improvement of the quality of life of poor families, the institutions of poverty alleviation stakeholders, and the unemployed at the grassroots level. The Regional Action Plan (RAP) is prepared as a reference for carrying out PMKS mitigation and is expected to serve as a guide for managers and program implementers in relevant agencies, to be applied jointly and continuously over the specified period.
A framework for implementing data services in multi-service mobile satellite systems
NASA Technical Reports Server (NTRS)
Ali, Mohammed O.; Leung, Victor C. M.; Spolsky, Andrew I.
1988-01-01
Mobile satellite systems being planned for introduction in the early 1990s are expected to be invariably of the multi-service type. Mobile Telephone Service (MTS), Mobile Radio Service (MRS), and Mobile Data Service (MDS) are the major classifications used to categorize the many user applications to be supported. The MTS and MRS services encompass circuit-switched voice communication applications, and may be efficiently implemented using a centralized Demand-Assigned Multiple Access (DAMA) scheme. Applications under the MDS category are, on the other hand, message-oriented and expected to vary widely in characteristics; from simplex mode short messaging applications to long duration, full-duplex interactive data communication and large file transfer applications. For some applications under this service category, the conventional circuit-based DAMA scheme may prove highly inefficient due to the long time required to set up and establish communication links relative to the actual message transmission time. It is proposed that by defining a set of basic bearer services to be supported in MDS and optimizing their transmission and access schemes independent of the MTS and MRS services, the MDS applications can be more efficiently integrated into the multi-service design of mobile satellite systems.
NASA Astrophysics Data System (ADS)
Pennington, D. N.; Nelson, E.; Polasky, S.; Plantinga, A.; Lewis, D.; Whithey, J.; Radeloff, V.; Lawler, J.; White, D.; Martinuzzi, S.; Helmers, D.; Lonsdorf, E.
2011-12-01
Land-use change significantly contributes to biodiversity loss, changes ecosystem processes, and ultimately causes the loss of ecosystem services. Planning for a sustainable future requires a thorough understanding of expected future land use at both the fine spatial scale relevant for many ecological processes and the larger regional levels relevant for large-scale policy making. We use an econometric model to predict business-as-usual land-use change across the continental US with 100-m resolution in 5-year time steps from 2001 to 2051. We then simulate the effect of various national-level tax, subsidy, and zoning policies on expected land-use change over this time frame. Further, we model the impact of projected land-use change under business as usual and the various policy scenarios on carbon sequestration and biodiversity conservation in the conterminous United States. Our results showed that overall, land use composition will remain fairly stable, but there are considerable regional changes. Differences among policy scenarios were relatively minor, highlighting that the underlying economic drivers of land use patterns are strong, and even fairly drastic policies may not be able to change these.
Indirect measurement of three-photon correlation in nonclassical light sources
NASA Astrophysics Data System (ADS)
Ann, Byoung-moo; Song, Younghoon; Kim, Junki; Yang, Daeho; An, Kyungwon
2016-06-01
We observe the three-photon correlation in nonclassical light sources by using an indirect measurement scheme based on the dead-time effect of photon-counting detectors. We first develop a general theory which enables us to extract the three-photon correlation from the two-photon correlation of an arbitrary light source measured with detectors with finite dead times. We then confirm the validity of our measurement scheme in experiments done with a cavity-QED microlaser operating with a large intracavity mean photon number exhibiting both sub- and super-Poissonian photon statistics. The experimental results are in good agreement with the theoretical expectation. Our measurement scheme provides an alternative approach for N-photon correlation measurement employing (N - 1) detectors and thus a reduced measurement time for a given signal-to-noise ratio, compared to the usual scheme requiring N detectors.
Time takes space: selective effects of multitasking on concurrent spatial processing.
Mäntylä, Timo; Coni, Valentina; Kubik, Veit; Todorov, Ivo; Del Missier, Fabio
2017-08-01
Many everyday activities require coordination and monitoring of complex relations of future goals and deadlines. Cognitive offloading may provide an efficient strategy for reducing control demands by representing future goals and deadlines as a pattern of spatial relations. We tested the hypothesis that multiple-task monitoring involves time-to-space transformational processes, and that these spatial effects are selective with greater demands on coordinate (metric) than categorical (nonmetric) spatial relation processing. Participants completed a multitasking session in which they monitored four series of deadlines, running on different time scales, while making concurrent coordinate or categorical spatial judgments. We expected and found that multitasking taxes concurrent coordinate, but not categorical, spatial processing. Furthermore, males showed a better multitasking performance than females. These findings provide novel experimental evidence for the hypothesis that efficient multitasking involves metric relational processing.
Spin-the-bottle Sort and Annealing Sort: Oblivious Sorting via Round-robin Random Comparisons
Goodrich, Michael T.
2013-01-01
We study sorting algorithms based on randomized round-robin comparisons. Specifically, we study Spin-the-bottle sort, where comparisons are unrestricted, and Annealing sort, where comparisons are restricted to a distance bounded by a temperature parameter. Both algorithms are simple, randomized, data-oblivious sorting algorithms, which are useful in privacy-preserving computations, but, as we show, Annealing sort is much more efficient. We show that there is an input permutation that causes Spin-the-bottle sort to require Ω(n² log n) expected time in order to succeed, and that in O(n² log n) time this algorithm succeeds with high probability for any input. We also show there is a specification of Annealing sort that runs in O(n log n) time and succeeds with very high probability. PMID:24550575
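A rough, simplified sketch of the round-robin idea follows (it is not the authors' exact specification of either algorithm): each round performs compare-exchanges between every position and a randomly chosen partner, and the annealing variant additionally bounds the partner distance by a shrinking temperature.

```python
import random

def compare_exchange(a, i, j):
    # The only data-dependent operation: order the pair (i, j) in place.
    if i > j:
        i, j = j, i
    if a[i] > a[j]:
        a[i], a[j] = a[j], a[i]

def spin_the_bottle_round(a):
    # Unrestricted round: each position is compared with one uniform random partner.
    n = len(a)
    for i in range(n):
        j = random.randrange(n)
        if i != j:
            compare_exchange(a, i, j)

def annealing_round(a, temperature):
    # Restricted round: partners are drawn from within the temperature bound.
    n = len(a)
    for i in range(n):
        offset = random.randint(1, max(1, temperature))
        j = min(n - 1, max(0, i + random.choice((-offset, offset))))
        if i != j:
            compare_exchange(a, i, j)

data = [5, 2, 9, 1, 7, 3, 8, 4, 6, 0]
temperature = len(data)
for _ in range(200):                # fixed schedule, so control flow is data-oblivious
    spin_the_bottle_round(data)     # or: annealing_round(data, temperature)
    temperature = max(1, temperature // 2)
print(data)                         # sorted with high probability, not with certainty
```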
Perspective: Toward a competency framework for faculty.
Milner, Robert J; Gusic, Maryellen E; Thorndyke, Luanne E
2011-10-01
Today, faculty in academic medicine face challenges in all three mission areas--research, education, and patient care--and require a broad set of competencies to survive in this changing environment. To support faculty and to design assessments that match new expectations, the authors argue that it is essential to capture the full scope of skills, knowledge, and behaviors necessary for a successful faculty member. Thus, it is timely to explore and define competencies for faculty in academic medicine. The authors describe three approaches to identifying faculty competencies. Each reveals diverse but overlapping sets of competency domains, reflecting the breadth of activities expected of today's faculty. To organize these competencies into a coherent framework, the authors propose a model based on a typology of competency. A key feature of the model is the division between occupational competencies, which are largely role-specific, and personal competencies, which are necessary for all faculty. A competency framework also must be developmental, to reflect the growth in skills, knowledge, and behaviors from trainee to expert and to allow for an individual's changing roles over a career. Such a competency framework will inform professional development activities and require assessment of competence. The generation of competencies also will reveal areas of faculty practice that are poorly measured, requiring new tools to be incorporated into existing processes of faculty evaluation. The authors provide general principles to guide the identification of a competency framework for faculty and invite the academic medicine community to engage in further discussion.
NASA Astrophysics Data System (ADS)
Ditsche, Petra; Hicks, Madeline; Truong, Lisa; Linkem, Christina; Summers, Adam
2017-04-01
The Northern clingfish is a small, Eastern North Pacific fish that can attach to rough, fouled rocks in the intertidal. Their ability to attach to surfaces has been measured previously in the laboratory, and in this study, we show the roughness and fouling of the natural habitat of these fish. We introduce a new method for measuring surface roughness of natural substrates with time-limited accessibility. We expect this method to be broadly applicable in studies of animal/substrate surface interactions in habitats difficult to characterize. Our roughness measurements demonstrate that the fish's ability to attach to very coarse roughness is required in its natural environment. Some of the rocks showed even coarser roughness than the fish could attach to in the lab setting. We also characterized the clingfish's preference for other habitat descriptors such as the size of the rocks, biofilm, and Aufwuchs (macroalgae, encrusting invertebrates) cover, as well as grain size of underlying substrate. Northern clingfish seek shelter under rocks of 15-45 cm in size. These rocks have variable Aufwuchs cover, and gravel is the main underlying substrate type. In the intertidal, environmental conditions change with the tides, and for clingfish, the daily time under water (DTUW%) was a key parameter explaining distribution. Rather than location being determined by intertidal zonation, an 80% DTUW, a finer scale concept of tidal inundation, was required by the fish. We expect that this is likely because the mobility of the fish allows them to more closely track the ideal inundation in the marine intertidal.
Synthesis of Poly(Propylene Fumarate)
Kasper, F. Kurtis; Tanahashi, Kazuhiro; Fisher, John P.; Mikos, Antonios G.
2010-01-01
This protocol describes the synthesis of 500 – 4000 Da poly(propylene fumarate) by a two-step reaction of diethyl fumarate and propylene glycol through a bis(hydroxypropyl) fumarate diester intermediate. Purified PPF can be covalently crosslinked to form degradable polymer networks, which have been widely explored for biomedical applications. The properties of crosslinked PPF networks depend upon the molecular properties of the constituent polymer, such as the molecular weight. The purity of the reactants and the exclusion of water from the reaction system are of utmost importance in the generation of high-molecular-weight PPF products. Additionally, the reaction time and temperature influence the molecular weight of the PPF product. The expected time required to complete this protocol is 3 d. PMID:19325548
Workstation-Based Simulation for Rapid Prototyping and Piloted Evaluation of Control System Designs
NASA Technical Reports Server (NTRS)
Mansur, M. Hossein; Colbourne, Jason D.; Chang, Yu-Kuang; Aiken, Edwin W. (Technical Monitor)
1998-01-01
The development and optimization of flight control systems for modern fixed- and rotary-wing aircraft consume a significant portion of the overall time and cost of aircraft development. Substantial savings can be achieved if the time and cost required to develop and flight test the control system are reduced. To bring about such reductions, software tools such as Matlab/Simulink are being used to readily implement block diagrams and rapidly evaluate the expected responses of the completed system. Moreover, tools such as CONDUIT (CONtrol Designer's Unified InTerface) have been developed that enable the controls engineers to optimize their control laws and ensure that all the relevant quantitative criteria are satisfied, all within a fully interactive, user friendly, unified software environment.
A multilevel probabilistic beam search algorithm for the shortest common supersequence problem.
Gallardo, José E
2012-01-01
The shortest common supersequence problem is a classical problem with many applications in different fields such as planning, Artificial Intelligence and especially Bioinformatics. Due to its NP-hardness, we cannot expect to efficiently solve this problem using conventional exact techniques. This paper presents a heuristic to tackle this problem based on the use, at different levels, of a probabilistic variant of a classical heuristic known as Beam Search. The proposed algorithm is empirically analysed and compared to current approaches in the literature. Experiments show that it provides better quality solutions in a reasonable time for medium and large instances of the problem. For very large instances, our heuristic also provides better solutions, but required execution times may increase considerably.
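For illustration, a single-level probabilistic beam search for the problem can be sketched as follows; this is a simplified stand-in for the multilevel algorithm described above, with the state representation and selection weights chosen only for readability.

```python
import random

def probabilistic_beam_scs(strings, beam_width=10, seed=0):
    """Builds a common supersequence with a probabilistic beam search.
    A simplified single-level sketch, not the paper's multilevel algorithm."""
    rng = random.Random(seed)
    # A state is (positions, built): positions[i] characters of strings[i]
    # are already covered by the partial supersequence `built`.
    beam = [((0,) * len(strings), "")]
    while True:
        finished = [s for s in beam
                    if all(p == len(w) for p, w in zip(s[0], strings))]
        if finished:
            return min(finished, key=lambda s: len(s[1]))[1]
        candidates = {}
        for positions, built in beam:
            # Any next uncovered character of some string is a legal extension.
            for ch in {w[p] for p, w in zip(positions, strings) if p < len(w)}:
                new_pos = tuple(p + 1 if p < len(w) and w[p] == ch else p
                                for p, w in zip(positions, strings))
                candidates[new_pos] = built + ch
        pool = list(candidates.items())
        # Fewer remaining characters to cover means a more promising state.
        remaining = lambda item: sum(len(w) - p for p, w in zip(item[0], strings))
        if len(pool) > beam_width:
            # Probabilistic selection: better states are more likely to survive.
            weights = [1.0 / (1 + remaining(item)) for item in pool]
            pool = rng.choices(pool, weights=weights, k=beam_width)
        beam = pool

print(probabilistic_beam_scs(["AGGT", "GTTAC", "AGTC"]))
```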
Future exploration of Venus (post-Pioneer Venus 1978)
NASA Technical Reports Server (NTRS)
Colin, L.; Evans, L. C.; Greeley, R.; Quaide, W. L.; Schaupp, R. W.; Seiff, A.; Young, R. E.
1976-01-01
A comprehensive study was performed to determine the major scientific unknowns about the planet Venus to be expected in the post-Pioneer Venus 1978 time frame. Based on those results, the desirability of future orbiters, atmospheric entry probes, balloons, and landers as vehicles to address the remaining scientific questions was studied. The recommended mission scenario includes a high-resolution surface-mapping radar orbiter mission for the 1981 launch opportunity, a multiple-lander mission for 1985 and either an atmospheric entry probe or balloon mission in 1988. All the proposed missions can be performed using proposed space shuttle upper stage boosters. Significant amounts of long-lead-time supporting research and technology development are required to be initiated in the near future to permit the recommended launch dates.
Computational problems and signal processing in SETI
NASA Technical Reports Server (NTRS)
Deans, Stanley R.; Cullers, D. K.; Stauduhar, Richard
1991-01-01
The Search for Extraterrestrial Intelligence (SETI), currently being planned at NASA, will require that an enormous amount of data (on the order of 10^11 distinct signal paths for a typical observation) be analyzed in real time by special-purpose hardware. Even though the SETI system design is not based on maximum entropy and Bayesian methods (partly due to the real-time processing constraint), it is expected that enough data will be saved to be able to apply these and other methods offline, where computational complexity is not an overriding issue. Interesting computational problems that relate directly to the system design for processing such an enormous amount of data have emerged. Some of these problems are discussed, along with the current status of their solution.
Surveillance Range and Interference Impacts on Self-Separation Performance
NASA Technical Reports Server (NTRS)
Idris, Husni; Consiglio, Maria C.; Wing, David J.
2011-01-01
Self-separation is a concept of flight operations that aims to provide user benefits and increase airspace capacity by transferring traffic separation responsibility from ground-based controllers to the flight crew. Self-separation is enabled by cooperative airborne surveillance, such as that provided by the Automatic Dependent Surveillance-Broadcast (ADS-B) system and airborne separation assistance technologies. This paper describes an assessment of the impact of ADS-B system performance on the performance of self-separation as a step towards establishing far-term ADS-B performance requirements. Specifically, the impacts of ADS-B surveillance range and interference limitations were analyzed under different traffic density levels. The analysis was performed using a batch simulation of aircraft performing self-separation assisted by NASA's Autonomous Operations Planner prototype flight-deck tool, in two-dimensional airspace. An aircraft detected conflicts within a look-ahead time of ten minutes and resolved them using strategic closed trajectories or tactical open maneuvers if the time to loss of separation was below a threshold. While a complex interaction was observed between the impacts of surveillance range and interference, as both factors are physically coupled, self-separation performance followed expected trends. An increase in surveillance range resulted in a decrease in the number of conflict detections, an increase in the average conflict detection lead time, and an increase in the percentage of conflict resolutions that were strategic. The majority of the benefit was observed when surveillance range was increased to a value corresponding to the conflict detection look-ahead time. The benefits were attenuated at higher interference levels. An increase in traffic density resulted in a significant increase in the number of conflict detections, as expected, but had no effect on the conflict detection lead time and the percentage of conflict resolutions that were strategic. With surveillance range corresponding to ADS-B minimum operational performance standards for Class A3 equipment and without background interference, a significant portion of conflict resolutions, 97 percent, were achieved in the preferred strategic mode. The majority of conflict resolutions, 71 percent, were strategic even with very high interference (over three times that expected in 2035).
Clustering and correlates of screen-time and eating behaviours among young adolescents.
Pearson, Natalie; Griffiths, Paula; Biddle, Stuart Jh; Johnston, Julie P; McGeorge, Sonia; Haycraft, Emma
2017-05-31
Screen-time and eating behaviours are associated in adolescents, but few studies have examined the clustering of these health behaviours in this age group. The identification of clustered health behaviours, and influences on adolescents' clustered health behaviours, at the time when they are most likely to become habitual, is important for intervention design. The purpose of this study was to assess the prevalence and clustering of health behaviours in adolescents, and examine the sociodemographic, individual, behavioural, and home social and physical environmental correlates of clustered health behaviours. Adolescents aged 11-12 years (n = 527, 48% boys) completed a questionnaire during class-time which assessed screen-time (ST), fruit and vegetable (FV), and energy-dense (ED) snack consumption using a Food Frequency Questionnaire. Health behaviours were categorised into high and low frequencies based on recommendations for FV and ST and median splits for ED snacks. Adolescents reported on their habits, self-efficacy, eating at the television (TV), eating and watching TV together with parents, restrictive parenting practices, and the availability and accessibility of foods within the home. Behavioural clustering was assessed using an observed over expected ratio (O/E). Correlates of clustered behaviours were examined using multivariate multinomial logistic regression. Approximately 70% reported having two or three health risk behaviours. Overall, O/E ratios were close to 1, which indicates clustering. The three-risk-behaviour combination of low FV, high ED, and high ST occurred more frequently than expected (O/E ratio = 1.06; 95% CI 1.01, 1.15). Individual, behavioural, and social and physical home environmental correlates were differentially associated with behavioural clusters. Correlates consistently associated with clusters included eating ED snacks while watching TV, eating at the TV with parents, and the availability and accessibility of ED snack foods within the home. There is a high prevalence of screen time and unhealthy eating, and screen time is coupled with unhealthy dietary behaviours. Strategies and policies are required that simultaneously address reductions in screen time and changes to habitual dietary patterns, such as TV snacking and snack availability and accessibility. These may require a combination of individual, social and environmental changes alongside conscious and more automatic (nudging) strategies.
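The observed-over-expected (O/E) clustering ratio used above compares how often a behaviour combination occurs with how often it would occur if the behaviours were independent. A minimal sketch of the calculation, using made-up 0/1 indicators rather than the study data:

```python
import numpy as np

# Hypothetical 0/1 indicators for 527 adolescents (not the study data):
# columns = [low fruit/veg, high energy-dense snacks, high screen-time].
rng = np.random.default_rng(0)
behaviours = rng.integers(0, 2, size=(527, 3))

# Observed prevalence of the specific three-behaviour combination (1, 1, 1).
observed = np.mean(np.all(behaviours == 1, axis=1))

# Expected prevalence if the three behaviours were independent:
# the product of the marginal prevalences.
expected = np.prod(behaviours.mean(axis=0))

print(f"O/E ratio = {observed / expected:.2f}")  # values above 1 indicate clustering
```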
Code of Federal Regulations, 2010 CFR
2010-10-01
... BUSINESS PROGRAMS Forecasts of Expected Contract Opportunities 2819.7001 General. Section 501 of Public Law 100-656, the Business Opportunity Development Reform Act of 1988, requires executive agencies having... expected contract opportunities, or classes of contract opportunities that small business concerns...
Quantum corrections in thermal states of fermions on anti-de Sitter space-time
NASA Astrophysics Data System (ADS)
Ambruş, Victor E.; Winstanley, Elizabeth
2017-12-01
We study the energy density and pressure of a relativistic thermal gas of massless fermions on four-dimensional Minkowski and anti-de Sitter space-times using relativistic kinetic theory. The corresponding quantum field theory quantities are given by components of the renormalized expectation value of the stress-energy tensor operator acting on a thermal state. On Minkowski space-time, the renormalized vacuum expectation value of the stress-energy tensor is by definition zero, while on anti-de Sitter space-time the vacuum contribution to this expectation value is in general nonzero. We compare the properties of the vacuum and thermal expectation values of the energy density and pressure for massless fermions and discuss the circumstances in which the thermal contribution dominates over the vacuum one.
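For orientation, the flat-space benchmark in this comparison is the textbook relativistic thermal gas of massless fermions. For a single Dirac species (g = 4 degrees of freedom) in natural units, kinetic theory gives (a standard result quoted here for context, not a number taken from the paper)

\[ \rho = \frac{7}{8}\,\frac{\pi^2}{30}\, g\, T^4 = \frac{7\pi^2}{60}\, T^4, \qquad P = \frac{\rho}{3}, \]

which is what the renormalized thermal expectation values reduce to on Minkowski space-time, whereas on anti-de Sitter space-time additional vacuum and curvature contributions appear.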
Zago, Myrka; Bosco, Gianfranco; Maffei, Vincenzo; Iosa, Marco; Ivanenko, Yuri P; Lacquaniti, Francesco
2004-04-01
Prevailing views on how we time the interception of a moving object assume that the visual inputs are informationally sufficient to estimate the time-to-contact from the object's kinematics. Here we present evidence in favor of a different view: the brain makes the best estimate about target motion based on measured kinematics and an a priori guess about the causes of motion. According to this theory, a predictive model is used to extrapolate time-to-contact from expected dynamics (kinetics). We projected a virtual target moving vertically downward on a wide screen with different randomized laws of motion. In the first series of experiments, subjects were asked to intercept this target by punching a real ball that fell hidden behind the screen and arrived in synchrony with the visual target. Subjects systematically timed their motor responses consistent with the assumption of gravity effects on an object's mass, even when the visual target did not accelerate. With training, the gravity model was not switched off but adapted to nonaccelerating targets by shifting the time of motor activation. In the second series of experiments, there was no real ball falling behind the screen. Instead, the subjects were required to intercept the visual target by clicking a mouse button. In this case, subjects timed their responses consistent with the assumption of uniform motion in the absence of forces, even when the target actually accelerated. Overall, the results are in accord with the theory that motor responses evoked by visual kinematics are modulated by a prior of the target dynamics. The prior appears surprisingly resistant to modifications based on performance errors.
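To make the competing internal models concrete, a constant-velocity (0g) extrapolation and a gravity (1g) prior give different times-to-contact for a target at distance d approaching with instantaneous speed v (a standard kinematic illustration, not equations quoted from the paper):

\[ t_{0g} = \frac{d}{v}, \qquad t_{1g} = \frac{-v + \sqrt{v^2 + 2gd}}{g}, \]

so a response timed with the 1g model anticipates contact earlier than the constant-velocity estimate whenever the target truly accelerates under gravity, which is consistent with the timing biases described above.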
Holbein, Christina E; Zebracki, Kathy; Bechtel, Colleen F; Lennon Papadakis, Jaclyn; Franks Bruno, Elizabeth; Holmbeck, Grayson N
2017-03-01
To assess changes over time in parents' expectations of adult milestone achievement (college attendance, full-time job attainment, independent living, marriage, parenthood) for young people with spina bifida, to examine how expectancies relate to actual milestone achievement, and to compare milestone achievement in emerging adults with spina bifida with that of peers with typical development. Sixty-eight families of children with spina bifida (mean age 8y 4mo, 37 males, 31 females) and 68 families of children with typical development (mean age 8y 6mo, 37 males, 31 females) participated at Time 1. At all subsequent timepoints, parents of young people with spina bifida were asked to rate their expectations of emerging adulthood milestone achievement. At Time 7, when participants were 22 to 23 years old, milestone achievement was assessed. Parents of young people with spina bifida lowered their expectations over time for most milestones; parents of children with higher cognitive ability reported decreases of lower magnitude. Parent expectancies were optimistic and unrelated to actual milestone achievement. Emerging adults with spina bifida were less likely than individuals with typical development to achieve all milestones. Optimistic parental expectations may be adaptive for children with spina bifida and their families, although it is important for families to set realistic goals. Healthcare providers serve a key role in helping families of young people with spina bifida prepare for emerging adulthood. © 2016 Mac Keith Press.
Linking melodic expectation to expressive performance timing and perceived musical tension.
Gingras, Bruno; Pearce, Marcus T; Goodchild, Meghan; Dean, Roger T; Wiggins, Geraint; McAdams, Stephen
2016-04-01
This research explored the relations between the predictability of musical structure, expressive timing in performance, and listeners' perceived musical tension. Studies analyzing the influence of expressive timing on listeners' affective responses have been constrained by the fact that, in most pieces, the notated durations limit performers' interpretive freedom. To circumvent this issue, we focused on the unmeasured prelude, a semi-improvisatory genre without notated durations. In Experiment 1, 12 professional harpsichordists recorded an unmeasured prelude on a harpsichord equipped with a MIDI console. Melodic expectation was assessed using a probabilistic model (IDyOM [Information Dynamics of Music]) whose expectations have been previously shown to match closely those of human listeners. Performance timing information was extracted from the MIDI data using a score-performance matching algorithm. Time-series analyses showed that, in a piece with unspecified note durations, the predictability of melodic structure measurably influenced tempo fluctuations in performance. In Experiment 2, another 10 harpsichordists, 20 nonharpsichordist musicians, and 20 nonmusicians listened to the recordings from Experiment 1 and rated the perceived tension continuously. Granger causality analyses were conducted to investigate predictive relations among melodic expectation, expressive timing, and perceived tension. Although melodic expectation, as modeled by IDyOM, modestly predicted perceived tension for all participant groups, neither of its components, information content or entropy, was Granger causal. In contrast, expressive timing was a strong predictor and was Granger causal. However, because melodic expectation was also predictive of expressive timing, our results outline a complete chain of influence from predictability of melodic structure via expressive performance timing to perceived musical tension. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
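A hedged sketch of the kind of Granger-causality check described above, using synthetic stand-ins for the expressive-timing and tension series (the variable names and lag choice are illustrative, not taken from the study); it assumes the statsmodels package is available:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)

# Synthetic stand-ins: expressive timing (e.g., inter-onset intervals) and
# perceived tension, with tension lagging timing by two samples plus noise.
timing = rng.normal(size=300)
tension = np.roll(timing, 2) + 0.5 * rng.normal(size=300)

# Column order matters: the test asks whether the 2nd column Granger-causes the 1st.
data = np.column_stack([tension, timing])
results = grangercausalitytests(data, maxlag=4)

for lag, (tests, _) in results.items():
    f_stat, p_value, _, _ = tests["ssr_ftest"]
    print(f"lag {lag}: F = {f_stat:.2f}, p = {p_value:.4f}")
```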
Structural Similitude and Scaling Laws
NASA Technical Reports Server (NTRS)
Simitses, George J.
1998-01-01
Aircraft and spacecraft comprise the class of aerospace structures that require efficiency and wisdom in design, sophistication and accuracy in analysis and numerous and careful experimental evaluations of components and prototype, in order to achieve the necessary system reliability, performance and safety. Preliminary and/or concept design entails the assemblage of system mission requirements, system expected performance and identification of components and their connections as well as of manufacturing and system assembly techniques. This is accomplished through experience based on previous similar designs, and through the possible use of models to simulate the entire system characteristics. Detail design is heavily dependent on information and concepts derived from the previous steps. This information identifies critical design areas which need sophisticated analyses, and design and redesign procedures to achieve the expected component performance. This step may require several independent analysis models, which, in many instances, require component testing. The last step in the design process, before going to production, is the verification of the design. This step necessitates the production of large components and prototypes in order to test component and system analytical predictions and verify strength and performance requirements under the worst loading conditions that the system is expected to encounter in service. Clearly then, full-scale testing is in many cases necessary and always very expensive. In the aircraft industry, in addition to full-scale tests, certification and safety necessitate large component static and dynamic testing. Such tests are extremely difficult, time consuming and definitely absolutely necessary. Clearly, one should not expect that prototype testing will be totally eliminated in the aircraft industry. It is hoped, though, that we can reduce full-scale testing to a minimum. Full-scale large component testing is necessary in other industries as well, Ship building, automobile and railway car construction all rely heavily on testing. Regardless of the application, a scaled-down (by a large factor) model (scale model) which closely represents the structural behavior of the full-scale system (prototype) can prove to be an extremely beneficial tool. This possible development must be based on the existence of certain structural parameters that control the behavior of a structural system when acted upon by static and/or dynamic loads. If such structural parameters exist, a scaled-down replica can be built, which will duplicate the response of the full-scale system. The two systems are then said to be structurally similar. The term, then, that best describes this similarity is structural similitude. Similarity of systems requires that the relevant system parameters be identical and these systems be governed by a unique set of characteristic equations. Thus, if a relation or equation of variables is written for a system, it is valid for all systems which are similar to it. Each variable in a model is proportional to the corresponding variable of the prototype. This ratio, which plays an essential role in predicting the relationship between the model and its prototype, is called the scale factor.
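As a simple illustration of the kind of scaling law involved (a textbook example given for context, not one taken from this report), consider a geometrically scaled model made of the prototype material, with length scale factor λ = L_m/L_p. For the bending vibration of a beam,

\[ \omega \propto \sqrt{\frac{EI}{\rho A L^{4}}} \quad\Rightarrow\quad \omega_m = \frac{\omega_p}{\lambda}, \]

so a 1/10-scale model vibrates at ten times the prototype frequency, and measured model responses must be mapped back through the scale factor before they can be compared with full-scale predictions.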
Trust and distrust between patient and doctor.
Hawley, Katherine
2015-10-01
To trust someone is to have expectations of their behaviour; distrust often involves disappointed expectations. But healthy trust and distrust require a good understanding of which expectations are reasonable, and which are not. In this paper, I discuss the limits of trustworthiness by drawing on recent studies of trust in the context of defensive medicine, biobanking and cardiopulmonary resuscitation decisions. © 2015 John Wiley & Sons, Ltd.
Constrained Fisher Scoring for a Mixture of Factor Analyzers
2016-09-01
…expectation-maximization algorithm with similar computational requirements. Lastly, we demonstrate the efficacy of the proposed method for learning a…
Sheehan, Michael T; Doi, Suhail A R
2016-03-01
Graves' disease is the most common cause of hyperthyroidism and is often managed with radioactive iodine (RAI) therapy. With current dosing schemes, the vast majority of patients develop permanent post-RAI hypothyroidism and are placed on life-long levothyroxine therapy. This hypothyroidism typically occurs within the first 3 to 6 months after RAI therapy is administered. Indeed, patients are typically told to expect life-long thyroid hormone replacement therapy to be required within this timeframe and many providers expect this post-RAI hypothyroidism to be complete and permanent. There is, however, a small subset of patients in whom a transient post-RAI hypothyroidism develops which, initially, presents exactly as the typical permanent hypothyroidism. In some cases the transient hypothyroidism leads to a period of euthyroidism of variable duration eventually progressing to permanent hypothyroidism. In others, persistent hyperthyroidism requires a second dose of RAI. Failure to appreciate and recognize the possibility of transient post-RAI hypothyroidism can delay optimal and appropriate treatment of the patient. We herein describe five cases of transient post-RAI hypothyroidism which highlight this unusual sequence of events. Increased awareness of this possible outcome after RAI for Graves' disease will help in the timely management of patients. © 2016 Marshfield Clinic.
Wang, Tingting; Chen, Yi-Ping Phoebe; Bowman, Phil J; Goddard, Michael E; Hayes, Ben J
2016-09-21
Bayesian mixture models in which the effects of SNPs are assumed to come from normal distributions with different variances are attractive for simultaneous genomic prediction and QTL mapping. These models are usually implemented with Markov chain Monte Carlo (MCMC) sampling, which requires long compute times with large genomic data sets. Here, we present an efficient approach (termed HyB_BR), which is a hybrid of an Expectation-Maximisation algorithm followed by a limited number of MCMC iterations, without the requirement for burn-in. To test prediction accuracy from HyB_BR, dairy cattle and human disease trait data were used. In the dairy cattle data, there were four quantitative traits (milk volume, protein kg, fat% in milk and fertility) measured in 16,214 cattle from two breeds genotyped for 632,002 SNPs. Validation of genomic predictions was in a subset of cattle either from the reference set or in animals from a third breed that was not in the reference set. In all cases, HyB_BR gave almost identical accuracies to Bayesian mixture models implemented with full MCMC, while computational time was reduced to as little as 1/17 of that required by full MCMC. The SNPs with high posterior probability of a non-zero effect were also very similar between full MCMC and HyB_BR, with several known genes affecting milk production in this category, as well as some novel genes. HyB_BR was also applied to seven human diseases with 4890 individuals genotyped for around 300 K SNPs in a case/control design, from the Wellcome Trust Case Control Consortium (WTCCC). In this data set, the results demonstrated again that HyB_BR performed as well as Bayesian mixture models with full MCMC for genomic predictions and genetic architecture inference, while reducing the computational time from 45 h with full MCMC to 3 h with HyB_BR. The results for quantitative traits in cattle and disease in humans demonstrate that HyB_BR can perform as well as Bayesian mixture models implemented with full MCMC in terms of prediction accuracy, but up to 17 times faster than the full MCMC implementations. The HyB_BR algorithm makes simultaneous genomic prediction, QTL mapping and inference of genetic architecture feasible in large genomic data sets.
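To make the Expectation-Maximisation ingredient of such a hybrid concrete, the sketch below fits a generic two-class, zero-mean normal mixture to simulated SNP effect estimates. The simulated effect sizes, starting values, and two-class structure are illustrative assumptions; this is not the HyB_BR implementation.

```python
# Minimal EM sketch: a "small-effect" and a "large-effect" class of SNP effects,
# each modeled as a zero-mean normal with its own variance.  Data, starting
# values, and iteration count are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
effects = np.concatenate([rng.normal(0, 0.01, 9000),     # most SNPs: tiny effects
                          rng.normal(0, 0.20, 1000)])    # a minority: larger effects

pi = np.array([0.5, 0.5])          # mixing proportions
var = np.array([1e-4, 1e-2])       # starting variances for the two classes

for _ in range(200):               # EM iterations
    # E-step: responsibility of each class for each SNP effect
    dens = pi * np.exp(-effects[:, None] ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update mixing proportions and per-class variances
    pi = resp.mean(axis=0)
    var = (resp * effects[:, None] ** 2).sum(axis=0) / resp.sum(axis=0)

print(pi, np.sqrt(var))            # recovers roughly (0.9, 0.1) and (0.01, 0.2)
```

In a hybrid scheme of the kind described, point estimates like these would seed a short MCMC run, rather than requiring a long chain with burn-in.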
ERIC Educational Resources Information Center
Jayachandran, Seema; Lleras-Muney, Adriana
2008-01-01
Longer life expectancy should encourage human capital accumulation, since a longer time horizon increases the value of investments that pay out over time. Previous work has been unable to determine the empirical importance of this life-expectancy effect due to the difficulty of isolating it from other effects of health on education. We examine a…
ERIC Educational Resources Information Center
Simones, Lilian Lima
2017-01-01
Music performance in the higher educational context is shaped by a reciprocal chain of interactions between students, part-time tutors and full-time teaching staff, each with specific expectations about the teaching and learning process. Such expectations can provide valuable insights not only for designing and implementing meaningful educational…
A time-dependent probabilistic seismic-hazard model for California
Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.
2000-01-01
For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and not cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.
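For readers unfamiliar with the renewal formulation behind such time-dependent models, the contrast with a time-independent (Poisson) model can be summarized by the standard conditional-probability form below. The lognormal choice for the recurrence-time distribution with an aleatory sigma is a common convention and an assumption here, not a statement of the exact CDMG parameterization.

```latex
% Probability of rupture in the next \Delta T years, given t years since the
% last event, for a renewal (time-dependent) model with recurrence-time CDF F
% (often lognormal with median recurrence time and aleatory sigma), versus the
% time-independent Poisson form with mean recurrence interval \mu.
P\bigl(t < T \le t + \Delta T \,\bigm|\, T > t\bigr)
  = \frac{F(t + \Delta T) - F(t)}{1 - F(t)},
\qquad
P_{\text{Poisson}} = 1 - e^{-\Delta T/\mu}.
```

A smaller aleatory sigma concentrates F around the median recurrence time, which is why the difference between time-dependent and time-independent estimates grows as intrinsic sigma decreases, consistent with result (1) above.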
Environmental protection requirements for scout/shuttle auxiliary stages
NASA Technical Reports Server (NTRS)
Qualls, G. L.; Kress, S. S.; Storey, W. W.; Ransdell, P. N.
1980-01-01
The requirements for enabling the Scout upper stages to endure the expected temperature, mechanical shock, acoustical and mechanical vibration environments during a specified shuttle mission were determined. The study consisted of: determination of a shuttle mission trajectory for a 545 kilogram (1200 pound) Scout payload; compilation of shuttle environmental conditions; determination of Scout upper stage environments in shuttle missions; compilation of Scout upper stage environmental qualification criteria and comparison to expected shuttle mission environments; and recommendations for enabling the Scout upper stages to endure the expected shuttle mission environments.
NASA Technical Reports Server (NTRS)
Allen, B. Danette
1998-01-01
In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.
Zedelius, Claire M.; Veling, Harm; Aarts, Henk
2012-01-01
Research has shown that high vs. low value rewards improve cognitive task performance independent of whether they are perceived consciously or unconsciously. However, efficient performance in response to high value rewards also depends on whether or not rewards are attainable. This raises the question of whether unconscious reward processing enables people to take into account such attainability information. Building on a theoretical framework according to which conscious reward processing is required to enable higher level cognitive processing, the present research tested the hypothesis that conscious but not unconscious reward processing enables integration of reward value with attainability information. In two behavioral experiments, participants were exposed to masked high and low value coins serving as rewards on a working memory (WM) task. The likelihood for conscious processing was manipulated by presenting the coins relatively briefly (17 ms) or long enough to be clearly visible (300 ms). Crucially, rewards were expected to be attainable or unattainable. Requirements to integrate reward value with attainability information varied across experiments. Results showed that when integration of value and attainability was required (Experiment 1), long reward presentation led to efficient performance, i.e., selectively improved performance for high value attainable rewards. In contrast, in the short presentation condition, performance was increased for high value rewards even when these were unattainable. This difference between the effects of long and short presentation time disappeared when integration of value and attainability information was not required (Experiment 2). Together these findings suggest that unconsciously processed reward information is not integrated with attainability expectancies, causing inefficient effort investment. These findings are discussed in terms of a unique role of consciousness in efficient allocation of effort to cognitive control processes. PMID:22848198
Lifetime Assessment of the NEXT Ion Thruster
NASA Technical Reports Server (NTRS)
VanNoord, Jonathan L.
2010-01-01
Ion thrusters are low thrust, high specific impulse devices with required operational lifetimes on the order of 10,000 to 100,000 hr. The NEXT ion thruster is the latest generation of ion thrusters under development. The NEXT ion thruster currently has a qualification level propellant throughput requirement of 450 kg of xenon, which corresponds to roughly 22,000 hr of operation at the highest throttling point. Currently, a NEXT engineering model ion thruster with prototype model ion optics is undergoing a long duration test to determine wear characteristics and establish propellant throughput capability. The NEXT thruster includes many improvements over previous generations of ion thrusters, but two of its component improvements have a larger effect on thruster lifetime. These include the ion optics, with tighter tolerances, a masked region and better gap control, and the discharge cathode keeper material change to graphite. Data from the NEXT 2000 hr wear test and the NEXT long duration test, together with further analysis, are used to determine the expected lifetime of the NEXT ion thruster. This paper will review the predictions for all of the anticipated failure mechanisms. The mechanisms will include wear of the ion optics and each cathode's orifice plate and keeper from the plasma, depletion of low work function material in each cathode's insert, and spalling of material in the discharge chamber leading to arcing. Based on the analysis of the NEXT ion thruster, the first failure mode for operation above a specific impulse of 2000 sec is expected to be the structural failure of the ion optics at 750 kg of propellant throughput, 1.7 times the qualification requirement. An assessment based on mission analyses for operation below a specific impulse of 2000 sec indicates that the NEXT thruster is capable of double the propellant throughput required by these missions.
Climate impacts of oil extraction increase significantly with oilfield age
NASA Astrophysics Data System (ADS)
Masnadi, Mohammad S.; Brandt, Adam R.
2017-08-01
Record-breaking temperatures have induced governments to implement targets for reducing future greenhouse gas (GHG) emissions. Use of oil products contributes ~35% of global GHG emissions, and the oil industry itself consumes 3-4% of global primary energy. Because oil resources are becoming increasingly heterogeneous, requiring different extraction and processing methods, GHG studies should evaluate oil sources using detailed project-specific data. Unfortunately, prior oil-sector GHG analysis has largely neglected the fact that the energy intensity of producing oil can change significantly over the life of a particular oil project. Here we use decades-long time-series data from twenty-five globally significant oil fields (>1 billion barrels ultimate recovery) to model GHG emissions from oil production as a function of time. We find that volumetric oil production declines with depletion, but this depletion is accompanied by significant growth--in some cases over tenfold--in per-MJ GHG emissions. Depletion requires increased energy expenditures in drilling, oil recovery, and oil processing. Using probabilistic simulation, we derive a relationship for estimating GHG increases over time, showing an expected doubling in average emissions over 25 years. These trends have implications for long-term emissions and climate modelling, as well as for climate policy.
Accretion of low-metallicity gas by the Milky Way.
Wakker, B P; Howk, J C; Savage, B D; van Woerden, H; Tufte, S L; Schwarz, U J; Benjamin, R; Reynolds, R J; Peletier, R F; Kalberla, P M
1999-11-25
Models of the chemical evolution of the Milky Way suggest that the observed abundances of elements heavier than helium ('metals') require a continuous infall of gas with metallicity (metal abundance) about 0.1 times the solar value. An infall rate integrated over the entire disk of the Milky Way of approximately 1 solar mass per year can solve the 'G-dwarf problem'--the observational fact that the metallicities of most long-lived stars near the Sun lie in a relatively narrow range. This infall dilutes the enrichment arising from the production of heavy elements in stars, and thereby prevents the metallicity of the interstellar medium from increasing steadily with time. However, in other spiral galaxies, the low-metallicity gas needed to provide this infall has been observed only in associated dwarf galaxies and in the extreme outer disk of the Milky Way. In the distant Universe, low-metallicity hydrogen clouds (known as 'damped Ly alpha absorbers') are sometimes seen near galaxies. Here we report a metallicity of 0.09 times solar for a massive cloud that is falling into the disk of the Milky Way. The mass flow associated with this cloud represents an infall per unit area of about the theoretically expected rate, and approximately 0.1-0.2 times the amount required for the whole Galaxy.
van de Venter, Ec; Oliver, I; Stuart, J M
2015-02-12
Timely outbreak investigations are central to containing communicable disease outbreaks; despite this, no guidance currently exists on expectations of timeliness for investigations. A literature review was conducted to assess the length of epidemiological outbreak investigations in Europe in peer-reviewed publications. We determined time intervals from outbreak declaration to hypothesis generation, and from hypothesis generation to availability of results from an analytical study. Outbreaks were classified into two groups: those with a public health impact across regions within a country and requiring national coordination (level 3) and those with a severe or catastrophic impact requiring direction at national level (levels 4 and 5). Investigations in Europe published between 2003 and 2013 were reviewed. We identified 86 papers for review: 63 level 3 and 23 level 4 and 5 investigations. Time intervals were ascertained from 55 papers. The median period for completion of an analytical study was 15 days (range: 4-32) for levels 4 and 5 and 31 days (range: 9-213) for level 3 investigations. Key factors influencing the speed of completing analytical studies were outbreak level, severity of infection and study design. Our findings suggest that guidance for completing analytical studies could usefully be provided, with different time intervals according to outbreak severity.
Analysis of UAS DAA Surveillance in Fast-Time Simulations without DAA Mitigation
NASA Technical Reports Server (NTRS)
Thipphavong, David P.; Santiago, Confesor; Isaacson, David R.; Lee, Seung Man; Refai, Mohamad Said; Snow, James William
2015-01-01
Realization of the expected proliferation of Unmanned Aircraft System (UAS) operations in the National Airspace System (NAS) depends on the development and validation of performance standards for UAS Detect and Avoid (DAA) Systems. The RTCA Special Committee 228 is charged with leading the development of draft Minimum Operational Performance Standards (MOPS) for UAS DAA Systems. NASA, as a participating member of RTCA SC-228, is committed to supporting the development and validation of draft requirements for DAA surveillance system performance. A recent study conducted using NASA's ACES (Airspace Concept Evaluation System) simulation capability begins to address questions surrounding the development of draft MOPS for DAA surveillance systems. ACES simulations were conducted to study the performance of sensor systems proposed by the SC-228 DAA Surveillance sub-group. Analyses included, but were not limited to: 1) the number of intruders (both IFR and VFR) detected by all sensors as a function of UAS flight time, 2) the number of intruders (both IFR and VFR) detected by radar alone as a function of UAS flight time, and 3) the number of VFR intruders detected by all sensors as a function of UAS flight time. The results will be used by SC-228 to inform decisions about the surveillance standards of UAS DAA systems and future requirements development and validation efforts.
Arabidopsis plants perform arithmetic division to prevent starvation at night
Scialdone, Antonio; Mugford, Sam T; Feike, Doreen; Skeffington, Alastair; Borrill, Philippa; Graf, Alexander; Smith, Alison M; Howard, Martin
2013-01-01
Photosynthetic starch reserves that accumulate in Arabidopsis leaves during the day decrease approximately linearly with time at night to support metabolism and growth. We find that the rate of decrease is adjusted to accommodate variation in the time of onset of darkness and starch content, such that reserves last almost precisely until dawn. Generation of these dynamics therefore requires an arithmetic division computation between the starch content and expected time to dawn. We introduce two novel chemical kinetic models capable of implementing analog arithmetic division. Predictions from the models are successfully tested in plants perturbed by a night-time light period or by mutations in starch degradation pathways. Our experiments indicate which components of the starch degradation apparatus may be important for appropriate arithmetic division. Our results are potentially relevant for any biological system dependent on a food reserve for survival over a predictable time period. DOI: http://dx.doi.org/10.7554/eLife.00669.001 PMID:23805380
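A minimal numerical sketch of the division computation described above: at each step the degradation rate is set to remaining starch divided by the time left until dawn, so reserves run out almost exactly at dawn for different night lengths and starting contents. The step size and example values are assumptions for illustration, and the chemical-kinetic implementations proposed in the paper are not reproduced here.

```python
# Minimal sketch of the "arithmetic division" strategy: at every time step the
# starch degradation rate is set to remaining starch divided by the time left
# until dawn, so reserves are exhausted almost exactly at dawn.  Step size,
# night length, and starting content are illustrative assumptions.
def starch_trajectory(s0, night_hours, dt=0.1):
    s, t, out = s0, 0.0, []
    while t < night_hours:
        rate = s / (night_hours - t)      # division of content by time-to-dawn
        s = max(s - rate * dt, 0.0)
        t += dt
        out.append((round(t, 1), s))
    return out

# The same rule copes with an early dusk (longer night) or low starch at dusk.
for s0, night in [(100.0, 12.0), (100.0, 16.0), (60.0, 12.0)]:
    print(s0, night, starch_trajectory(s0, night)[-1])   # starch ~0 at dawn in all cases
```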
Decision-Making Competence, Social Orientation, Time Style, and Perceived Stress.
Geisler, Martin; Allwood, Carl Martin
2018-01-01
People's decision-making competence, defined as the tendency to follow normative rational principles in their decision making, is important as it may influence the extent to which requirements are met and levels of perceived stress. In addition, perceived stress could be influenced by social orientation and time style; for example, decisions need to comply with given deadlines and the expectations of others. In two studies, with students (n = 118) and professionals (police investigators, n = 90), we examined how three individual-difference features (decision-making competence, social orientation, and time approach) relate to perceived stress. Results showed that social orientation and time approach were related to levels of perceived stress, but decision-making competence was not. These results indicate that social orientation and time approach are important to consider in relation to perceived stress, but the role of decision-making competence may be less important for perceived stress. However, the role of decision-making competence for perceived stress needs to be further researched.
Miyamoto, Naoki; Ishikawa, Masayori; Sutherland, Kenneth; Suzuki, Ryusuke; Matsuura, Taeko; Toramatsu, Chie; Takao, Seishin; Nihongi, Hideaki; Shimizu, Shinichi; Umegaki, Kikuo; Shirato, Hiroki
2015-01-01
In the real-time tumor-tracking radiotherapy system, a surrogate fiducial marker inserted in or near the tumor is detected by fluoroscopy to realize respiratory-gated radiotherapy. The imaging dose caused by fluoroscopy should be minimized. In this work, an image processing technique is proposed for tracking a moving marker in low-dose imaging. The proposed tracking technique is a combination of a motion-compensated recursive filter and template pattern matching. The proposed image filter reduces the motion artifacts that would otherwise result from the recursive process by determining the region of interest for the next frame from the current marker position in the fluoroscopic images. The effectiveness of the proposed technique and the expected clinical benefit were examined by phantom experimental studies with actual tumor trajectories generated from clinical patient data. It was demonstrated that the marker motion could be tracked in low-dose imaging by applying the proposed algorithm, with acceptable registration error and a high pattern recognition score in all trajectories, although some trajectories could not be tracked with conventional spatial filters or without image filters. The positional accuracy is expected to be kept within ±2 mm. The total computation time required to determine the marker position is a few milliseconds. The proposed image processing technique is applicable for imaging dose reduction. PMID:25129556
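The sketch below illustrates, in simplified form, the two ingredients combined above: a recursive (exponentially weighted) filter whose region of interest is re-centred on the current marker position, and template matching on the noise-suppressed region. The ROI size, blending weight, use of OpenCV's matchTemplate, and overall structure are assumptions for illustration rather than the authors' algorithm.

```python
# Simplified sketch: exponential recursive filtering of a region of interest that
# follows the marker, plus normalized cross-correlation template matching.
import numpy as np
import cv2

def track_marker(frames, template, roi_half=40, alpha=0.3):
    """Track a fiducial marker through a sequence of 2-D fluoroscopic frames."""
    template = template.astype(np.float32)
    filtered_roi, centre, positions = None, None, []
    for frame in frames:
        if centre is None:                        # first frame: search the whole image
            y0, x0 = 0, 0
            roi = frame.astype(np.float32)
        else:                                     # later frames: ROI around last position
            y0 = max(centre[0] - roi_half, 0)
            x0 = max(centre[1] - roi_half, 0)
            roi = frame[y0:y0 + 2 * roi_half, x0:x0 + 2 * roi_half].astype(np.float32)
        # Recursive filter with motion compensation: because the ROI is re-centred on
        # the current marker position each frame, blending with the running average
        # suppresses noise without smearing the moving marker across the image.
        if filtered_roi is not None and filtered_roi.shape == roi.shape:
            roi = alpha * roi + (1.0 - alpha) * filtered_roi
        filtered_roi = roi
        # Template matching on the noise-suppressed ROI (normalized cross-correlation).
        score = cv2.matchTemplate(roi, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(score)
        centre = (y0 + max_loc[1] + template.shape[0] // 2,
                  x0 + max_loc[0] + template.shape[1] // 2)
        positions.append(centre)
    return positions
```

In practice the result would also be gated on the matching score, with poor matches triggering a wider search, but that refinement is omitted from this sketch.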
Effect of Detector Dead Time on the Performance of Optical Direct-Detection Communication Links
NASA Technical Reports Server (NTRS)
Chen, C.-C.
1988-01-01
Avalanche photodiodes (APDs) operating in the Geiger mode can provide a significantly improved single-photon detection sensitivity over conventional photodiodes. However, the quenching circuit required to remove the excess charge carriers after each photon event can introduce an undesirable dead time into the detection process. The effect of this detector dead time on the performance of a binary pulse-position-modulated (PPM) channel is studied by analyzing the error probability. It is shown that, when background noise is negligible, the performance of the detector with dead time is similar to that of a quantum-limited receiver. For systems with increasing background intensities, the error rate of the receiver starts to degrade rapidly with increasing dead time. The power penalty due to detector dead time is also evaluated and shown to depend critically on background intensity as well as dead time. Given the expected background strength in an optical channel, therefore, a constraint must be placed on the bandwidth of the receiver to limit the amount of power penalty due to detector dead time.
Effect of detector dead time on the performance of optical direct-detection communication links
NASA Astrophysics Data System (ADS)
Chen, C.-C.
1988-05-01
Avalanche photodiodes (APDs) operating in the Geiger mode can provide a significantly improved single-photon detection sensitivity over conventional photodiodes. However, the quenching circuit required to remove the excess charge carriers after each photon event can introduce an undesirable dead time into the detection process. The effect of this detector dead time on the performance of a binary pulse-position-modulated (PPM) channel is studied by analyzing the error probability. It is shown that, when background noise is negligible, the performance of the detector with dead time is similar to that of a quantum-limited receiver. For systems with increasing background intensities, the error rate of the receiver starts to degrade rapidly with increasing dead time. The power penalty due to detector dead time is also evaluated and shown to depend critically on background intensity as well as dead time. Given the expected background strength in an optical channel, therefore, a constraint must be placed on the bandwidth of the receiver to limit the amount of power penalty due to detector dead time.
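As an illustrative companion to the analysis summarized above (not a reproduction of it), the Monte Carlo sketch below models a Geiger-mode detector with a non-paralyzable dead time and estimates the symbol error rate of a binary PPM slot pair as background grows. The slot width, dead time, count levels, and tie-breaking rule are assumed example values.

```python
# Illustrative Monte Carlo of how a Geiger-mode detector's dead time degrades a
# binary PPM link as background grows.  All parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(2)

def counts_with_dead_time(rate, slot, tau):
    """Registered counts for Poisson arrivals with non-paralyzable dead time tau."""
    arrivals = np.sort(rng.uniform(0.0, slot, rng.poisson(rate * slot)))
    registered, last = 0, -np.inf
    for t in arrivals:
        if t - last >= tau:                # arrivals during the dead time are lost
            registered += 1
            last = t
    return registered

def ppm_error_rate(Ks, Kb, slot=1.0, tau=0.2, trials=20000):
    """Error rate when the receiver picks the slot with more counts (ties random)."""
    errors = 0
    for _ in range(trials):
        signal_slot = counts_with_dead_time((Ks + Kb) / slot, slot, tau)
        noise_slot = counts_with_dead_time(Kb / slot, slot, tau)
        if noise_slot > signal_slot or (noise_slot == signal_slot and rng.random() < 0.5):
            errors += 1
    return errors / trials

for Kb in [0.1, 1.0, 3.0]:        # mean background counts per slot
    print(Kb, ppm_error_rate(Ks=5.0, Kb=Kb))   # error rate rises quickly with background
```

With negligible background the dominant error is simply failing to detect any signal photon, while with stronger background the dead time caps the registered counts in the signal slot and ties or reversals become more likely, mirroring the qualitative behaviour described above.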
Rhythmic Effects of Syntax Processing in Music and Language.
Jung, Harim; Sontag, Samuel; Park, YeBin S; Loui, Psyche
2015-01-01
Music and language are human cognitive and neural functions that share many structural similarities. Past theories posit a sharing of neural resources between syntax processing in music and language (Patel, 2003), and a dynamic attention network that governs general temporal processing (Large and Jones, 1999). Both make predictions about music and language processing over time. Experiment 1 of this study investigates the relationship between rhythmic expectancy and musical and linguistic syntax in a reading time paradigm. Stimuli (adapted from Slevc et al., 2009) were sentences broken down into segments; each sentence segment was paired with a musical chord and presented at a fixed inter-onset interval. Linguistic syntax violations appeared in a garden-path design. During the critical region of the garden-path sentence, i.e., the particular segment in which the syntactic unexpectedness was processed, expectancy violations for language, music, and rhythm were each independently manipulated: musical expectation was manipulated by presenting out-of-key chords, and rhythmic expectancy was manipulated by perturbing the fixed inter-onset interval such that the sentence segments and musical chords appeared either early or late. Reading times were recorded for each sentence segment and compared for linguistic, musical, and rhythmic expectancy. Results showed main effects of rhythmic expectancy and linguistic syntax expectancy on reading time. There was also an effect of rhythm on the interaction between musical and linguistic syntax: effects of violations in musical and linguistic syntax showed significant interaction only during rhythmically expected trials. To test the effects of our experimental design on rhythmic and linguistic expectancies, independently of musical syntax, Experiment 2 used the same experimental paradigm, but the musical factor was eliminated: linguistic stimuli were simply presented silently, and rhythmic expectancy was manipulated at the critical region. Experiment 2 replicated effects of rhythm and language, without an interaction. Together, results suggest that the interaction of music and language syntax processing depends on rhythmic expectancy, and support a merging of theories of music and language syntax processing with dynamic models of attentional entrainment.
Witjes, Suzanne; van Geenen, Rutger C I; Koenraadt, Koen L M; van der Hart, Cor P; Blankevoort, Leendert; Kerkhoffs, Gino M M J; Kuijer, P Paul F M
2017-02-01
Indications for total and unicondylar knee arthroplasty (KA) have expanded to younger patients, in which Patient-Reported Outcome Measures (PROMs) often show ceiling effects. This might be due to higher expectations. Our aims were to explore expectations of younger patients concerning activities in daily life, work and leisure time after KA and to assess to what extent PROMs meet and evaluate these activities of importance. Focus groups were performed among osteoarthritis (OA) patients <65 years awaiting KA, in which they indicated what activities they expected to perform better in daily life, work and leisure time after KA. Additionally, 28 activities of daily life, 17 of work and 27 of leisure time were drawn from seven PROMs and rated on importance, frequency and bother. A total score, representing motivation for surgery, was also calculated. Data saturation was reached after six focus groups including 37 patients. Younger OA patients expect to perform better on 16 activities after KA, including high-impact leisure time activities. From the PROMs, daily life and work activities were rated high in both importance and motivation for surgery, but for leisure time activities importance varied considerably between patients. All seven PROMs score activities of importance, but no single PROM incorporates all activities rated important. Younger patients expect to perform better on many activities of daily life, work and leisure time after KA, and often at demanding levels. To measure outcomes of younger patients, we suggest using PROMs that include work and leisure time activities besides daily life activities, in which preferably the scored activities can be individualized.