ERIC Educational Resources Information Center
Nolan, Lisa A.
2016-01-01
When considering closing the achievement gap, full-day kindergarten (FDK) is a viable contender. The implementation of specific teacher strategies enhances the FDK experience and elicits gains among the students. The literature clearly articulates a strong correlation between poverty and poor achievement and supports the notion that the…
ERIC Educational Resources Information Center
Feuerborn, Laura L.; Tyre, Ashli D.; King, Joe P.
2015-01-01
The practices of schoolwide positive behavior support (SWPBS) are dependent on staff implementation in classroom and common areas throughout the school. Thus, gaining the support and commitment of school staff is a critical step toward reaching full implementation of SWPBS. However, achieving buildingwide support can be challenging; many schools…
ERIC Educational Resources Information Center
Vandyke, Barbara Adrienne
2009-01-01
For too long, educators have been left to their own devices when implementing educational policies, initiatives, strategies, and interventions, and they have longed to see the full benefits of these programs, especially in reading achievement. However, instead of determining whether a policy/initiative is working, educators have been asked to…
ERIC Educational Resources Information Center
Duggan, Terri, Ed.; Holmes, Madelyn, Ed.
This report highlights results from the 1999 Wingspread Conference on improving student achievement, a gathering of educators, leaders, and policymakers that opened a dialogue about barriers to full implementation of high standards for all students. Participants discussed five papers that examined these issues from top to bottom--from the…
Integrated System Health Management (ISHM): Systematic Capability Implementation
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Holland, Randy; Schmalzwel, John; Duncavage, Dan
2006-01-01
This paper provides a credible approach for implementing ISHM capability in any system. The requirements and processes to implement ISHM capability are unique in that a credible capability is initially implemented at a low level and evolves to higher levels by incremental augmentation. In contrast, typical capabilities, such as the thrust of an engine, are implemented once at full Functional Capability Level (FCL), which is not designed to change during the life of the product. The approach describes core ingredients (e.g. technologies and architectures) and when and how ISHM capabilities may be implemented. A specific architecture/taxonomy/ontology is described, as well as a prototype software environment that supports development of ISHM capability. This paper addresses implementation of system-wide ISHM as a core capability, and ISHM for specific subsystems as expansions and evolution, always focusing on achieving an integrated capability.
GAO, L.; HAGEN, N.; TKACZYK, T.S.
2012-01-01
Summary: We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral-unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and the accuracy of measured fluorophores' emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the proposed full-spectrum imaging technique may yield a threefold improvement in signal dynamic range over that achievable with filter-based imaging. PMID:22356127
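The spectral-unmixing step described above can be illustrated with a minimal least-squares sketch. This is an assumption for illustration only: the paper's actual unmixing filter, reference spectra, and fluorophores are not reproduced here; the synthetic Gaussian spectra below are hypothetical.

```python
import numpy as np

# Linear spectral unmixing: a measured spectrum is modelled as a
# weighted mix of known fluorophore reference spectra, and the
# weights (abundances) are recovered by least squares.

def unmix(measured, references):
    """Least-squares estimate of fluorophore abundances.

    measured:   (n_wavelengths,) observed emission spectrum
    references: (n_wavelengths, n_fluorophores) reference spectra
    """
    coeffs, *_ = np.linalg.lstsq(references, measured, rcond=None)
    return coeffs

# Two synthetic Gaussian emission spectra and a 30/70 mixture.
wl = np.linspace(450, 700, 251)
ref = np.stack([np.exp(-((wl - 520) / 25) ** 2),
                np.exp(-((wl - 610) / 30) ** 2)], axis=1)
mix = 0.3 * ref[:, 0] + 0.7 * ref[:, 1]
abundances = unmix(mix, ref)
```

Because the mixture lies exactly in the span of the references, the least-squares solve recovers the mixing weights; with real data, noise and spectral overlap make the estimate approximate.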
Lagoo, Janaka; Lopushinsky, Steven R; Haynes, Alex B; Bain, Paul; Flageole, Helene; Skarsgard, Erik D; Brindle, Mary E
2017-01-01
Objective To examine the effectiveness and meaningful use of paediatric surgical safety checklists (SSCs) and their implementation strategies through a systematic review with narrative synthesis. Summary background data Since the launch of the WHO SSC, checklists have been integrated into surgical systems worldwide. Information is sparse on how SSCs have been integrated into the paediatric surgical environment. Methods A broad search strategy was created using Pubmed, Embase, CINAHL, Cochrane Central, Web of Science, Science Citation Index and Conference Proceedings Citation Index. Abstracts and full texts were screened independently and in duplicate for inclusion. Extracted study characteristics and outcomes generated themes explored through subgroup analyses and idea webbing. Results 1826 of 1921 studies were excluded after title and abstract review (kappa 0.77) and 47 after full-text review (kappa 0.86). 20 studies were of sufficient quality for narrative synthesis. Clinical outcomes were not affected by SSC introduction in studies without implementation strategies. A comprehensive SSC implementation strategy in developing countries demonstrated improved outcomes in high-risk surgeries. Narrative synthesis suggests that meaningful compliance is inconsistently measured and rarely achieved. Strategies involving feedback improved compliance. Stakeholder-developed implementation strategies, including team-based education, achieved greater acceptance. Three studies suggest that parental involvement in the SSC is valued by parents, nurses and physicians and may improve patient safety. Conclusions An SSC implementation strategy focused on paediatric patients and their families can achieve high acceptability and good compliance. The role of SSCs in improving measures of paediatric surgical outcome is not well established, but they may be effective when used within a comprehensive implementation strategy, especially for high-risk patients in low-resource settings. PMID:29042377
Ahmad, Peer Zahoor; Quadri, S M K; Ahmad, Firdous; Bahar, Ali Newaz; Wani, Ghulam Mohammad; Tantary, Shafiq Maqbool
2017-12-01
Quantum-dot cellular automata (QCA) is an extremely small-scale, ultra-low-power nanotechnology and a possible alternative to current CMOS technology. Reversible QCA logic is currently an important approach to reducing power losses. This paper presents a novel reversible logic gate called the F-Gate. It offers a simple design and a powerful technique for implementing reversible logic. A systematic approach has been used to implement a novel single-layer reversible Full-Adder, Full-Subtractor and a combined Full Adder-Subtractor using the F-Gate. The proposed Full Adder-Subtractor achieves significant improvements in overall circuit parameters over the most cost-efficient previous designs that exploit the inevitable nano-level issues to perform arithmetic computing. The proposed designs have been validated and simulated using the QCADesigner tool, ver. 2.0.3.
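The arithmetic function computed by a combined full adder-subtractor can be sketched in ordinary Boolean logic. This illustrates only the function, not the reversible F-Gate or its QCA layout; the operation-select convention (op=0 for add) is an assumption for illustration.

```python
# Conventional full adder/subtractor truth-table logic over bits 0/1.
# (1 - x) stands in for logical NOT on a single bit.

def full_adder(a, b, cin):
    """One-bit addition: returns (sum, carry_out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def full_subtractor(a, b, bin_):
    """One-bit subtraction a - b - bin_: returns (difference, borrow_out)."""
    d = a ^ b ^ bin_
    bout = ((1 - a) & b) | (bin_ & (1 - (a ^ b)))
    return d, bout

def add_sub(a, b, c, op):
    """Combined unit: op=0 selects addition, op=1 selects subtraction."""
    return full_adder(a, b, c) if op == 0 else full_subtractor(a, b, c)
```

The defining identities are sum + 2*carry = a + b + cin and diff - 2*borrow = a - b - bin, which hold for all eight input combinations.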
Ford, Eric W; Menachemi, Nir; Huerta, Timothy R; Yu, Feliciano
2010-01-01
Health systems are facing significant pressure to either implement health information technology (HIT) systems that have "certified" electronic health record applications and that fulfill the federal government's definition of "meaningful use" or risk substantial financial penalties in the near future. To this end, hospitals have adopted one of three strategies, described as "best of breed," "best of suite," and "single vendor," to meet organizational and regulatory demands. The single-vendor strategy is used by the simple majority of U.S. hospitals, but is it the most effective mode for achieving full implementation? Moreover, what are the implications of adopting this strategy for achieving meaningful use? The simple answer to the first question is that the hospitals using the hybrid best of suite strategy had fully implemented HIT systems in significantly greater proportions than did hospitals employing either of the other strategies. Nonprofit and system-affiliated hospitals were more likely to have fully implemented their HIT systems. In addition, increased health maintenance organization market penetration rates were positively correlated with complete implementation rates. These results have ongoing implications for achieving meaningful use in the near term. The federal government's rewards and incentives program related to the meaningful use of HIT in hospitals has created an organizational imperative to implement such systems. For hospitals that have not begun systemwide implementation, pursuing a best of suite strategy may provide the greatest chance for achieving all or some of the meaningful use targets in the near term or at least avoiding future penalties scheduled to begin in 2015.
Kampman, Margitta T; Eltoft, Agnethe; Karaliute, Migle; Børvik, Margrethe T; Nilssen, Hugo; Rasmussen, Ida; Johnsen, Stein H
2015-10-01
In patients with acute stroke, undernutrition and aspiration pneumonia are associated with increased mortality and length of hospital stay. Formal screening for nutritional risk and dysphagia helps to ensure optimal nutritional management in all patients with stroke and to reduce the risk of aspiration in patients with dysphagia. We developed a national guideline for nutritional and dysphagia screening in acute stroke, which was introduced in our stroke unit on June 1, 2012. The primary objective was to audit adherence to the guideline and to achieve full implementation. Second, we assessed the prevalence of nutritional risk and dysphagia. We performed a chart review to assess performance of screening for nutritional risk and dysphagia in all patients with stroke hospitalized for ≥48 hours between June 1, 2012, and May 31, 2013. Next we applied a "clinical microsystems approach" with rapid improvement cycles and audits over a 6-month period to achieve full implementation. The chart review showed that nutritional risk screening was performed in 65% and swallow testing in 91% of eligible patients (n = 185). Proactive implementation resulted in >95% patients screened (n = 79). The overall prevalence of nutritional risk was 29%, and 23% of the patients failed the initial swallow test. Proactive implementation is required to obtain high screening rates for nutritional risk and swallowing difficulties using validated screening tools. The proportion of patients at nutritional risk and the prevalence of dysphagia at initial swallow test were in the lower range of previous reports.
Improving X-Ray Optics via Differential Deposition
NASA Technical Reports Server (NTRS)
Kilaru, Kiranmayee; Ramsey, Brian D.; Atkins, Carolyn
2017-01-01
Differential deposition, a post-fabrication figure-correction technique, has the potential to significantly improve the imaging quality of grazing-incidence X-ray optics. DC magnetron sputtering is used to selectively coat the mirror in order to minimize figure deviations. Custom vacuum chambers have been developed at NASA MSFC to enable differential deposition on X-ray optics. A factor-of-two improvement in the angular resolution of full-shell X-ray optics has been achieved with a first-stage differential-deposition correction. Current efforts are focused on achieving further improvements through more efficient implementation of differential deposition.
Shek, Daniel T L; Ma, Cecilia M S
2012-01-17
The present study was conducted to explore the implementation quality of the Secondary 3 Program of the Tier 1 Program of Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in the third year of the Full Implementation Phase. Classroom observations of 182 units in 129 schools were conducted. Results showed that the overall level of program adherence was 73.9%. Thirteen aspects concerning program delivery were significantly correlated. Multiple regression analyses revealed that overall implementation quality was significantly predicted by student participation and involvement, strategies to enhance student motivation, use of positive and supportive feedback, degree of achievement of the objectives, and lesson preparation. Success of implementation was significantly predicted by student participation and involvement, classroom control, use of positive and supportive feedback, opportunity for reflection, degree of achievement of the objectives and time management. The present findings generally suggest that the implementation quality of Project P.A.T.H.S. was high.
ERIC Educational Resources Information Center
Johnson, Joseph Hamilton
2012-01-01
The Full Service Schools (FSS) reform model is an inter-agency collaboration between the District of Columbia Public Schools (DCPS), Choices, Inc., Insights Education Group and the DC Department of Mental Health. This comprehensive school reform model is based in the Response to Intervention paradigm and is designed to mitigate student academic…
Komorkiewicz, Mateusz; Kryjak, Tomasz; Gorgon, Marek
2014-01-01
This article presents an efficient hardware implementation of the Horn-Schunck algorithm that can be used in an embedded optical flow sensor. An architecture is proposed that realises the iterative Horn-Schunck algorithm in a pipelined manner. This design achieves a data throughput of 175 MPixels/s and makes processing of a Full HD video stream (1,920 × 1,080 @ 60 fps) possible. The structure of the optical flow module, the pre- and post-filtering blocks and a flow reliability computation unit is described in detail. Three versions of the optical flow module, differing in numerical precision, working frequency and accuracy, are proposed. The errors caused by switching from floating- to fixed-point computations are also evaluated. The described architecture was tested on popular sequences from the Middlebury University optical flow dataset. It achieves state-of-the-art results among hardware implementations of single-scale methods. The designed fixed-point architecture achieves a performance of 418 GOPS with a power efficiency of 34 GOPS/W. The proposed floating-point module achieves 103 GFLOPS with a power efficiency of 24 GFLOPS/W. Moreover, a 100-times speedup compared to a modern CPU with SIMD support is reported. A complete, working vision system realized on a Xilinx VC707 evaluation board is also presented. It is able to compute optical flow for a Full HD video stream received from an HDMI camera in real time. The obtained results prove that FPGA devices are an ideal platform for embedded vision systems. PMID:24526303
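The iterative update that the cited hardware pipelines is the classic Horn-Schunck scheme; a minimal NumPy sketch of the software formulation follows (software reference only; the FPGA pipeline, filtering blocks and fixed-point details are not reproduced).

```python
import numpy as np

# Classic Horn-Schunck optical flow: alternate between local averaging
# of the flow field and a data-term correction from the image gradients.

def average(f):
    """4-neighbour average with edge replication."""
    p = np.pad(f, 1, mode='edge')
    return 0.25 * (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:])

def horn_schunck(Ix, Iy, It, alpha=1.0, iters=100):
    """Estimate flow (u, v) from spatial (Ix, Iy) and temporal (It) derivatives.

    alpha is the smoothness regularisation weight.
    """
    u = np.zeros_like(Ix)
    v = np.zeros_like(Ix)
    for _ in range(iters):
        ubar, vbar = average(u), average(v)
        num = Ix * ubar + Iy * vbar + It          # brightness-constancy residual
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = ubar - Ix * num / den
        v = vbar - Iy * num / den
    return u, v
```

On a uniform gradient field (Ix = 1, Iy = 0, It = -0.5 everywhere) the iteration converges to the constant flow u = 0.5, v = 0, as the data term alone dictates.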
Bomble, L; Lavorel, B; Remacle, F; Desouter-Lecomte, M
2008-05-21
Following the scheme recently proposed by Remacle and Levine [Phys. Rev. A 73, 033820 (2006)], we investigate the concrete implementation of a classical full adder on two electronic states (X 1A1 and C 1B2) of the SO2 molecule by optical pump-probe laser pulses using intuitive and counterintuitive (stimulated Raman adiabatic passage) excitation schemes. The resources needed for providing the inputs and reading out are discussed, as well as the conditions for achieving robustness in both the intuitive and counterintuitive pump-dump sequences. The fidelity of the scheme is analyzed with respect to experimental noise and two kinds of perturbations: The coupling to the neighboring rovibrational states and a finite rotational temperature that leads to a mixture for the initial state. It is shown that the logic processing of a full addition cycle can be realistically experimentally implemented on a picosecond time scale while the readout takes a few nanoseconds.
USAID Adolescent Girl Strategy Implementation Plan
ERIC Educational Resources Information Center
US Agency for International Development, 2016
2016-01-01
USAID's commitment to empowering adolescent girls to reach their full potential is reflected in the Agency's larger efforts to achieve gender equality and women's empowerment. The Agency holds decades of experience leading advances for greater gender equality and empowerment that benefit adolescent girls; however, these activities have not been…
Efficient High Performance Collective Communication for Distributed Memory Environments
ERIC Educational Resources Information Center
Ali, Qasim
2009-01-01
Collective communication allows efficient communication and synchronization among a collection of processes, unlike point-to-point communication that only involves a pair of communicating processes. Achieving high performance for both kernels and full-scale applications running on a distributed memory system requires an efficient implementation of…
Journey toward Holistic Instruction: Supporting Teachers' Growth.
ERIC Educational Resources Information Center
Au, Kathryn H.; Scheu, Judith A.
1996-01-01
Discusses how to support teachers through the challenges of full implementation of holistic instruction in the classroom. Examines the effectiveness of the Kamehameha Elementary Education Program in Hawaii. Finds that holistic instruction can be effective in improving the literacy achievement of students of diverse cultural and linguistic…
Estimating the Cost of Providing Foundational Public Health Services.
Mamaril, Cezar Brian C; Mays, Glen P; Branham, Douglas Keith; Bekemeier, Betty; Marlowe, Justin; Timsina, Lava
2017-12-28
To estimate the cost of resources required to implement a set of Foundational Public Health Services (FPHS) as recommended by the Institute of Medicine. A stochastic simulation model was used to generate probability distributions of input and output costs across 11 FPHS domains. We used an implementation attainment scale to estimate costs of fully implementing FPHS. We use data collected from a diverse cohort of 19 public health agencies located in three states that implemented the FPHS cost estimation methodology in their agencies during 2014-2015. The average agency incurred costs of $48 per capita implementing FPHS at their current attainment levels with a coefficient of variation (CV) of 16 percent. Achieving full FPHS implementation would require $82 per capita (CV=19 percent), indicating an estimated resource gap of $34 per capita. Substantial variation in costs exists across communities in resources currently devoted to implementing FPHS, with even larger variation in resources needed for full attainment. Reducing geographic inequities in FPHS may require novel financing mechanisms and delivery models that allow health agencies to have robust roles within the health system and realize a minimum package of public health services for the nation. © Health Research and Educational Trust.
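The stochastic simulation approach described above can be sketched as a toy Monte Carlo over per-domain cost draws. Every distribution shape and parameter below is a hypothetical placeholder, not the study's calibrated model; only the mechanic (sample domain costs, summarise the resulting per-capita distribution by mean and coefficient of variation) mirrors the method.

```python
import random

# Toy Monte Carlo: draw per-capita costs for a few hypothetical service
# domains, sum them, and summarise the distribution of totals.

def simulate_cost_per_capita(n_draws=20000, seed=42):
    rng = random.Random(seed)
    totals = []
    for _ in range(n_draws):
        # Hypothetical lognormal per-domain costs (parameters illustrative).
        cost = sum(rng.lognormvariate(mu, 0.3) for mu in (2.0, 1.5, 1.0))
        totals.append(cost)
    mean = sum(totals) / n_draws
    var = sum((t - mean) ** 2 for t in totals) / n_draws
    cv = (var ** 0.5) / mean   # coefficient of variation, as reported in the study
    return mean, cv
```

A real model would replace the placeholder draws with distributions fitted to each FPHS domain's cost data and scale by attainment level.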
Completing advance directives for health care decisions: getting to yes.
Shewchuk, T R
1998-09-01
The concept of advance directives for health care decision making has been judicially condoned, legislatively promoted, and systematically implemented by health care institutions, yet the execution rate of advance directives remains low. Physicians should discuss with their patients advance care planning generally and end-of-life issues specifically, preferably when patients are in good health and not when they face an acute medical crisis. The physician-hospital relationship poses particular challenges for the optimal implementation of advance directives that must be addressed. Hospital administrators must improve education of patients and physicians on the value of such documents as well as internal mechanisms to ensure better implementation of directives. Health insurance plans may be better able to ensure optimal gathering and implementation of directives. Patients must become more familiar and more comfortable with advance care planning and the reality of death and dying issues. Full acceptance of the value of directives ultimately rests on achieving full participation of all involved--providers, patients, families, and payors--in this most profound process.
A real time microcomputer implementation of sensor failure detection for turbofan engines
NASA Technical Reports Server (NTRS)
Delaat, John C.; Merrill, Walter C.
1989-01-01
An algorithm was developed which detects, isolates, and accommodates sensor failures using analytical redundancy. The performance of this algorithm was demonstrated on a full-scale F100 turbofan engine. The algorithm was implemented in real time on a microprocessor-based controls computer that includes parallel processing and high-order-language programming. Parallel processing was used to achieve the computational power required for the real-time implementation. A high-order language was used to reduce the programming and maintenance costs of the algorithm implementation software. The sensor failure algorithm was combined with an existing multivariable control algorithm to give a complete control implementation with sensor analytical redundancy. The real-time microprocessor implementation of the algorithm, which resulted in the successful completion of the engine demonstration, is described.
Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks.
Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue
2017-06-06
Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem of how to select the minimum number of sensors to guarantee full-view coverage for a given region of interest (ROI). To tackle this issue, we derive the constraint condition on the sensor positions for full-view neighborhood coverage with the minimum number of nodes around a point. Next, we prove that full-view area coverage can be approximately guaranteed as long as the regular hexagons decided by the virtual grid are seamlessly stitched. Then we present two solutions for camera sensor networks under two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward the deployment pattern algorithm (DPA) for the deterministic deployment. To reduce redundancy in random deployment, we propose a local neighboring-optimal selection algorithm (LNSA) for achieving full-view coverage. Finally, extensive simulation results show the feasibility of the proposed solutions.
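The hexagonal virtual grid underlying the deterministic deployment can be sketched by generating lattice points for a rectangular region. This is an illustrative sketch only: the paper's derivation of the optimal grid length and its camera-orientation constraints are not reproduced, and the grid-spacing convention used here is a standard hexagonal-lattice assumption.

```python
import math

# Generate the centres of a hexagonal virtual grid covering a
# width x height region. Rows are spaced 1.5*side apart and odd rows
# are offset by half the horizontal pitch, so the hexagons stitch
# seamlessly (the property the DPA relies on for area coverage).

def hex_grid(width, height, side):
    dx = math.sqrt(3) * side          # horizontal pitch between centres
    dy = 1.5 * side                   # vertical pitch between rows
    points = []
    row = 0
    y = 0.0
    while y <= height:
        x = dx / 2 if row % 2 else 0.0  # stagger alternate rows
        while x <= width:
            points.append((x, y))
            x += dx
        y += dy
        row += 1
    return points
```

Each grid point is a candidate camera position; choosing `side` from the paper's optimal virtual-grid length would then fix the node density needed for full-view coverage.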
A 20 MHz CMOS reorder buffer for a superscalar microprocessor
NASA Technical Reports Server (NTRS)
Lenell, John; Wallace, Steve; Bagherzadeh, Nader
1992-01-01
Superscalar processors can achieve increased performance by issuing instructions out-of-order from the original sequential instruction stream. Implementing an out-of-order instruction issue policy requires a hardware mechanism to prevent incorrectly executed instructions from updating register values. A reorder buffer can be used to allow a superscalar processor to issue instructions out-of-order and maintain program correctness. This paper describes the design and implementation of a 20MHz CMOS reorder buffer for superscalar processors. The reorder buffer is designed to accept and retire two instructions per cycle. A full-custom layout in 1.2 micron has been implemented, measuring 1.1058 mm by 1.3542 mm.
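The mechanism the abstract describes (out-of-order completion with in-order commit, retiring up to two instructions per cycle) can be sketched in software. This is a behavioural illustration of a generic reorder buffer, not the cited CMOS design; entry fields and method names are invented for the sketch.

```python
from collections import deque

# Minimal reorder buffer: instructions enter in program order, results
# may arrive in any order, and architectural state is updated strictly
# in program order from the head of the buffer.

class ReorderBuffer:
    def __init__(self, size=8, retire_width=2):
        self.entries = deque()        # program-order queue of pending entries
        self.size = size
        self.retire_width = retire_width  # max commits per cycle

    def issue(self, tag, dest):
        """Allocate an entry at issue; returns False when the ROB is full (stall)."""
        if len(self.entries) >= self.size:
            return False
        self.entries.append({'tag': tag, 'dest': dest, 'value': None, 'done': False})
        return True

    def complete(self, tag, value):
        """Record a result; completion order is arbitrary."""
        for e in self.entries:
            if e['tag'] == tag:
                e['value'], e['done'] = value, True
                return

    def retire(self, regfile):
        """Commit up to retire_width finished head entries, in program order."""
        retired = 0
        while self.entries and self.entries[0]['done'] and retired < self.retire_width:
            e = self.entries.popleft()
            regfile[e['dest']] = e['value']   # only now does state become visible
            retired += 1
        return retired
```

Because `retire` only drains finished entries from the head, a late result for an early instruction blocks commitment of everything behind it, which is exactly how the buffer preserves program correctness under out-of-order issue.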
Formby, Craig; Hawley, Monica L.; Sherlock, LaGuinn P.; Gold, Susan; Payne, JoAnne; Brooks, Rebecca; Parton, Jason M.; Juneau, Roger; Desporte, Edward J.; Siegle, Gregory R.
2015-01-01
The primary aim of this research was to evaluate the validity, efficacy, and generalization of principles underlying a sound therapy–based treatment for promoting expansion of the auditory dynamic range (DR) for loudness. The basic sound therapy principles, originally devised for treatment of hyperacusis among patients with tinnitus, were evaluated in this study in a target sample of unsuccessfully fit and/or problematic prospective hearing aid users with diminished DRs (owing to their elevated audiometric thresholds and reduced sound tolerance). Secondary aims included: (1) delineation of the treatment contributions from the counseling and sound therapy components to the full-treatment protocol and, in turn, the isolated treatment effects from each of these individual components to intervention success; and (2) characterization of the respective dynamics for full, partial, and control treatments. Thirty-six participants with bilateral sensorineural hearing losses and reduced DRs, which affected their actual or perceived ability to use hearing aids, were enrolled in and completed a placebo-controlled (for sound therapy) randomized clinical trial. The 2 × 2 factorial trial design was implemented with or without various assignments of counseling and sound therapy. Specifically, participants were assigned randomly to one of four treatment groups (nine participants per group), including: (1) group 1—full treatment achieved with scripted counseling plus sound therapy implemented with binaural sound generators; (2) group 2—partial treatment achieved with counseling and placebo sound generators (PSGs); (3) group 3—partial treatment achieved with binaural sound generators alone; and (4) group 4—a neutral control treatment implemented with the PSGs alone. Repeated measurements of categorical loudness judgments served as the primary outcome measure. 
The full-treatment categorical-loudness judgments for group 1, measured at treatment termination, were significantly greater than the corresponding pretreatment judgments measured at baseline at 500, 2,000, and 4,000 Hz. Moreover, increases in their “uncomfortably loud” judgments (∼12 dB over the range from 500 to 4,000 Hz) were superior to those measured for either of the partial-treatment groups 2 and 3 or for control group 4. Efficacy, assessed by treatment-related criterion increases ≥ 10 dB for judgments of uncomfortable loudness, was superior for full treatment (82% efficacy) compared with that for either of the partial treatments (25% and 40% for counseling combined with the placebo sound therapy and sound therapy alone, respectively) or for the control treatment (50%). The majority of the group 1 participants achieved their criterion improvements within 3 months of beginning treatment. The treatment effect from sound therapy was much greater than that for counseling, which was statistically indistinguishable in most of our analyses from the control treatment. The basic principles underlying the full-treatment protocol are valid and have general applicability for expanding the DR among individuals with sensorineural hearing losses, who may often report aided loudness problems. The positive full-treatment effects were superior to those achieved for either counseling or sound therapy in virtual or actual isolation, respectively; however, the delivery of both components in the full-treatment approach was essential for an optimum treatment outcome. PMID:27516711
Total quality through computer integrated manufacturing in the pharmaceutical industry.
Ufret, C M
1995-01-01
The role of Computer Integrated Manufacturing (CIM) in the pursuit of total quality in pharmaceutical manufacturing is assessed. CIM key objectives, design criteria, and performance measurements, in addition to its scope and implementation in a hierarchical structure, are explored in detail. Key elements for the success of each phase in a CIM project and a brief status of current CIM implementations in the pharmaceutical industry are presented. The role of World Class Manufacturing performance standards and other key issues to achieve full CIM benefits are also addressed.
75 FR 39493 - United States Patent and Trademark Office Draft Strategic Plan for FY 2010-2015
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-09
... plan includes the USPTO's mission statement, vision statement and a description of the strategic goals... achieve its vision. Full details on how the USPTO plans to implement the strategic plan, including funding...] United States Patent and Trademark Office Draft Strategic Plan for FY 2010-2015 AGENCY: United States...
ERIC Educational Resources Information Center
Yang, Ya-Ting Carolyn
2012-01-01
This study investigates the effectiveness of digital game-based learning (DGBL) on students' problem solving, learning motivation, and academic achievement. In order to provide substantive empirical evidence, a quasi-experimental design was implemented over the course of a full semester (23 weeks). Two ninth-grade Civics and Society classes, with a…
NASA Astrophysics Data System (ADS)
Perton, Mathieu; Contreras-Zazueta, Marcial A.; Sánchez-Sesma, Francisco J.
2016-06-01
A new implementation of the indirect boundary element method allows simulation of elastic wave propagation in complex configurations made of embedded regions that are homogeneous with irregular boundaries or flat layers. In an older implementation, each layer of a flat layered region would have been treated as a separate homogeneous region, without taking the flat boundary information into account. For both types of regions, the scattered field results from fictitious sources positioned along their boundaries. For the homogeneous regions, the fictitious sources emit as in a full-space and the wave field is given by analytical Green's functions. For flat layered regions, fictitious sources emit as in an unbounded flat layered region and the wave field is given by Green's functions obtained from the discrete wavenumber (DWN) method. The new implementation thus reduces the length of the discretized boundaries, but the DWN Green's functions require much more computation time than the full-space Green's functions. Several optimization steps are therefore implemented and discussed. Validations are presented for 2-D and 3-D problems. Higher efficiency is achieved in 3-D.
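The fictitious-source idea can be illustrated in a much-simplified scalar setting. The sketch below is a method-of-fundamental-solutions toy for the 2-D Laplace equation, not the elastic DWN machinery of the paper: sources placed outside a unit disk emit with the analytical free-space Green's function, their strengths are fit to boundary data in the least-squares sense, and the interior field is then recovered.

```python
import numpy as np

# Collocation points on the boundary of the unit disk.
M, N = 160, 80
tb = 2 * np.pi * np.arange(M) / M
bx, by = np.cos(tb), np.sin(tb)

# Fictitious sources on a circle of radius 2, outside the region.
ts = 2 * np.pi * np.arange(N) / N
sx, sy = 2 * np.cos(ts), 2 * np.sin(ts)

# Free-space Green's function of the 2-D Laplacian: G = -ln(r) / (2*pi).
def G(px, py, qx, qy):
    r = np.hypot(px[:, None] - qx[None, :], py[:, None] - qy[None, :])
    return -np.log(r) / (2 * np.pi)

# Boundary data taken from the harmonic function u = x^2 - y^2.
u_exact = lambda x, y: x**2 - y**2
A = G(bx, by, sx, sy)                              # M x N influence matrix
q, *_ = np.linalg.lstsq(A, u_exact(bx, by), rcond=None)

# Evaluate the represented field at an interior point.
px, py = np.array([0.3]), np.array([0.2])
u_mfs = (G(px, py, sx, sy) @ q)[0]
err = abs(u_mfs - u_exact(0.3, 0.2))
```

The same structure carries over to the elastic case, where the scalar kernel is replaced by the elastodynamic (or DWN) Green's tensors.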
Acceleration for 2D time-domain elastic full waveform inversion using a single GPU card
NASA Astrophysics Data System (ADS)
Jiang, Jinpeng; Zhu, Peimin
2018-05-01
Full waveform inversion (FWI) is a challenging procedure due to the high computational cost related to the modeling, especially for the elastic case. The graphics processing unit (GPU) has become a popular device for high-performance computing (HPC). To reduce the long computation time, we design and implement the GPU-based 2D elastic FWI (EFWI) in the time domain using a single GPU card. We parallelize the forward modeling and gradient calculations using the CUDA programming language. To overcome the limitation of relatively small global memory on the GPU, the boundary saving strategy is exploited to reconstruct the forward wavefield. Moreover, the L-BFGS optimization method used in the inversion accelerates the convergence of the misfit function. A multiscale inversion strategy is performed in the workflow to obtain accurate inversion results. In our tests, the GPU-based implementations using a single GPU device achieve >15 times speedup in forward modeling, and about 12 times speedup in gradient calculation, compared with the eight-core CPU implementations optimized by OpenMP. The test results from the GPU implementations are verified to have sufficient accuracy by comparing the results obtained from the CPU implementations.
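As a hedged illustration of the optimization step only, the toy below minimizes a least-squares misfit with SciPy's L-BFGS-B, standing in for the gradient-based EFWI updates. The linear "modeling" operator and problem sizes are invented; in real FWI the gradient comes from adjoint wavefield modeling, not a matrix transpose.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy linear "modeling" operator d = F m, standing in for wavefield modeling.
F = rng.standard_normal((60, 20))
m_true = rng.standard_normal(20)
d_obs = F @ m_true

# Least-squares misfit and its gradient, as used by gradient-based inversion.
def misfit(m):
    r = F @ m - d_obs
    return 0.5 * r @ r, F.T @ r

# L-BFGS builds a low-memory curvature estimate from gradient history.
res = minimize(misfit, np.zeros(20), jac=True, method="L-BFGS-B")
```

A multiscale strategy would wrap this loop, solving first with low-pass-filtered data and using each result as the next starting model.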
Simplifying HL7 Version 3 messages.
Worden, Robert; Scott, Philip
2011-01-01
HL7 Version 3 offers a semantically robust method for healthcare interoperability but has been criticized as overly complex to implement. This paper reviews initiatives to simplify HL7 Version 3 messaging and presents a novel approach based on semantic mapping. Based on user-defined definitions, precise transforms between simple and full messages are automatically generated. Systems can be interfaced with the simple messages and achieve interoperability with full Version 3 messages through the transforms. This reduces the costs of HL7 interfacing and will encourage better uptake of HL7 Version 3 and CDA.
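The mapping-driven transform idea can be sketched as follows; the field and element names are invented for illustration and are not actual HL7 Version 3 element names. A single path mapping is enough to generate both the simple-to-full and full-to-simple transforms.

```python
# Each flat "simple" field maps to a path in the nested "full" message
# (names below are hypothetical, not real HL7 v3 elements).
MAPPING = {
    "patient_id":  ["recordTarget", "patient", "id"],
    "given_name":  ["recordTarget", "patient", "name", "given"],
    "family_name": ["recordTarget", "patient", "name", "family"],
}

def to_full(simple):
    """Generate the nested full message from a flat simple one."""
    full = {}
    for field, path in MAPPING.items():
        node = full
        for key in path[:-1]:
            node = node.setdefault(key, {})
        node[path[-1]] = simple[field]
    return full

def to_simple(full):
    """Inverse transform: project the full message back to flat fields."""
    simple = {}
    for field, path in MAPPING.items():
        node = full
        for key in path:
            node = node[key]
        simple[field] = node
    return simple

msg = {"patient_id": "12345", "given_name": "Ada", "family_name": "Lovelace"}
```

Because both directions derive from the same mapping, the pair of transforms is a round trip by construction, which is the property that lets systems interface only with the simple form.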
A methodology based on reduced complexity algorithm for system applications using microprocessors
NASA Technical Reports Server (NTRS)
Yan, T. Y.; Yao, K.
1988-01-01
The paper considers a methodology for the analysis and design of a linear system under a minimum mean-square-error criterion, incorporating a tapped delay line (TDL) where all the full-precision multiplications in the TDL are constrained to be powers of two. A linear equalizer for a dispersive channel with additive noise is presented. This microprocessor implementation with optimized power-of-two TDL coefficients achieves a system performance comparable to the optimum linear equalization with full-precision multiplications for an input data rate of 300 baud.
A GPU-paralleled implementation of an enhanced face recognition algorithm
NASA Astrophysics Data System (ADS)
Chen, Hao; Liu, Xiyang; Shao, Shuai; Zan, Jiguo
2013-03-01
Face recognition based on compressed sensing and sparse representation has attracted considerable attention in recent years. This approach improves recognition rate as well as robustness to noise. However, the computational cost is high and has become a main restricting factor for real-world applications. In this paper, we introduce a GPU-accelerated hybrid variant of the face recognition algorithm, named parallel face recognition algorithm (pFRA). We describe how to carry out parallel optimization design to take full advantage of the many-core structure of a GPU. The pFRA is tested and compared with several other implementations under different data sample sizes. Finally, our pFRA, implemented with an NVIDIA GPU and the Compute Unified Device Architecture (CUDA) programming model, achieves a significant speedup over the traditional CPU implementations.
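The classification idea behind sparse-representation recognition can be reduced to a tiny nearest-subspace toy. This sketch deliberately replaces the sparse solver with per-class least squares and uses synthetic vectors rather than face images: a test sample is assigned to the class whose training samples reconstruct it with the smallest residual.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two classes, each spanning its own low-dimensional subspace of R^50.
d, k = 50, 3
basis = {c: rng.standard_normal((d, k)) for c in (0, 1)}
train = {c: basis[c] @ rng.standard_normal((k, 20)) for c in (0, 1)}

def classify(y):
    """Assign y to the class whose training samples reconstruct it best."""
    best, best_res = None, np.inf
    for c, A in train.items():
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        res = np.linalg.norm(A @ coef - y)
        if res < best_res:
            best, best_res = c, res
    return best

# A new sample drawn from class 1's subspace is recovered correctly.
y = basis[1] @ np.array([1.0, -2.0, 0.5])
```

The GPU-friendly part of the real algorithm is exactly this inner loop: many independent residual computations over large dictionaries.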
The Value of Full Correction: Achieving Excellent and Affordable Results.
Kaplan, Julie Bass
2016-01-01
Patients often come to medical aesthetic offices with hopes to fully correct lost facial volume and achieve a natural appearance. Unfortunately, the cost per syringe of dermal filler can be a barrier to desired outcomes. Many aesthetic practitioners do the best they can with the amount of product the patient can afford, often falling short of the "wow" effect for the patient. This article describes what one office implemented to solve the conundrum of affordability while still allowing offices to cover their own financial realities. This tool can help patients achieve beautiful, natural, and affordable outcomes while helping offices advance in manufacturers' tiers, improve word-of-mouth advertising, and increase job satisfaction.
2002-03-01
food and beverages, customer service, parking, taxi services, advertising, and general aviation services needed by the airports’ users. The...Authority to implement, through the use of published competitive procedures, procurement and concession franchising systems designed to achieve full and...the Authority issued guidance for awarding its contracts and concession franchises in 1993, the guidance does not adequately reflect competitive
A survey of the state of the art and focused research in range systems, task 2
NASA Technical Reports Server (NTRS)
Yao, K.
1986-01-01
Many communication, control, and information processing subsystems are modeled by linear systems incorporating tapped delay lines (TDL). Such optimized subsystems result in full-precision multiplications in the TDL. In order to reduce complexity and cost in a microprocessor implementation, these multiplications can be replaced by single-shift instructions, which are equivalent to powers-of-two multiplications. Since, in general, the obvious operation of rounding the infinite-precision TDL coefficients to the nearest powers of two usually yields quite poor system performance, the optimum powers-of-two coefficient solution was considered. Detailed explanations on the use of branch-and-bound algorithms for finding the optimum powers-of-two solutions are given. A specific demonstration of this methodology in the design of a linear data equalizer, and its implementation in assembly language on an 8080 microprocessor with a 12 bit A/D converter, is reported. This simple microprocessor implementation with optimized TDL coefficients achieves a system performance comparable to the optimum linear equalization with full-precision multiplications for an input data rate of 300 baud. The philosophy demonstrated in this implementation is readily applicable to many other microprocessor-controlled information processing systems.
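On a tiny filter the powers-of-two design can be sketched directly. Here an exhaustive search over the floor/ceil exponent choices per tap stands in for the paper's branch-and-bound search (feasible only because the example has four taps); the coefficients and test signal are invented. Since naive rounding is one of the candidates, the searched solution can never be worse.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)

def nearest_pow2(c):
    """Round a coefficient to the nearest signed power of two (in log2)."""
    return np.sign(c) * 2.0 ** np.round(np.log2(abs(c)))

# A small "infinite precision" tapped-delay-line and a test input signal.
h = np.array([0.9, -0.45, 0.23, -0.11])
x = rng.standard_normal(200)
d = np.convolve(x, h)                  # desired (full-precision) output

def mse(hq):
    return np.mean((np.convolve(x, hq) - d) ** 2)

# Naive per-tap rounding.
h_round = np.array([nearest_pow2(c) for c in h])

# Exhaustive search over floor/ceil exponents per tap: a stand-in for
# branch-and-bound on this tiny problem.
cands = [[np.sign(c) * 2.0 ** e for e in (np.floor(np.log2(abs(c))),
                                          np.ceil(np.log2(abs(c))))]
         for c in h]
h_opt = min((np.array(p) for p in product(*cands)), key=mse)
```

Branch-and-bound reaches the same optimum without enumerating all 2^N combinations, which is what made the method practical on the 8080-class hardware of the paper.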
Economic mitigation challenges: how further delay closes the door for achieving climate targets
NASA Astrophysics Data System (ADS)
Luderer, Gunnar; Pietzcker, Robert C.; Bertram, Christoph; Kriegler, Elmar; Meinshausen, Malte; Edenhofer, Ottmar
2013-09-01
While the international community aims to limit global warming to below 2 °C to prevent dangerous climate change, little progress has been made towards a global climate agreement to implement the emissions reductions required to reach this target. We use an integrated energy-economy-climate modeling system to examine how a further delay of cooperative action and technology availability affect climate mitigation challenges. With comprehensive emissions reductions starting after 2015 and full technology availability we estimate that maximum 21st century warming may still be limited below 2 °C with a likely probability and at moderate economic impacts. Achievable temperature targets rise by up to ~0.4 °C if the implementation of comprehensive climate policies is delayed by another 15 years, chiefly because of transitional economic impacts. If carbon capture and storage (CCS) is unavailable, the lower limit of achievable targets rises by up to ~0.3 °C. Our results show that progress in international climate negotiations within this decade is imperative to keep the 2 °C target within reach.
Achieving Full Neurological Recovery in Snakebite using Best Supportive Care.
Wright, Sally; Haddock, Genevieve
2018-05-14
A 29-year-old woman presented to a community hospital in Sierra Leone 2 hours after being bitten by an unknown snake. On arrival, she was agitated though alert, but then deteriorated into respiratory arrest. There was no local availability of antivenom. The patient remained in respiratory arrest undergoing best supportive care in a low-resource setting for 2 hours 55 minutes before returning to spontaneous ventilation. She went on to make a full neurological recovery. Though spontaneous recovery following snakebite envenoming is rare, this case shows that good communication and basic manoeuvres can have a hugely positive impact on patient outcome. Alongside this, it highlights the need for staff and community engagement and the implementation of local protocols in order to improve confidence and achieve consistent practice.
Chen, Weiliang; De Schutter, Erik
2017-01-01
Stochastic, spatial reaction-diffusion simulations have been widely used in systems biology and computational neuroscience. However, the increasing scale and complexity of models and morphologies have exceeded the capacity of any serial implementation. This led to the development of parallel solutions that benefit from the boost in performance of modern supercomputers. In this paper, we describe an MPI-based, parallel operator-splitting implementation for stochastic spatial reaction-diffusion simulations with irregular tetrahedral meshes. The performance of our implementation is first examined and analyzed with simulations of a simple model. We then demonstrate its application to real-world research by simulating the reaction-diffusion components of a published calcium burst model in both Purkinje neuron sub-branch and full dendrite morphologies. Simulation results indicate that our implementation is capable of achieving super-linear speedup for balanced loading simulations with reasonable molecule density and mesh quality. In the best scenario, a parallel simulation with 2,000 processes runs more than 3,600 times faster than its serial SSA counterpart, and achieves more than 20-fold speedup relative to parallel simulation with 100 processes. In a more realistic scenario with dynamic calcium influx and data recording, the parallel simulation with 1,000 processes and no load balancing is still 500 times faster than the conventional serial SSA simulation. PMID:28239346
Massanes, Francesc; Cadennes, Marie; Brankov, Jovan G.
2012-01-01
In this paper we describe and evaluate a fast implementation of a classical block matching motion estimation algorithm for multiple Graphical Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) computing engine. The implemented block matching algorithm (BMA) uses summed absolute difference (SAD) error criterion and full grid search (FS) for finding optimal block displacement. In this evaluation we compared the execution time of a GPU and CPU implementation for images of various sizes, using integer and non-integer search grids. The results show that use of a GPU card can shorten computation time by a factor of 200 times for integer and 1000 times for a non-integer search grid. The additional speedup for non-integer search grid comes from the fact that GPU has built-in hardware for image interpolation. Further, when using multiple GPU cards, the presented evaluation shows the importance of the data splitting method across multiple cards, but an almost linear speedup with a number of cards is achievable. In addition we compared execution time of the proposed FS GPU implementation with two existing, highly optimized non-full grid search CPU based motion estimations methods, namely implementation of the Pyramidal Lucas Kanade Optical flow algorithm in OpenCV and Simplified Unsymmetrical multi-Hexagon search in H.264/AVC standard. In these comparisons, FS GPU implementation still showed modest improvement even though the computational complexity of FS GPU implementation is substantially higher than non-FS CPU implementation. We also demonstrated that for an image sequence of 720×480 pixels in resolution, commonly used in video surveillance, the proposed GPU implementation is sufficiently fast for real-time motion estimation at 30 frames-per-second using two NVIDIA C1060 Tesla GPU cards. PMID:22347787
Highly chirped single-bandpass microwave photonic filter with reconfiguration capabilities.
Bolea, Mario; Mora, José; Ortega, Beatriz; Capmany, José
2011-02-28
We propose a novel photonic structure to implement a chirped single-bandpass microwave photonic filter based on the amplitude modulation of a broadband optical signal transmitted through a non-linear dispersive element and an interferometric system prior to balanced photodetection. Full reconfigurability of the filter is achieved since the amplitude and phase responses can be independently controlled. We have experimentally demonstrated chirp values up to tens of ns/GHz, which is, as far as we know, one order of magnitude better than values achieved by electrical approaches, and furthermore without restrictions in terms of frequency tuning, since a frequency operation range up to 40 GHz has been experimentally demonstrated.
Progress in global measles control and mortality reduction, 2000-2007.
2008-12-05
Despite the availability of a safe and effective vaccine since 1963, measles has been a major killer of children in developing countries (causing an estimated 750,000 deaths as recently as 2000), primarily because of underutilization of the vaccine. At the World Health Assembly in 2008, all World Health Organization (WHO) member states reaffirmed their commitment to achieving a 90% reduction in measles mortality by 2010 compared with 2000, a goal that was established in 2005 as part of the Global Immunization Vision and Strategy (2). This WHO-UNICEF comprehensive strategy for measles mortality reduction (1) focuses on 47 priority countries. The strategy's components include 1) achieving and maintaining high coverage (>90%) with the routinely scheduled first dose of measles-containing vaccine (MCV1) among children aged 1 year; 2) ensuring that all children receive a second opportunity for measles immunization (either through a second routine dose or through periodic supplementary immunization activities [SIAs]); 3) implementing effective laboratory-supported disease surveillance; and 4) providing appropriate clinical management for measles cases. This report updates previously published reports and describes immunization and surveillance activities implemented during 2007. Increased routine measles vaccine coverage and SIAs implemented during 2000-2007 resulted in a 74% decrease in the estimated number of measles deaths globally. An estimated 197,000 deaths from measles occurred in 2007; of these, 136,000 (69%) occurred in the WHO South-East Asian Region. Achievement of the 2010 goal will require full implementation of measles mortality reduction strategies, especially in the WHO South-East Asian Region.
Full-color reflective cholesteric liquid crystal display
NASA Astrophysics Data System (ADS)
Huang, Xiao-Yang; Khan, Asad A.; Davis, Donald J.; Podojil, Gregg M.; Jones, Chad M.; Miller, Nick; Doane, J. William
1999-03-01
We report a full-color 1/4 VGA reflective cholesteric display with 4096 colors. The display can deliver a brightness approaching 40 percent reflected luminance, far exceeding all other reflective technologies. With its zero-voltage bistability, images can be stored for days and months without any power consumption. This property can significantly extend battery life. The capability of displaying full-color complex graphics and images is a must in order to establish a market position in this multimedia age. Color is achieved by stacking RGB cells. The top layer is blue with right chirality, the middle layer is green with left chirality, and the bottom layer is red with right chirality. The choice of opposite chirality prevents the loss in the green and red spectra from the blue layer on the top. We also adjusted the thickness of each layer to achieve color balance. We implement gray scale in each layer with pulse width modulation. This modulation method is the best choice in consideration of lower driver cost and a simpler structure with fewer crosstalk problems. Various drive schemes and modulation methods will be discussed in the conference.
NASA Astrophysics Data System (ADS)
Leysath, Maggie
This exploratory phenomenological case study investigated the influence the full integration of the arts into core subject instruction has on classroom environment, student academic achievement, and student engagement as perceived by administrators, teachers, and students in one East Texas secondary school. Participant interviews were analyzed using Creswell's (2012) six-step method for analyzing phenomenological studies. The researcher implemented three learning activities in which ceramics learning objectives were fully integrated with chemistry learning objectives. The first activity combined clay properties and pottery wheel throwing with significant numbers. The second activity combined glaze formulation with moles. The third combined stoichiometry with the increased glaze formula for students to glaze the bowls they made. Findings suggest the full integration of art in core subject area instruction has numerous positive effects. Participants reported improved academic achievement for all students including reluctant learners. Students, teachers, and the administrator reported greater participation in the art integrated activities. Participants perceived a need for further training for teachers and administrators for greater success.
Design and Implementation of a New Real-Time Frequency Sensor Used as Hardware Countermeasure
Jiménez-Naharro, Raúl; Gómez-Galán, Juan Antonio; Sánchez-Raya, Manuel; Gómez-Bravo, Fernando; Pedro-Carrasco, Manuel
2013-01-01
A new digital countermeasure against attacks related to the clock frequency is presented. This countermeasure, known as a frequency sensor, consists of a local oscillator, a transition detector, a measurement element and an output block. The countermeasure has been designed using a full-custom technique implemented in an Application-Specific Integrated Circuit (ASIC), and the implementation has been verified and characterized with an integrated design using a 0.35 μm standard Complementary Metal Oxide Semiconductor (CMOS) technology (a Very Large Scale Integration, VLSI, implementation). The proposed solution is configurable in resolution time and allowed period range, achieving a minimum resolution time of only 1.91 ns and an initialization time of 5.84 ns. The proposed VLSI implementation shows better results than other solutions, such as digital ones based on semi-custom techniques and analog ones based on band-pass filters, all design parameters considered. Finally, a counter has been used to verify the good performance of the countermeasure in preventing the success of an attack. PMID:24008285
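The sensor's measurement principle, counting local-oscillator ticks within one period of the monitored clock and flagging periods outside an allowed range, can be sketched behaviorally. All timing values below are invented for illustration and do not correspond to the ASIC's actual parameters.

```python
# Behavioral sketch of a clock-frequency sensor (hypothetical numbers).
LOCAL_TICK_NS = 2          # local oscillator resolution, in nanoseconds
ALLOWED = (40, 60)         # accepted clock period range, in local ticks

def check_period(period_ns):
    """Measure the external clock period in local ticks; flag if the
    measured value falls outside the configured allowed range."""
    ticks = period_ns // LOCAL_TICK_NS
    ok = ALLOWED[0] <= ticks <= ALLOWED[1]
    return ticks, ok

nominal = check_period(100)      # 100 ns clock -> 50 ticks, in range
overclock = check_period(40)     # 40 ns clock -> 20 ticks, flagged
```

In hardware, the "allowed range" and the local-tick resolution are the two configurable parameters the abstract refers to.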
Doshmangir, Leila; Rashidian, Arash; Ravaghi, Hamid; Takian, Amirhossein; Jafari, Mehdi
2015-01-01
Background: In 2004, the health system in Iran initiated an organizational reform aiming to increase the autonomy of teaching hospitals and make them more decentralized. The policy led to the formation of a board of trustees in each hospital and significant modifications in hospitals’ financing. Since the reform aimed to improve its predecessor policy (implementation of hospital autonomy began in 1995), it was expected to increase user satisfaction, as well as enhance effectiveness and efficiency of healthcare services in targeted hospitals. However, such expectations were never realized. In this research, we explored the perceptions and views of expert stakeholders as to why the board of trustees’ policy did not achieve its perceived objectives. Methods: We conducted 47 semi-structured face-to-face interviews and two focus group discussions (involving 8 and 10 participants, respectively) with experts at high, middle, and low levels of Iran’s health system, using purposive and snowball sampling. We also collected a comprehensive set of relevant documents. Interviews were transcribed verbatim and analyzed thematically, following a mixed inductive-deductive approach. Results: Three main themes emerged from the analysis. The implementation approach (including the processes, views about the policy and the links between the policy components), the use of research evidence about the policy (local and global), and the policy context (health system structure, health insurers’ capacity, hospitals’ organization and capacity, and actors’ interrelationships) affected the policy outcomes. Overall, the implementation of hospital decentralization policies in Iran did not seem to achieve their intended targets, as a result of a failure to take full account of the above factors during policy implementation. Conclusion: The implementation of the board of trustees’ policy did not achieve its desired goals in teaching hospitals in Iran.
Similar decentralization policies in the past and their outcomes were overlooked, while the context was not prepared appropriately and key stakeholders, particularly the government, did not support the decentralization of Iran’s health system. PMID:25844379
Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators
Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew
2014-01-01
Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least faster than the sequential implementation and faster than a parallelized OpenMP implementation. An implementation of OpenMP on Intel MIC coprocessor provided speedups of with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. 
Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950
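A sequential reference kernel of the kind such studies parallelize might look as follows: a minimal 2-D leapfrog finite-difference scheme for the acoustic wave equation on a periodic grid, with illustrative parameters, not the cardiac action potential model of the paper. The doubly spatial update is exactly the loop nest that OpenACC pragmas, OpenCL kernels, or OpenMP directives would offload.

```python
import numpy as np

# Sequential 2-D wave propagation: leapfrog in time, 5-point Laplacian in
# space, periodic boundaries. dt respects the 2-D CFL limit c*dt/dx <= 1/sqrt(2).
n, c, dx, dt = 101, 1.0, 1.0, 0.5
u_prev = np.zeros((n, n))
u = np.zeros((n, n))
u[n // 2, n // 2] = 1.0                    # initial pressure impulse

coef = (c * dt / dx) ** 2
for _ in range(40):
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    u, u_prev = 2 * u - u_prev + coef * lap, u
```

Because each grid point depends only on its immediate neighbors from the previous steps, the update is embarrassingly parallel within a time step, which is why directive-based approaches like OpenACC work well here.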
2013-01-01
implementing several internal monthly controls testing initiatives, and other similar accomplishments geared to achieving a full audit-ready financial report...existence and completeness of assets, internal controls, and other critical functions required to meet audit readiness goals. The Army is on-track...ensure the integrity of their reporting systems, programs, and operations. This section focuses on the Army’s system of internal controls to
Robust media processing on programmable power-constrained systems
NASA Astrophysics Data System (ADS)
McVeigh, Jeff
2005-03-01
To achieve consumer-level quality, media systems must process continuous streams of audio and video data while maintaining exacting tolerances on sampling rate, jitter, synchronization, and latency. While it is relatively straightforward to design fixed-function hardware implementations to satisfy worst-case conditions, there is a growing trend to utilize programmable multi-tasking solutions for media applications. The flexibility of these systems enables support for multiple current and future media formats, which can reduce design costs and time-to-market. This paper provides practical engineering solutions to achieve robust media processing on such systems, with specific attention given to power-constrained platforms. The techniques covered in this article utilize the fundamental concepts of algorithm and software optimization, software/hardware partitioning, stream buffering, hierarchical prioritization, and system resource and power management. A novel enhancement to dynamically adjust processor voltage and frequency based on buffer fullness to reduce system power consumption is examined in detail. The application of these techniques is provided in a case study of a portable video player implementation based on a general-purpose processor running a non real-time operating system that achieves robust playback of synchronized H.264 video and MP3 audio from local storage and streaming over 802.11.
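The buffer-fullness-driven voltage/frequency adjustment can be sketched as a toy simulation. The rates and watermarks below are invented for illustration; the hysteresis band between the two watermarks is what keeps the playback buffer away from both underflow (glitches) and overflow (wasted power).

```python
# Toy DVFS simulation: pick the processor frequency from playback-buffer
# fullness (all rates and watermarks are hypothetical).
DRAIN = 10                      # samples consumed per tick by A/V output
FILL = {"low": 5, "high": 20}   # decode rate at each frequency setting
LOW_WM, HIGH_WM = 30, 80        # buffer watermarks

def simulate(ticks=500, buf=50, freq="high"):
    trace = []
    for _ in range(ticks):
        if buf < LOW_WM:
            freq = "high"       # buffer running dry: speed up, spend power
        elif buf > HIGH_WM:
            freq = "low"        # plenty buffered: slow down, save power
        buf += FILL[freq] - DRAIN
        trace.append((buf, freq))
    return trace

trace = simulate()
```

In a real system the controller would also fold in decode-complexity estimates, but the buffer occupancy alone is already enough to bound latency while cutting average frequency.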
Improved accuracy for finite element structural analysis via an integrated force method
NASA Technical Reports Server (NTRS)
Patnaik, S. N.; Hopkins, D. A.; Aiello, R. A.; Berke, L.
1992-01-01
A comparative study was carried out to determine the accuracy of finite element analyses based on the stiffness method, a mixed method, and the new integrated force and dual integrated force methods. The numerical results were obtained with the following software: MSC/NASTRAN and ASKA for the stiffness method; an MHOST implementation for the mixed method; and GIFT for the integrated force methods. The results indicate that, on an overall basis, the stiffness and mixed methods present some limitations. The stiffness method generally requires a large number of elements in the model to achieve acceptable accuracy. The MHOST method tends to achieve a higher degree of accuracy for coarse models than does the stiffness method implemented by MSC/NASTRAN and ASKA. The two integrated force methods, which place simultaneous emphasis on stress equilibrium and strain compatibility, yield accurate solutions with fewer elements in a model. The full potential of these new integrated force methods remains largely unexploited, and they hold the promise of spawning new finite element structural analysis tools.
Implementation of a multi-threaded framework for large-scale scientific applications
Sexton-Kennedy, E.; Gartung, Patrick; Jones, C. D.; ...
2015-05-22
The CMS experiment has recently completed the development of a multi-threaded capable application framework. In this paper, we discuss the design, implementation and application of this framework to production applications in CMS. For the 2015 LHC run, this functionality is particularly critical for both our online and offline production applications, which depend on faster turn-around times and a reduced memory footprint relative to previous runs. These applications are complex codes, each including a large number of physics-driven algorithms. While the framework is capable of running a mix of thread-safe and 'legacy' modules, algorithms running in our production applications need to be thread-safe for optimal use of this multi-threaded framework at large scale. Towards this end, we discuss the types of changes that were necessary for our algorithms to achieve good performance in a full-scale multi-threaded application. Lastly, performance numbers for what has been achieved for the 2015 run are presented.
Feliciano, Diana; Hunter, Colin; Slee, Bill; Smith, Pete
2013-05-15
The Climate Change (Scotland) Act 2009 commits Scotland to reduce greenhouse gas (GHG) emissions by at least 42% by 2020 and 80% by 2050, from 1990 levels. According to the Climate Change Delivery Plan, the desired emission reduction for the rural land use sector (agriculture and other land uses) is 21% compared to 1990, or 10% compared to 2006 levels. In 2006, in North East Scotland, gross GHG emissions from rural land uses were about 1599 ktCO2e. Thus, to achieve a 10% reduction in 2020 relative to 2006, emissions would have to decrease to about 1440 ktCO2e. This study developed a methodology to help select land-based practices to mitigate GHG emissions at the regional level. The main criterion used was the "full" mitigation potential of each practice. A mix of methods was used to undertake this study, namely a literature review and quantitative estimates. The mitigation practice that offered the greatest "full" mitigation potential (≈66% reduction by 2020 relative to 2006) was woodland planting with Sitka spruce. Several barriers (economic, social, political and institutional) affect the uptake of mitigation practices in the region. Consequently, the achieved mitigation potential of a practice may be lower than its "full" mitigation potential. Surveys and focus groups with relevant stakeholders need to be undertaken to assess the real area where mitigation practices can be implemented and the best way to overcome the barriers to their implementation. Copyright © 2013 Elsevier Ltd. All rights reserved.
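The reduction target quoted above follows directly from the 2006 baseline; a trivial arithmetic check:

```python
# Check the stated 10%-reduction target against the 2006 baseline.
baseline_2006 = 1599                        # ktCO2e, gross rural land-use GHG emissions
target_2020 = baseline_2006 * (1 - 0.10)    # 10% reduction relative to 2006
print(round(target_2020))                   # 1439, i.e. "about 1440 ktCO2e" as stated
```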
Emulating short-term synaptic dynamics with memristive devices
NASA Astrophysics Data System (ADS)
Berdan, Radu; Vasilaki, Eleni; Khiat, Ali; Indiveri, Giacomo; Serb, Alexandru; Prodromakis, Themistoklis
2016-01-01
Neuromorphic architectures offer great promise for achieving computation capacities beyond conventional Von Neumann machines. The essential elements for achieving this vision are highly scalable synaptic mimics that do not undermine biological fidelity. Here we demonstrate that single solid-state TiO2 memristors can exhibit non-associative plasticity phenomena observed in biological synapses, supported by their metastable memory state transition properties. We show that, contrary to conventional uses of solid-state memory, the existence of rate-limiting volatility is a key feature for capturing short-term synaptic dynamics. We also show how the temporal dynamics of our prototypes can be exploited to implement spatio-temporal computation, demonstrating the memristors' full potential for building biophysically realistic neural processing systems.
Progress report on a multi-service family planning mobile unit September, 1981.
1981-12-01
In 1979, the National Family Planning Program's (NFPP) multiservice mobile unit pilot project was implemented to deliver a full complement of clinical and nonclinical family planning services to remote Thai villages by transporting nurses, physicians, and supplies by van. The 15 provinces with the lowest family planning achievement in 1978 were selected to participate in the project for 1 year; one refused. Funding was allocated for mobile unit trips and promotional billboards. Implementation at the time of data analysis averaged 9.8 province-months, sufficient to reveal trends in project achievement. A total of 9579 new acceptors were reported after 805 mobile trips in the 14 provinces, an average of 12 new acceptors/trip. New acceptor recruitment costs were estimated at $6.20/client. Based on Thai data for continuation rates, an estimated 18,238 couple-years of protection (CYP) were achieved by the mobile unit. In comparison to other family planning services' mobile units, the multiservice unit had the lowest operating costs but the most expensive cost/CYP. The effectiveness of the promotional billboards was assessed by comparing acceptor rates in provinces with and without billboards. Overall, the provinces with billboards showed less of an increase in new acceptors; when months of project implementation are controlled for, however, a positive effect of the billboards is suggested. While the project demonstrated that all modern contraceptive methods can be delivered via mobile units to remote villages, acceptance of the highly effective methods was inadequate to justify the cost of transporting staff and equipment.
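The per-trip average and the total recruitment cost reported above follow from the raw counts; a quick arithmetic check:

```python
# Verify the reported per-trip average and estimate total recruitment cost.
new_acceptors, trips = 9579, 805
print(round(new_acceptors / trips, 1))           # 11.9, reported as "12 new acceptors/trip"
cost_per_acceptor = 6.20                         # USD, estimated recruitment cost/client
print(round(new_acceptors * cost_per_acceptor))  # 59390, implied total in USD
```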
How to keep the Grid full and working with ATLAS production and physics jobs
NASA Astrophysics Data System (ADS)
Pacheco Pagés, A.; Barreiro Megino, F. H.; Cameron, D.; Fassi, F.; Filipcic, A.; Di Girolamo, A.; González de la Hoz, S.; Glushkov, I.; Maeno, T.; Walker, R.; Yang, W.; ATLAS Collaboration
2017-10-01
The ATLAS production system provides the infrastructure to process the millions of events collected during LHC Run 1 and the first two years of Run 2 using grid, cloud and high-performance computing resources. In this contribution we address the strategies and improvements that have been implemented in the production system to obtain optimal performance and to achieve the highest efficiency of available resources from an operational perspective. We focus on the recent developments.
Structural Integrity of an Electron Beam Melted Titanium Alloy.
Lancaster, Robert; Davies, Gareth; Illsley, Henry; Jeffs, Spencer; Baxter, Gavin
2016-06-14
Advanced manufacturing encompasses the wide range of processes that consist of "3D printing" of metallic materials. One such method is Electron Beam Melting (EBM), a modern build technology that offers significant potential for lean manufacture and a capability to produce fully dense near-net shaped components. However, the manufacture of intricate geometries will result in variable thermal cycles and thus a transient microstructure throughout, leading to a highly textured structure. As such, successful implementation of these technologies requires a comprehensive assessment of the relationships of the key process variables, geometries, resultant microstructures and mechanical properties. The nature of this process suggests that it is often difficult to produce representative test specimens necessary to achieve a full mechanical property characterisation. Therefore, the use of small scale test techniques may be exploited, specifically the small punch (SP) test. The SP test offers a capability for sampling miniaturised test specimens from various discrete locations in a thin-walled component, allowing a full characterisation across a complex geometry. This paper provides support in working towards development and validation strategies in order for advanced manufactured components to be safely implemented into future gas turbine applications. This has been achieved by applying the SP test to a series of Ti-6Al-4V variants that have been manufactured through a variety of processing routes including EBM and investigating the structural integrity of each material and how this controls the mechanical response.
Overcoming multicollinearity in multiple regression using correlation coefficient
NASA Astrophysics Data System (ADS)
Zainodin, H. J.; Yap, S. J.
2013-09-01
Multicollinearity happens when there are high correlations among independent variables. In this case, it is difficult to distinguish the contributions of these independent variables to the dependent variable, as they may compete to explain much of the same variance. The problem of multicollinearity also violates an assumption of multiple regression: that there is no collinearity among the possible independent variables. Thus, an alternative approach to overcoming the multicollinearity problem, and eventually achieving a well-represented model, is introduced. This approach removes the multicollinearity-source variables on the basis of the correlation coefficient values in the full correlation matrix. Using the full correlation matrix facilitates the use of Excel functions in removing the multicollinearity-source variables. This procedure is found to be easier and time-saving, especially when dealing with a greater number of independent variables in a model and a large number of possible models. In this paper, a detailed insight into the procedure is shown, compared and implemented.
Ffuzz: Towards full system high coverage fuzz testing on binary executables.
Zhang, Bin; Ye, Jiaxi; Bi, Xing; Feng, Chao; Tang, Chaojing
2018-01-01
Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, such as fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas of binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug-finding tool, Ffuzz, on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck in either fuzz testing or symbolic execution. We also propose two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and on 844 memory-corruption-vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently.
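A minimal mutation-fuzzing loop illustrates the baseline that hybrid tools such as Ffuzz extend with selective symbolic execution. The toy target and its one-byte "bug" are invented for illustration; they are not from the paper.

```python
# Sketch of blind mutation fuzzing against a toy target (hypothetical bug).
import random
from typing import Optional

MAGIC = 0xDE  # the "bug": the parser crashes when the first byte is this value

def buggy_parse(data: bytes) -> bool:
    """Toy target that simulates a crash on a one-byte magic value."""
    if data and data[0] == MAGIC:
        raise RuntimeError("simulated crash")
    return True

def mutate(seed: bytes, rng: random.Random) -> bytes:
    """Overwrite 1-4 random bytes of the seed with random values."""
    data = bytearray(seed)
    for _ in range(rng.randint(1, 4)):
        data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

def fuzz(seed: bytes, iterations: int = 50_000) -> Optional[bytes]:
    """Blind mutation fuzzing: generate inputs, run the target, keep crashers.

    This is the random baseline; deeper branches guarded by multi-byte
    comparisons are exactly where selective symbolic execution takes over.
    """
    rng = random.Random(1)  # fixed seed for reproducibility
    for _ in range(iterations):
        candidate = mutate(seed, rng)
        try:
            buggy_parse(candidate)
        except RuntimeError:
            return candidate  # crashing input found
    return None

crash = fuzz(b"\x00" * 8)
print(crash is not None and crash[0] == MAGIC)  # True: crashing input found
```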
NASA Astrophysics Data System (ADS)
Pioldi, Fabio; Rizzi, Egidio
2016-08-01
This paper proposes a new output-only element-level system identification and input estimation technique, towards the simultaneous identification of modal parameters, input excitation time history and structural features at the element-level by adopting earthquake-induced structural response signals. The method, named Full Dynamic Compound Inverse Method (FDCIM), releases strong assumptions of earlier element-level techniques, by working with a two-stage iterative algorithm. Jointly, a Statistical Average technique, a modification process and a parameter projection strategy are adopted at each stage to achieve stronger convergence for the identified estimates. The proposed method works in a deterministic way and is completely developed in State-Space form. Further, it does not require continuous- to discrete-time transformations and does not depend on initialization conditions. Synthetic earthquake-induced response signals from different shear-type buildings are generated to validate the implemented procedure, also with noise-corrupted cases. The achieved results provide a necessary condition to demonstrate the effectiveness of the proposed identification method.
Full image-processing pipeline in field-programmable gate array for a small endoscopic camera
NASA Astrophysics Data System (ADS)
Mostafa, Sheikh Shanawaz; Sousa, L. Natércia; Ferreira, Nuno Fábio; Sousa, Ricardo M.; Santos, Joao; Wäny, Martin; Morgado-Dias, F.
2017-01-01
Endoscopy is an imaging procedure used for diagnosis as well as for some surgical purposes. The camera used for the endoscopy should be small and able to produce a good quality image or video, to reduce discomfort of the patients, and to increase the efficiency of the medical team. To achieve these fundamental goals, a small endoscopy camera with a footprint of 1 mm×1 mm×1.65 mm is used. Due to the physical properties of the sensors and human vision system limitations, different image-processing algorithms, such as noise reduction, demosaicking, and gamma correction, among others, are needed to faithfully reproduce the image or video. A full image-processing pipeline is implemented using a field-programmable gate array (FPGA) to accomplish a high frame rate of 60 fps with minimum processing delay. Along with this, a viewer has also been developed to display and control the image-processing pipeline. The control and data transfer are done by a USB 3.0 end point in the computer. The full developed system achieves real-time processing of the image and fits in a Xilinx Spartan-6LX150 FPGA.
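One stage of such a pipeline, gamma correction, is typically realized in an FPGA as a precomputed lookup table mapped over the pixel stream. A minimal software sketch follows; the 8-bit depth and the gamma value of 2.2 are illustrative assumptions, not the parameters of the described system.

```python
# Sketch of a gamma-correction pipeline stage via a precomputed LUT,
# mirroring the table-lookup structure used in FPGA implementations.

GAMMA = 2.2  # assumed display gamma
LUT = [round(255 * (v / 255) ** (1 / GAMMA)) for v in range(256)]  # one entry per 8-bit code

def gamma_correct(pixels):
    """Apply gamma correction to a sequence of 8-bit pixel values by table lookup."""
    return [LUT[p] for p in pixels]

print(gamma_correct([0, 64, 255]))  # [0, 136, 255]
```

In hardware the same table sits in block RAM and is indexed once per pixel clock, which is why the stage adds essentially no processing delay.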
Bian, Wei; Zhang, Shuyan; Zhang, Yanzhuo; Li, Wenjing; Kan, Ruizhe; Wang, Wenxiao; Zheng, Zhaoming; Li, Jun
2017-02-01
A ratio control strategy was implemented in a continuous moving bed biofilm reactor (MBBR) to investigate the response to different temperatures. The control strategy was designed to maintain a constant ratio between dissolved oxygen (DO) and total ammonia nitrogen (TAN) concentrations. The results revealed that stable nitritation in a biofilm reactor could be achieved via ratio control, which compensated for the negative influence of low temperatures through stronger oxygen-limiting conditions. Even at a temperature as low as 6°C, stable nitritation could be achieved when the controlling ratio did not exceed 0.17. Oxygen-limiting conditions in the biofilm reactor were determined by the DO/TAN concentration ratio, rather than by the DO concentration alone. This ratio control strategy allowed the achievement of stable nitritation without complete wash-out of nitrite-oxidizing bacteria (NOB) from the reactor. The ratio control strategy permitted full nitritation of sidestream wastewater; for mainstream wastewater, however, only partial nitritation was recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.
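The ratio idea can be sketched as deriving the DO setpoint from the measured TAN so that DO/TAN stays at the controlling ratio. The clamping limits below are illustrative assumptions, not values from the study.

```python
# Sketch of DO/TAN ratio control: the aeration (DO) setpoint tracks TAN.

def do_setpoint(tan_mg_l: float, ratio: float = 0.17,
                do_min: float = 0.1, do_max: float = 2.0) -> float:
    """Return the DO setpoint (mg/L) that keeps DO/TAN at `ratio`.

    The result is clamped to assumed actuator limits; 0.17 is the
    maximum controlling ratio reported to sustain nitritation at 6°C.
    """
    return max(do_min, min(do_max, ratio * tan_mg_l))

print(round(do_setpoint(10.0), 2))   # 1.7 mg/L DO for 10 mg/L TAN
print(round(do_setpoint(100.0), 2))  # 2.0 mg/L: clamped at the upper limit
```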
Multispectral histogram normalization contrast enhancement
NASA Technical Reports Server (NTRS)
Soha, J. M.; Schwartz, A. A.
1979-01-01
A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
Frapid: achieving full automation of FRAP for chemical probe validation
Yapp, Clarence; Rogers, Catherine; Savitsky, Pavel; Philpott, Martin; Müller, Susanne
2016-01-01
Fluorescence Recovery After Photobleaching (FRAP) is an established method for validating chemical probes against the chromatin reading bromodomains, but so far requires constant human supervision. Here, we present Frapid, an automated open source code implementation of FRAP that fully handles cell identification through fuzzy logic analysis, drug dispensing with a custom-built fluid handler, image acquisition & analysis, and reporting. We successfully tested Frapid on 3 bromodomains as well as on spindlin1 (SPIN1), a methyl lysine binder, for the first time. PMID:26977352
Challenges associated with the implementation of the nursing process: A systematic review.
Zamanzadeh, Vahid; Valizadeh, Leila; Tabrizi, Faranak Jabbarzadeh; Behshid, Mojghan; Lotfi, Mojghan
2015-01-01
Nursing process is a scientific approach to the provision of quality nursing care. In practice, however, the implementation of this process faces numerous challenges. With knowledge of these challenges, the nursing process can be developed appropriately. Due to the lack of comprehensive information on this subject, the current study was carried out to assess the key challenges associated with the implementation of the nursing process. To retrieve and review related studies in this field, the databases Iran medix, SID, Magiran, PUBMED, Google Scholar, and ProQuest were searched using the main keywords of nursing process and nursing process systematic review. The articles were retrieved in three steps: searching by keywords, review of the proceedings based on inclusion criteria, and final retrieval and assessment of available full texts. Systematic assessment of the articles revealed several challenges in the implementation of the nursing process. An intangible understanding of the concept of the nursing process, differing views of the process, a lack of knowledge and awareness among nurses related to its execution, limited support from managing systems, and problems related to recording the nursing process were the main challenges extracted from the literature. On systematically reviewing the literature, an intangible understanding of the concept of the nursing process was identified as the principal challenge. To minimize it, in addition to preparing facilitators for implementation of the nursing process, addressing differing views of the process and forming teams of experts in nursing education are recommended for internalizing the nursing process among nurses.
Challenges associated with the implementation of the nursing process: A systematic review
Zamanzadeh, Vahid; Valizadeh, Leila; Tabrizi, Faranak Jabbarzadeh; Behshid, Mojghan; Lotfi, Mojghan
2015-01-01
Background: Nursing process is a scientific approach to the provision of quality nursing care. In practice, however, the implementation of this process faces numerous challenges. With knowledge of these challenges, the nursing process can be developed appropriately. Due to the lack of comprehensive information on this subject, the current study was carried out to assess the key challenges associated with the implementation of the nursing process. Materials and Methods: To retrieve and review related studies in this field, the databases Iran medix, SID, Magiran, PUBMED, Google Scholar, and ProQuest were searched using the main keywords of nursing process and nursing process systematic review. The articles were retrieved in three steps: searching by keywords, review of the proceedings based on inclusion criteria, and final retrieval and assessment of available full texts. Results: Systematic assessment of the articles revealed several challenges in the implementation of the nursing process. An intangible understanding of the concept of the nursing process, differing views of the process, a lack of knowledge and awareness among nurses related to its execution, limited support from managing systems, and problems related to recording the nursing process were the main challenges extracted from the literature. Conclusions: On systematically reviewing the literature, an intangible understanding of the concept of the nursing process was identified as the principal challenge. To minimize it, in addition to preparing facilitators for implementation of the nursing process, addressing differing views of the process and forming teams of experts in nursing education are recommended for internalizing the nursing process among nurses. PMID:26257793
ERIC Educational Resources Information Center
Hardinger, Regina Gail
2013-01-01
Many educational administrators in Georgia continue to struggle with low student academic achievement and low high school graduation rates. DuFour's professional learning community (PLC) theory suggests a positive relationship between levels of PLC implementation and academic achievement and between levels of PLC implementation and graduation…
Improved accuracy for finite element structural analysis via a new integrated force method
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Hopkins, Dale A.; Aiello, Robert A.; Berke, Laszlo
1992-01-01
A comparative study was carried out to determine the accuracy of finite element analyses based on the stiffness method, a mixed method, and the new integrated force and dual integrated force methods. The numerical results were obtained with the following software: MSC/NASTRAN and ASKA for the stiffness method; an MHOST implementation for the mixed method; and GIFT for the integrated force methods. The results indicate that, on an overall basis, the stiffness and mixed methods present some limitations. The stiffness method generally requires a large number of elements in the model to achieve acceptable accuracy. The MHOST method tends to achieve a higher degree of accuracy for coarse models than does the stiffness method implemented by MSC/NASTRAN and ASKA. The two integrated force methods, which place simultaneous emphasis on stress equilibrium and strain compatibility, yield accurate solutions with fewer elements in a model. The full potential of these new integrated force methods remains largely unexploited, and they hold the promise of spawning new finite element structural analysis tools.
Implementation of Comprehensive School Reform and Its Impact on Increases in Student Achievement
ERIC Educational Resources Information Center
Zhang, Yu; Fashola, Olatokunbo; Shkolnik, Jamie; Boyle, Andrea
2006-01-01
This study examined the relationship between the implementation of comprehensive school reform (CSR) and changes in reading and math achievement from 1999 until 2003. Survey data about CSR implementation and school-level achievement data were collected for multiple years from a sample of CSR schools and compared with a sample of matched comparison…
Structural Integrity of an Electron Beam Melted Titanium Alloy
Lancaster, Robert; Davies, Gareth; Illsley, Henry; Jeffs, Spencer; Baxter, Gavin
2016-01-01
Advanced manufacturing encompasses the wide range of processes that consist of “3D printing” of metallic materials. One such method is Electron Beam Melting (EBM), a modern build technology that offers significant potential for lean manufacture and a capability to produce fully dense near-net shaped components. However, the manufacture of intricate geometries will result in variable thermal cycles and thus a transient microstructure throughout, leading to a highly textured structure. As such, successful implementation of these technologies requires a comprehensive assessment of the relationships of the key process variables, geometries, resultant microstructures and mechanical properties. The nature of this process suggests that it is often difficult to produce representative test specimens necessary to achieve a full mechanical property characterisation. Therefore, the use of small scale test techniques may be exploited, specifically the small punch (SP) test. The SP test offers a capability for sampling miniaturised test specimens from various discrete locations in a thin-walled component, allowing a full characterisation across a complex geometry. This paper provides support in working towards development and validation strategies in order for advanced manufactured components to be safely implemented into future gas turbine applications. This has been achieved by applying the SP test to a series of Ti-6Al-4V variants that have been manufactured through a variety of processing routes including EBM and investigating the structural integrity of each material and how this controls the mechanical response. PMID:28773590
Savvas, Steven; Toye, Christine; Beattie, Elizabeth; Gibson, Stephen J
2014-12-01
Pain is common in residential aged care facilities (RACFs). In 2005, the Australian Pain Society developed 27 recommendations for good practice in the identification, assessment, and management of pain in these settings. This study aimed to address implementation of the standards and evaluate outcomes. Five facilities in Australia participated in a comprehensive evaluation of RACF pain practice and outcomes. Pre-existing pain management practices were compared with the 27 recommendations, before an evidence-based pain management program was introduced that included training and education for staff and revised in-house pain-management procedures. Post-implementation audits evaluated the program's success. Aged care staff teams also were assessed on their reports of self-efficacy in pain management. The results show that before the implementation program, the RACFs demonstrated full compliance on 6 to 12 standards. By the project's completion, RACFs demonstrated full compliance with 10 to 23 standards and major improvements toward compliance in the remaining standards. After implementation, the staff also reported better understanding of the standards (p < .001) or of facility pain management guidelines (p < .001), increased confidence in therapies for pain management (p < .001), and increased confidence in their training to assess pain (p < .001) and recognize pain in residents with dementia who are nonverbal (p = .003). The results show that improved evidence-based practice in RACFs can be achieved with appropriate training and education. Investing resources in the aged care workforce via this implementation program has shown improvements in staff self-efficacy and practice. Copyright © 2014 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
Stehman, S.V.; Wickham, J.D.; Wade, T.G.; Smith, J.H.
2008-01-01
The database design and diverse application of NLCD 2001 pose significant challenges for accuracy assessment because numerous objectives are of interest, including accuracy of land-cover, percent urban imperviousness, percent tree canopy, land-cover composition, and net change. A multi-support approach is needed because these objectives require spatial units of different sizes for reference data collection and analysis. Determining a sampling design that meets the full suite of desirable objectives for the NLCD 2001 accuracy assessment requires reconciling potentially conflicting design features that arise from targeting the different objectives. Multi-stage cluster sampling provides the general structure to achieve a multi-support assessment, and the flexibility to target different objectives at different stages of the design. We describe the implementation of two-stage cluster sampling for the initial phase of the NLCD 2001 assessment, and identify gaps in existing knowledge where research is needed to allow full implementation of a multi-objective, multi-support assessment. © 2008 American Society for Photogrammetry and Remote Sensing.
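Two-stage cluster sampling of the kind described can be sketched as two successive equal-probability draws: first the primary sampling units (clusters), then secondary units within each selected cluster. The frame and sample sizes below are illustrative assumptions, not the NLCD design parameters.

```python
# Sketch of two-stage cluster sampling with equal-probability draws at each stage.
import random

def two_stage_sample(clusters, n_clusters, n_units, seed=42):
    """Stage 1: draw `n_clusters` primary units; stage 2: draw `n_units`
    secondary units within each selected cluster.

    clusters: mapping from cluster id to its list of secondary units.
    """
    rng = random.Random(seed)
    chosen = rng.sample(sorted(clusters), n_clusters)          # stage 1
    return {c: rng.sample(sorted(clusters[c]), n_units)        # stage 2
            for c in chosen}

# Hypothetical frame: 20 "scenes", each containing 100 candidate pixels.
frame = {f"scene_{i}": [f"pixel_{i}_{j}" for j in range(100)] for i in range(20)}
sample = two_stage_sample(frame, n_clusters=3, n_units=5)
print(len(sample), all(len(v) == 5 for v in sample.values()))  # 3 True
```

A multi-support design would collect reference data at the cluster level for some objectives (e.g., composition) and at the unit level for others (e.g., per-pixel land-cover accuracy), which is the flexibility the abstract points to.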
DOE Office of Scientific and Technical Information (OSTI.GOV)
TEDESCHI AR; CORBETT JE; WILSON RA
2012-01-26
Simulant testing of a full-scale thin-film evaporator system was conducted in 2011 for technology development at the Hanford tank farms. Test results met objectives of water removal rate, effluent quality, and operational evaluation. Dilute tank waste simulant, representing a typical double-shell tank supernatant liquid layer, was concentrated from a 1.1 specific gravity to approximately 1.5 using a 4.6 m² (50 ft²) heated-transfer-area Rototherm® evaporator from Artisan Industries. The condensed evaporator vapor stream was collected and sampled, validating efficient separation of the water. An overall decontamination factor of 1.2E+06 was achieved, demonstrating excellent retention of key radioactive species within the concentrated liquid stream. The evaporator system was supported by modular steam supply, chiller, and control computer systems of the kind that would typically be implemented at the tank farms. Operation of these support systems demonstrated successful integration while identifying areas for efficiency improvement. Overall, the testing effort increased the maturation of this technology to support final deployment design and continued project implementation.
Development of a Two-Wheel Contingency Mode for the MAP Spacecraft
NASA Technical Reports Server (NTRS)
Starin, Scott R.; O'Donnell, James R., Jr.; Bauer, Frank H. (Technical Monitor)
2002-01-01
In the event of a failure of one of MAP's three reaction wheel assemblies (RWAs), it is not possible to achieve three-axis, full-state attitude control using the remaining two wheels. Hence, two of the attitude control algorithms implemented on the MAP spacecraft would no longer be usable in their current forms: Inertial Mode, used for slewing to and holding inertial attitudes, and Observing Mode, which implements the nominal dual-spin science mode. This paper describes the effort to create a complete strategy for using software algorithms to cope with a RWA failure. The discussion of the design process is divided into three main subtopics: performing orbit maneuvers to reach and maintain an orbit about the second Earth-Sun libration point in the event of a RWA failure, completing the mission using a momentum-bias two-wheel science mode, and developing a new thruster-based mode for adjusting the inertially fixed momentum bias. In this summary, the philosophies used in designing these changes are shown; the full paper will supplement them with algorithm descriptions and testing results.
NASA Astrophysics Data System (ADS)
Rais, Muhammad H.
2010-06-01
This paper presents a Field Programmable Gate Array (FPGA) implementation of standard and truncated multipliers using the Very High Speed Integrated Circuit Hardware Description Language (VHDL). The truncated multiplier is a good candidate for digital signal processing (DSP) applications such as finite impulse response (FIR) filtering and the discrete cosine transform (DCT). A remarkable reduction in FPGA resources, delay, and power can be achieved by using truncated multipliers instead of standard parallel multipliers when the full precision of the standard multiplier is not required. The truncated multipliers show significant improvement compared to standard multipliers. Results show that the anomalies in average connection and maximum pin delay observed in the Spartan-3AN device are efficiently reduced in the Virtex-4 device.
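The principle behind a truncated multiplier (forming the usual partial-product array but discarding the bits below a truncation line) can be illustrated in software. The column choice below is an assumption, and real hardware designs add an error-compensation constant that this sketch omits.

```python
# Sketch of an unsigned truncated multiplier: partial-product bits in the
# low-order columns are simply never accumulated.

def truncated_multiply(a: int, b: int, bits: int = 8, keep: int = 8) -> int:
    """Multiply two `bits`-wide unsigned values, keeping only the partial-
    product bits in the `keep` most significant columns of the 2*bits result.
    """
    total_cols = 2 * bits
    cut = total_cols - keep          # columns below `cut` are discarded
    acc = 0
    for i in range(bits):            # rows: one partial product per bit of b
        if (b >> i) & 1:
            for j in range(bits):    # columns: bits of a
                if (a >> j) & 1 and (i + j) >= cut:
                    acc += 1 << (i + j)
    return acc

a, b = 200, 99
exact = a * b
approx = truncated_multiply(a, b)
print(exact, approx, exact - approx)  # 19800 19456 344
```

The saved adders for the discarded columns are where the resource, delay, and power reductions reported above come from; the price is the small truncation error shown in the last printed value.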
An inter-professional approach to personalized medicine education: one institution's experience.
Formea, Christine M; Nicholson, Wayne T; Vitek, Carolyn Rohrer
2015-03-01
Personalized medicine offers the promise of better diagnoses, targeted therapies and individualized treatment plans. Pharmacogenomics is an integral component of personalized medicine; it aids in the prediction of an individual's response to medications. Despite growing public acceptance and emerging clinical evidence, this rapidly expanding field of medicine is slow to be adopted and utilized by healthcare providers, although many believe that they should be knowledgeable and able to apply pharmacogenomics in clinical practice. Institutional infrastructure must be built to support pharmacogenomic implementation. Multidisciplinary education for healthcare providers is a critical component for pharmacogenomics to achieve its full potential to optimize patient care. We describe our recent experience at the Mayo Clinic implementing pharmacogenomics education in a large, academic healthcare system facilitated by the Mayo Clinic Center for Individualized Medicine.
NASA Astrophysics Data System (ADS)
Castro, Víctor M.; Muñoz, Nestor A.; Salazar, Antonio J.
2015-01-01
Auscultation is one of the most utilized physical examination procedures for listening to lung, heart and intestinal sounds during routine consults and emergencies. Heart and lung sounds overlap in the thorax. An algorithm was used to separate them based on the discrete wavelet transform with multi-resolution analysis, which decomposes the signal into approximations and details. The algorithm was implemented in software and in hardware to achieve real-time signal separation. The heart signal was found in detail eight and the lung signal in approximation six. The hardware was used to separate the signals with a delay of 256 ms. Sending wavelet decomposition data - instead of the separated full signal - allows telemedicine applications to function in real time over low-bandwidth communication channels.
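The core operation here, multi-resolution analysis, splits a signal at each level into an approximation (low-pass) and a detail (high-pass). As a minimal illustration only (using the Haar wavelet, not necessarily the wavelet the authors chose), one analysis/synthesis level can be written as:

```python
SQRT2 = 2 ** 0.5

def haar_analysis(x):
    """One DWT level: pairwise averages (approximation) and
    differences (detail), orthonormally scaled."""
    a = [(x[2*i] + x[2*i + 1]) / SQRT2 for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i + 1]) / SQRT2 for i in range(len(x) // 2)]
    return a, d

def haar_synthesis(a, d):
    """Inverse of haar_analysis: perfect reconstruction."""
    x = []
    for ai, di in zip(a, d):
        x += [(ai + di) / SQRT2, (ai - di) / SQRT2]
    return x

signal = [3.0, 1.0, 0.0, 4.0, 2.0, 2.0, 5.0, 1.0]
a, d = haar_analysis(signal)
assert all(abs(u - v) < 1e-12 for u, v in zip(haar_synthesis(a, d), signal))
```

Repeating the analysis on each successive approximation yields levels 2, 3, and so on; in the scheme reported above, the heart sound emerges in the level-8 detail and the lung sound in the level-6 approximation.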
Steketee, Majone; Oesterle, Sabrina; Jonkman, Harrie; Hawkins, J. David; Haggerty, Kevin P.; Aussems, Claire
2013-01-01
Josine Junger-Tas introduced the Communities That Care (CTC) prevention system to the Netherlands as a promising approach to address the growing youth violence and delinquency. Using data from a randomized trial of CTC in the United States and a quasi-experimental study of CTC in the Netherlands, this article describes the results of a comparison of the implementation of CTC in 12 U.S. communities and 5 Dutch neighborhoods. CTC communities in both countries achieved higher stages of a science-based approach to prevention than control communities, but full implementation of CTC in the Netherlands was hampered by the very small list of prevention programs tested and found effective in the Dutch context. PMID:24465089
Gerard, Baudouin; Duvall, Jeremy R.; Lowe, Jason T.; Murillo, Tiffanie; Wei, Jingqiang; Akella, Lakshmi B.; Marcaurelle, Lisa A.
2011-01-01
We have implemented an aldol-based ‘build/couple/pair’ (B/C/P) strategy for the synthesis of stereochemically diverse 8-membered lactam and sultam scaffolds via SNAr cycloetherification. Each scaffold contains two handles, an amine and aryl bromide, for solid-phase diversification via N-capping and Pd-mediated cross coupling. A sparse matrix design strategy that achieves the dual objective of controlling physicochemical properties and selecting diverse library members was implemented. The production of two 8000-membered libraries is discussed including a full analysis of library purity and property distribution. Library diversity was evaluated in comparison to the Molecular Library Small Molecule Repository (MLSMR) through the use of a multi-fusion similarity (MFS) map and principal component analysis (PCA). PMID:21526820
Design and Fabrication of Full Wheatstone-Bridge-Based Angular GMR Sensors.
Yan, Shaohua; Cao, Zhiqiang; Guo, Zongxia; Zheng, Zhenyi; Cao, Anni; Qi, Yue; Leng, Qunwen; Zhao, Weisheng
2018-06-05
Since the discovery of the giant magnetoresistive (GMR) effect, GMR sensors have gained much attention in recent decades due to their high sensitivity, small size, and low cost. The full Wheatstone-bridge-based GMR sensor is the most useful from an application point of view. However, its manufacturing process is usually complex. In this paper, we present an efficient and concise approach to fabricate a full Wheatstone-bridge-based angular GMR sensor by depositing one GMR film stack, utilizing simple patterning processes, and applying a concise post-annealing procedure based on a special layout. The angular GMR sensor shows good linear performance and achieves a sensitivity of 0.112 mV/V/Oe at an annealing temperature of 260 °C in the magnetic field range from -50 to +50 Oe. This work provides a design and method for GMR-sensor manufacturing that is easy to implement and suitable for mass production.
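For intuition, the attraction of the full bridge is its ratiometric, linear output: with opposing pairs of GMR elements changing resistance by +ΔR and -ΔR, the differential output reduces to V_bias·ΔR/R, which is why sensitivity is quoted in mV/V/Oe. A hypothetical numeric sketch (not taken from the paper):

```python
def bridge_output(v_bias, r, dr):
    """Full Wheatstone bridge with opposing arms at r+dr and r-dr.
    Each half-bridge tap divides v_bias by (r +/- dr) / 2r, so the
    differential output simplifies to v_bias * dr / r."""
    hi, lo = r + dr, r - dr
    return v_bias * (hi / (hi + lo) - lo / (hi + lo))

# 1 V bias and a 1 % resistance swing give about 10 mV of output,
# linear in dr and independent of the absolute resistance value.
print(bridge_output(1.0, 100.0, 1.0))  # ≈ 0.01 V
```

The ratiometric form also cancels common-mode drift (e.g. temperature) shared by all four arms, which is the practical reason the full-bridge topology is preferred despite its more complex fabrication.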
Technical Assessment of the National Full Scale Aerodynamic Complex Fan Blades Repair
NASA Technical Reports Server (NTRS)
Young, Clarence P., Jr.; Dixon, Peter G.; St.Clair, Terry L.; Johns, William E.
1998-01-01
This report describes the principal activities of a technical review team formed to address National Full Scale Aerodynamic Complex (NFAC) blade repair problems. In particular, the problem of lack of good adhesive bonding of the composite overwrap to the Hyduliginum wood blade material was studied extensively. Description of action plans and technical elements of the plans are provided. Results of experiments designed to optimize the bonding process and bonding strengths obtained on a full scale blade using a two-step cure process with adhesive primers are presented. Consensus recommendations developed by the review team in conjunction with the NASA Ames Fan Blade Repair Project Team are provided along with lessons learned on this program. Implementation of recommendations resulted in achieving good adhesive bonds between the composite materials and wooden blades, thereby providing assurance that the repaired fan blades will meet or exceed operational life requirements.
Area/latency optimized early output asynchronous full adders and relative-timed ripple carry adders.
Balasubramanian, P; Yamashita, S
2016-01-01
This article presents two area/latency optimized gate level asynchronous full adder designs which correspond to early output logic. The proposed full adders are constructed using the delay-insensitive dual-rail code and adhere to the four-phase return-to-zero handshaking. For an asynchronous ripple carry adder (RCA) constructed using the proposed early output full adders, the relative-timing assumption becomes necessary and the inherent advantages of the relative-timed RCA are: (1) computation with valid inputs, i.e., forward latency is data-dependent, and (2) computation with spacer inputs involves a bare minimum constant reverse latency of just one full adder delay, thus resulting in the optimal cycle time. With respect to different 32-bit RCA implementations, and in comparison with the optimized strong-indication, weak-indication, and early output full adder designs, one of the proposed early output full adders achieves respective reductions in latency by 67.8, 12.3 and 6.1 %, while the other proposed early output full adder achieves corresponding reductions in area by 32.6, 24.6 and 6.9 %, with practically no power penalty. Further, the proposed early output full adders based asynchronous RCAs enable minimum reductions in cycle time by 83.4, 15, and 8.8 % when considering carry-propagation over the entire RCA width of 32-bits, and maximum reductions in cycle time by 97.5, 27.4, and 22.4 % for the consideration of a typical carry chain length of 4 full adder stages, when compared to the least of the cycle time estimates of various strong-indication, weak-indication, and early output asynchronous RCAs of similar size. All the asynchronous full adders and RCAs were realized using standard cells in a semi-custom design fashion based on a 32/28 nm CMOS process technology.
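To make the dual-rail and early-output vocabulary concrete, the behavioral model below is an illustrative sketch, not the authors' gate-level circuits. A bit is encoded on two rails — valid data is (1,0) or (0,1), the spacer (return-to-zero phase) is (0,0) — and "early output" means the carry-out is produced as soon as the two operand bits agree, without waiting for the carry-in, which is what shortens the carry chain in the relative-timed RCA.

```python
NULL = (0, 0)                      # spacer (return-to-zero phase)

def dr(bit):                       # dual-rail encode: (true, false) rails
    return (1, 0) if bit else (0, 1)

def valid(x):
    return x != NULL

def full_adder(a, b, cin):
    """Behavioral model of an early-output dual-rail full adder."""
    # Early carry: when a == b the carry-out is decided without cin
    if valid(a) and valid(b) and a == b:
        cout = a                   # generate (1,1) or kill (0,0) case
    elif valid(a) and valid(b) and valid(cin):
        cout = cin                 # propagate case needs cin
    else:
        cout = NULL
    # The sum bit always needs all three inputs
    if valid(a) and valid(b) and valid(cin):
        s = dr((a[0] ^ b[0]) ^ cin[0])
    else:
        s = NULL
    return s, cout

# Carry-out resolves early even though cin is still a spacer:
assert full_adder(dr(1), dr(1), NULL)[1] == dr(1)
assert full_adder(dr(0), dr(0), NULL)[1] == dr(0)
```

In an RCA built from such stages, only positions where the carry must propagate wait on their neighbor, so the forward latency becomes data-dependent as described above.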
Ffuzz: Towards full system high coverage fuzz testing on binary executables
2018-01-01
Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, like fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas of binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug finding tool—Ffuzz—on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck in either fuzz testing or symbolic execution. We also proposed two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently. PMID:29791469
Autonomous smart sensor network for full-scale structural health monitoring
NASA Astrophysics Data System (ADS)
Rice, Jennifer A.; Mechitov, Kirill A.; Spencer, B. F., Jr.; Agha, Gul A.
2010-04-01
The demands of aging infrastructure require effective methods for structural monitoring and maintenance. Wireless smart sensor networks offer the ability to enhance structural health monitoring (SHM) practices through the utilization of onboard computation to achieve distributed data management. Such an approach is scalable to the large number of sensor nodes required for high-fidelity modal analysis and damage detection. While smart sensor technology is not new, the number of full-scale SHM applications has been limited. This slow progress is due, in part, to the complex network management issues that arise when moving from a laboratory setting to a full-scale monitoring implementation. This paper presents flexible network management software that enables continuous and autonomous operation of wireless smart sensor networks for full-scale SHM applications. The software components combine sleep/wake cycling for enhanced power management with threshold detection for triggering network wide tasks, such as synchronized sensing or decentralized modal analysis, during periods of critical structural response.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindley, Benjamin A.; Parks, Geoffrey T.; Franceschini, Fausto
Multiple recycle of long-lived actinides has the potential to greatly reduce the required storage time for spent nuclear fuel or high level nuclear waste. This is generally thought to require fast reactors as most transuranic (TRU) isotopes have low fission probabilities in thermal reactors. Reduced-moderation LWRs are a potential alternative to fast reactors with reduced time to deployment as they are based on commercially mature LWR technology. Thorium (Th) fuel is neutronically advantageous for TRU multiple recycle in LWRs due to a large improvement in the void coefficient. If Th fuel is used in reduced-moderation LWRs, it appears neutronically feasible to achieve full actinide recycle while burning an external supply of TRU, with related potential improvements in waste management and fuel utilization. In this paper, the fuel cycle of TRU-bearing Th fuel is analysed for reduced-moderation PWRs and BWRs (RMPWRs and RBWRs). RMPWRs have the advantage of relatively rapid implementation and intrinsically low conversion ratios. However, it is challenging to simultaneously satisfy operational and fuel cycle constraints. An RBWR may potentially take longer to implement than an RMPWR due to more extensive changes from current BWR technology. However, the harder neutron spectrum can lead to favourable fuel cycle performance. A two-stage fuel cycle, where the first pass is Th-Pu MOX, is a technically reasonable implementation of either concept. The first stage of the fuel cycle can therefore be implemented at relatively low cost as a Pu disposal option, with a further policy option of full recycle in the medium term. (authors)
Efficient robust doubly adaptive regularized regression with applications.
Karunamuni, Rohana J; Kong, Linglong; Tu, Wei
2018-01-01
We consider the problem of estimation and variable selection for general linear regression models. Regularized regression procedures have been widely used for variable selection, but most existing methods perform poorly in the presence of outliers. We construct a new penalized procedure that simultaneously attains full efficiency and maximum robustness. Furthermore, the proposed procedure satisfies the oracle properties. The new procedure is designed to achieve sparse and robust solutions by imposing adaptive weights on both the decision loss and the penalty function. The proposed method of estimation and variable selection attains full efficiency when the model is correct and, at the same time, achieves maximum robustness when outliers are present. We examine the robustness properties using the finite-sample breakdown point and an influence function. We show that the proposed estimator attains the maximum breakdown point. Furthermore, there is no loss in efficiency when there are no outliers or the error distribution is normal. For practical implementation of the proposed method, we present a computational algorithm. We examine the finite-sample and robustness properties using Monte Carlo studies. Two datasets are also analyzed.
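The key ingredient above, adaptive weights imposed on the penalty, can be illustrated with the weighted soft-thresholding step used inside coordinate-descent solvers: a larger weight shrinks its coefficient more aggressively toward exactly zero. This is a generic sketch of the mechanism under simplifying assumptions, not the authors' doubly adaptive estimator.

```python
def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def adaptive_update(z, lam, w):
    """Coordinate update with an adaptive penalty weight w: coordinates
    carrying little signal (large w) are driven to exactly zero, while
    strong signals (small w) are barely shrunk."""
    return soft_threshold(z, lam * w)

print(adaptive_update(2.5, 1.0, 0.2))  # strong signal, light penalty → 2.3
print(adaptive_update(0.8, 1.0, 2.0))  # weak signal, heavy penalty  → 0.0
```

In the doubly adaptive setting, a second set of weights downweights the loss contribution of outlying observations, which is how full efficiency under the correct model and robustness under contamination are reconciled.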
A phase-based stereo vision system-on-a-chip.
Díaz, Javier; Ros, Eduardo; Sabatini, Silvio P; Solari, Fabio; Mota, Sonia
2007-02-01
A simple and fast technique for depth estimation based on phase measurement has been adopted for the implementation of a real-time stereo system with sub-pixel resolution on an FPGA device. The technique avoids the attendant problem of phase warping. The designed system takes full advantage of the inherent processing parallelism and segmentation capabilities of FPGA devices to achieve a computation speed of 65 megapixels/s, which can be arranged with a customized frame-grabber module to process 211 frames/s at a size of 640×480 pixels. The processing speed achieved is higher than conventional camera frame rates, thus allowing the system to extract multiple estimations and be used as a platform to evaluate integration schemes of a population of neurons without increasing hardware resource demands.
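The underlying estimate is simple: for a quadrature filter pair tuned to spatial frequency f0, the disparity between left and right image patches follows from the interocular phase difference, d ≈ Δφ/(2π·f0), which yields sub-pixel values directly. The scalar sketch below is illustrative (names and numbers are not from the paper); wrapping the difference into (-π, π] is the step that sidesteps phase-warping ambiguity.

```python
import math

def phase_disparity(phi_left, phi_right, f0):
    """Disparity (in pixels) from the phase difference of a quadrature
    filter pair tuned to spatial frequency f0 (cycles/pixel)."""
    dphi = phi_left - phi_right
    dphi = math.atan2(math.sin(dphi), math.cos(dphi))  # wrap to (-pi, pi]
    return dphi / (2 * math.pi * f0)

# A pi/4 phase shift at 0.125 cycles/pixel corresponds to one pixel:
print(phase_disparity(math.pi / 4, 0.0, 0.125))  # ≈ 1.0 pixel
```

Because the arithmetic is a handful of multiplies and an arctangent per pixel, the computation maps naturally onto a fully pipelined FPGA datapath.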
Parent involvement and science achievement: A latent growth curve analysis
NASA Astrophysics Data System (ADS)
Johnson, Ursula Yvette
This study examined science achievement growth across elementary and middle school and parent school involvement using the Early Childhood Longitudinal Study - Kindergarten Class of 1998--1999 (ECLS-K). The ECLS-K is a nationally representative kindergarten cohort of students from public and private schools who attended full-day or half-day kindergarten class in 1998--1999. The present study's sample (N = 8,070) was based on students that had a sampling weight available from the public-use data file. Students were assessed in science achievement at third, fifth, and eighth grades and parents of the students were surveyed at the same time points. Analyses using latent growth curve modeling with time invariant and varying covariates in an SEM framework revealed a positive relationship between science achievement and parent involvement at eighth grade. Furthermore, there were gender and racial/ethnic differences in parents' school involvement as a predictor of science achievement. Findings indicated that students with lower initial science achievement scores had a faster rate of growth across time. The achievement gap between low and high achievers in earth, space and life sciences lessened from elementary to middle school. Parents' involvement with school usually tapers off after elementary school, but due to parent school involvement being a significant predictor of eighth grade science achievement, later school involvement may need to be supported and better implemented in secondary schooling.
Diky, Vladimir; Chirico, Robert D; Kazakov, Andrei F; Muzny, Chris D; Magee, Joseph W; Abdulagatov, Ilmutdin; Kang, Jeong Won; Kroenlein, Kenneth; Frenkel, Michael
2011-01-24
ThermoData Engine (TDE) is the first full-scale software implementation of the dynamic data evaluation concept, as reported recently in this journal. In the present paper, we describe development of an algorithmic approach to assist experiment planning through assessment of the existing body of knowledge, including availability of experimental thermophysical property data, variable ranges studied, associated uncertainties, state of prediction methods, and parameters for deployment of prediction methods and how these parameters can be obtained using targeted measurements, etc., and, indeed, how the intended measurement may address the underlying scientific or engineering problem under consideration. A second new feature described here is the application of the software capabilities for aid in the design of chemical products through identification of chemical systems possessing desired values of thermophysical properties within defined ranges of tolerance. The algorithms and their software implementation to achieve this are described. Finally, implementation of a new data validation and weighting system is described for vapor-liquid equilibrium (VLE) data, and directions for future enhancements are outlined.
1983-01-01
The resolution of the computational grid is thereby defined according to the actual requirements of the problem. Accuracy and computational economy are achieved simultaneously by redistributing the computational Eulerian grid points according to the physical requirements of the nonlinear problem, implemented using a two-dimensional time-dependent finite-difference scheme.
Study of radar pulse compression for high resolution satellite altimetry
NASA Technical Reports Server (NTRS)
Dooley, R. P.; Nathanson, F. E.; Brooks, L. W.
1974-01-01
Pulse compression techniques are studied which are applicable to a satellite altimeter having a topographic resolution of ±10 cm. A systematic design procedure is used to determine the system parameters. The performance of an optimum, maximum-likelihood processor is analysed, which provides the basis for modifying the standard split-gate tracker to achieve improved performance. Bandwidth considerations lead to the recommendation of a full-deramp STRETCH pulse compression technique followed by an analog filter bank to separate range returns. The implementation of the recommended technique is examined.
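The principle of full-deramp (STRETCH) processing can be sketched numerically: mixing the received linear-FM echo with a conjugate reference chirp collapses each delayed return into a constant-frequency tone at f = k·τ (chirp rate k, delay τ), so a filter bank — here a plain DFT — separates range returns. All parameter values below are illustrative, not the altimeter's.

```python
import cmath, math

k = 1.0e3    # chirp rate (Hz/s), illustrative
fs = 1000.0  # sample rate (Hz)
n = 200      # number of samples
tau = 0.05   # echo delay (s)

def chirp(t, delay=0.0):
    """Unit-amplitude linear-FM pulse with chirp rate k."""
    return cmath.exp(1j * math.pi * k * (t - delay) ** 2)

# Deramp: reference chirp times the conjugate of the delayed echo.
# The quadratic phase cancels, leaving a tone at f = k * tau = 50 Hz.
beat = [chirp(i / fs) * chirp(i / fs, tau).conjugate() for i in range(n)]

# A DFT plays the role of the analog filter bank separating ranges.
mags = [abs(sum(beat[m] * cmath.exp(-2j * math.pi * b * m / n)
               for m in range(n)))
        for b in range(n // 2)]
peak_bin = mags.index(max(mags))
print(peak_bin * fs / n)  # → 50.0
```

The attraction for altimetry is that the post-mix bandwidth is set by the spread of delays, not by the full chirp bandwidth, so fine range resolution is achieved with modest receiver electronics.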
Parallel algorithms for large-scale biological sequence alignment on Xeon-Phi based clusters.
Lan, Haidong; Chan, Yuandong; Xu, Kai; Schmidt, Bertil; Peng, Shaoliang; Liu, Weiguo
2016-07-19
Computing alignments between two or more sequences is a common operation frequently performed in computational molecular biology. The continuing growth of biological sequence databases establishes the need for their efficient parallel implementation on modern accelerators. This paper presents new approaches to high performance biological sequence database scanning with the Smith-Waterman algorithm and the first stage of progressive multiple sequence alignment based on the ClustalW heuristic on a Xeon Phi-based compute cluster. Our approach uses a three-level parallelization scheme to take full advantage of the compute power available on this type of architecture; i.e. cluster-level data parallelism, thread-level coarse-grained parallelism, and vector-level fine-grained parallelism. Furthermore, we re-organize the sequence datasets and use Xeon Phi shuffle operations to improve I/O efficiency. Evaluations show that our method achieves a peak overall performance up to 220 GCUPS for scanning real protein sequence databanks on a single node consisting of two Intel E5-2620 CPUs and two Intel Xeon Phi 7110P cards. It also exhibits good scalability in terms of sequence length and size, and number of compute nodes for both database scanning and multiple sequence alignment. Furthermore, the achieved performance is highly competitive in comparison to optimized Xeon Phi and GPU implementations. Our implementation is available at https://github.com/turbo0628/LSDBS-mpi .
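The kernel being parallelized is the classic Smith-Waterman recurrence. A scalar reference version (linear gap penalty, no vectorization — the accelerated implementations compute the same cell update) looks like this; the scoring parameters are illustrative:

```python
def smith_waterman(s1, s2, match=2, mismatch=-1, gap=-1):
    """Scalar Smith-Waterman local alignment score (linear gap model).
    H[i][j] holds the best local alignment score ending at
    s1[i-1], s2[j-1]; negative running scores are clamped to 0."""
    rows, cols = len(s1) + 1, len(s2) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            sub = match if s1[i - 1] == s2[j - 1] else mismatch
            H[i][j] = max(0,
                          H[i - 1][j - 1] + sub,  # diagonal: (mis)match
                          H[i - 1][j] + gap,      # gap in s2
                          H[i][j - 1] + gap)      # gap in s1
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACACACTA", "AGCACACA"))  # → 12
```

GCUPS ("giga cell updates per second") counts exactly these `H[i][j]` updates; the cluster-, thread-, and vector-level parallelism described above all partition this dynamic-programming matrix.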
Population health outcome models in suicide prevention policy.
Lynch, Frances L
2014-09-01
Suicide is a leading cause of death in the U.S. and results in immense suffering and significant cost. Effective suicide prevention interventions could reduce this burden, but policy makers need estimates of health outcomes achieved by alternative interventions to focus implementation efforts. To illustrate the utility of health outcome models to help in achieving goals defined by the National Action Alliance for Suicide Prevention's Research Prioritization Task Force. The approach is illustrated specifically with psychotherapeutic interventions to prevent suicide reattempt in emergency department settings. A health outcome model using decision analysis with secondary data was applied to estimate suicide attempts and deaths averted from evidence-based interventions. Under optimal conditions, the model estimated that over 1 year, implementing evidence-based psychotherapeutic interventions in emergency departments could decrease the number of suicide attempts by 18,737, and if offered over 5 years, it could avert 109,306 attempts. Over 1 year, the model estimated 2,498 fewer deaths from suicide, and over 5 years, about 13,928 fewer suicide deaths. Health outcome models could aid in suicide prevention policy by helping focus implementation efforts. Further research developing more sophisticated models of the impact of suicide prevention interventions that include a more complex understanding of suicidal behavior, longer time frames, and inclusion of additional outcomes that capture the full benefits and costs of interventions would be helpful next steps. Copyright © 2014 American Journal of Preventive Medicine. All rights reserved.
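The arithmetic at the core of such a decision-analytic model is transparent: events averted scale with the population reached, the uptake rate, and the intervention's relative risk reduction. The sketch below uses placeholder inputs, not the study's data, to show the structure only.

```python
def attempts_averted(population, baseline_rate, uptake, risk_reduction):
    """Expected-value core of a decision-analytic outcome model:
    events averted = population * baseline risk * uptake * relative
    risk reduction attributable to the intervention."""
    return population * baseline_rate * uptake * risk_reduction

# Hypothetical inputs: 500,000 ED visits after self-harm per year,
# a 16% reattempt rate, 60% receiving therapy, 20% relative reduction.
print(attempts_averted(500_000, 0.16, 0.60, 0.20))  # ≈ 9600 averted/year
```

Real models layer on time horizons, case-fatality rates (to convert attempts into deaths), and sensitivity analyses over each uncertain input, which is where the more sophisticated models called for above come in.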
Stacpoole, Min; Hockley, Jo; Thompsell, Amanda; Simard, Joyce; Volicer, Ladislav
2017-10-01
Increasing numbers of older people with advanced dementia are cared for in care homes. No cure is available, so research focused on improving quality of life and quality of care for people with dementia is needed to support them to live and die well. The Namaste Care programme is a multi-dimensional care program with sensory, psycho-social and spiritual components intended to enhance quality of life and quality of care for people with advanced dementia. The aim of the study was to establish whether the Namaste Care program can be implemented in UK care homes; and what effect Namaste Care has on the quality of life of residents with advanced dementia, their families and staff. This article explores the qualitative findings of the study, reporting the effect of the programme on the families of people with advanced dementia and care home staff, and presenting their perceptions of change in care. An organisational action research methodology was used. Focus groups and interviews were undertaken pre/post implementation of the Namaste Care program. The researcher kept a reflective diary recording data on the process of change. A comments book was available to staff and relatives in each care home. Data was analysed thematically within each care home and then across all care homes. Six care homes were recruited in south London: one withdrew before the study was underway. Of the five remaining care homes, four achieved a full Namaste Care program. One care home did not achieve the full program during the study, and another discontinued Namaste Care when the study ended. Every home experienced management disruption during the study. Namaste Care challenged normal routinised care for older people with advanced dementia. The characteristics of care uncovered before Namaste was implemented were: chaos and confusion, rushing around, lack of trust, and rewarding care. 
After the programme was implemented these perceptions were transformed, and themes of calmness, reaching out to each other, seeing the person, and enhanced well-being emerged. Namaste Care can enrich the quality of life of older people with advanced dementia in care homes. The program was welcomed by care home staff and families, and was achieved with only modest expenditure and no change in staffing levels. The positive impact on residents' quality of life influenced the well-being of family carers. Care staff found the changes in care enjoyable and rewarding. Namaste Care was valued for the benefits seen in residents; the improvement in relationships; and the shift towards a person-centred, relationship-based culture of care brought about by introducing the program. Namaste Care deserves further exploration and investigation, including a randomised controlled trial.
Evaluation of the implementation of the Montreal At Home/Chez Soi project.
Fleury, Marie-Josée; Grenier, Guy; Vallée, Catherine
2014-11-28
Homelessness and mental disorders constitute a major problem in Canada. The purpose of the At Home/Chez Soi pilot project was to house and provide supports to marginalised groups. Policymakers are in a better position to nurture new, complex interventions if they know which key factors hinder or enable their implementation. This paper evaluates the implementation process for the Montreal site of this project. We collected data from 62 individuals, through individual interviews, focus groups, questionnaires, observations and documentation. The implementation process was analysed using a conceptual framework with five constructs: Intervention Characteristics (IC), Context of Implementation (CI), Implementation Process (IP), Organizational Characteristics (OC) and Strategies of Implementation (SI). The most serious obstacle to the project came from the CI construct, i.e., lack of support from provincial authorities and key local resources in the homelessness field. The second was within the OC construct. The chief hindrances were numerous structures, divergent values among stakeholders, frequent turnover of personnel and team leaders; lacking staff supervision and miscommunication. The third is related to IC: the complex, unyielding nature of the project undermined its chances of success. The greatest challenges from IP were the pressure to perform, along with stress caused by planning, deadlines and tension between teams. Conversely, SI construct conditions (e.g., effective governing structures, comprehensive training initiatives and toolkits) were generally very positive even with problems in power sharing and local leadership. For the four other constructs, the following proved useful: evidence of the project's scope and quality, great needs of services consolidation, generous financing and status as a research pilot project, enthusiasm and commitment toward the project, substantially improved services, and overall user satisfaction. 
This study demonstrated the difficulty of implementing a complex project in the healthcare system. While the project faced many barriers, minimal conditions were also achieved. At the end of the study period, major tensions between organizations and teams were significantly reduced, supporting its full implementation. However, in late 2013, the project was unsustainable, calling into question the relevance of achieving a significant number of positive conditions in each area of the framework.
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Adams, David L.; Trinidad, P. Paul
1997-01-01
NASA Langley Technical Library has been involved in developing systems for full-text information delivery of NACA/NASA technical reports since 1991. This paper describes the two prototypes it has developed and the present production system configuration. The prototype systems are a NACA CD-ROM of thirty-three classic paper NACA reports and a network-based Full-text Electronic Reports Documents System (FEDS) constructed from both paper and electronic formats of NACA and NASA reports. The production system is the DigiDoc System (DIGItal Documents), presently being developed based on the experience gained from the two prototypes. The DigiDoc configuration integrates the on-line catalog database, a World Wide Web interface, and PDF technology to provide a powerful and flexible search and retrieval system. The paper describes in detail significant achievements and lessons learned in terms of data conversion, storage technologies, full-text searching and retrieval, and image databases. The conclusions from the experiences of digitization and full-text access and future plans for DigiDoc system implementation are discussed.
Extending the Peak Bandwidth of Parameters for Softmax Selection in Reinforcement Learning.
Iwata, Kazunori
2016-05-11
Softmax selection is one of the most popular methods for action selection in reinforcement learning. Although various recently proposed methods may be more effective with full parameter tuning, implementing a complicated method that requires the tuning of many parameters can be difficult. Thus, softmax selection is still worth revisiting, considering the cost savings of its implementation and tuning. In fact, this method works adequately in practice with only one parameter appropriately set for the environment. The aim of this paper is to improve the variable setting of this method to extend the bandwidth of good parameters, thereby reducing the cost of implementation and parameter tuning. To achieve this, we take advantage of the asymptotic equipartition property in a Markov decision process to extend the peak bandwidth of softmax selection. Using a variety of episodic tasks, we show that our setting is effective in extending the bandwidth and that it yields a better policy in terms of stability. The bandwidth is quantitatively assessed in a series of statistical tests.
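For reference, softmax (Boltzmann) action selection with its single temperature parameter τ can be sketched as follows; the function names are illustrative, not from the paper. Higher τ flattens the distribution toward uniform exploration, lower τ concentrates it on the greedy action — which is why the usable bandwidth of τ values matters.

```python
import math, random

def softmax_probs(q_values, tau):
    """Boltzmann distribution over actions: p(a) ∝ exp(Q(a) / tau).
    Subtracting max(q) keeps exp() numerically stable."""
    m = max(q_values)
    exps = [math.exp((q - m) / tau) for q in q_values]
    z = sum(exps)
    return [e / z for e in exps]

def select_action(q_values, tau, rng=random.random):
    """Sample an action index from the softmax distribution."""
    r, acc = rng(), 0.0
    for a, p in enumerate(softmax_probs(q_values, tau)):
        acc += p
        if r < acc:
            return a
    return len(q_values) - 1

probs = softmax_probs([1.0, 2.0, 3.0], tau=1.0)
print([round(p, 3) for p in probs])  # → [0.09, 0.245, 0.665]
```

The paper's contribution sits upstream of this sketch: it rescales the effective τ using the asymptotic equipartition property of the MDP, so that a wider band of raw parameter values yields good behavior.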
Baldwin, Constance D; Chandran, Latha; Gusic, Maryellen E
2017-01-01
Multisite and national professional development (PD) programs for educators are challenging to establish. Use of implementation science (IS) frameworks designed to convert evidence-based intervention methods into effective health care practice may help PD developers translate proven educational methods and models into successful, well-run programs. Implementation of the national Educational Scholars Program (ESP) is used to illustrate the value of the IS model. Four adaptable elements of IS are described: (1) replication of an evidence-based model, (2) systematic stages of implementation, (3) management of implementation using three implementation drivers, and (4) demonstration of program success through measures of fidelity to proven models and sustainability. Implementation of the ESP was grounded on five established principles and methods for successful PD. The process was conducted in four IS stages over 10 years: Exploration, Installation, Initial Implementation, and Full Implementation. To ensure effective and efficient processes, attention to IS implementation drivers helped to manage organizational relationships, build competence in faculty and scholars, and address leadership challenges. We describe the ESP's fidelity to evidence-based structures and methods, and offer three examples of sustainability efforts that enabled achievement of targeted program outcomes, including academic productivity, strong networking, and career advancement of scholars. Application of IS frameworks to program implementation may help other PD programs to translate evidence-based methods into interventions with enhanced impact. A PD program can follow systematic developmental stages and be operationalized by practical implementation drivers, thereby creating successful and sustainable interventions that promote the academic vitality of health professions educators.
Superadiabatic driving of a three-level quantum system
NASA Astrophysics Data System (ADS)
Theisen, M.; Petiziol, F.; Carretta, S.; Santini, P.; Wimberger, S.
2017-07-01
We study superadiabatic quantum control of a three-level quantum system whose energy spectrum exhibits multiple avoided crossings. In particular, we investigate the possibility of treating the full control task in terms of independent two-level Landau-Zener problems. We first show that the time profiles of the elements of the full control Hamiltonian are characterized by peaks centered around the crossing times. These peaks decay algebraically for large times. In principle, such a power-law scaling invalidates the hypothesis of perfect separability. Nonetheless, we address the problem from a pragmatic point of view by studying the fidelity obtained through separate control as a function of the intercrossing separation. This procedure may be a good approach to achieve approximate adiabatic driving of a specific instantaneous eigenstate in realistic implementations.
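For reference, the two-level problem that the separability hypothesis reduces each avoided crossing to is the standard Landau-Zener model (the generic textbook form, not the paper's specific three-level Hamiltonian):

```latex
H_{\mathrm{LZ}}(t) = \frac{\alpha t}{2}\,\sigma_z + \frac{\Delta}{2}\,\sigma_x ,
\qquad
P_{\mathrm{LZ}} = \exp\!\left(-\frac{\pi \Delta^{2}}{2\hbar\,\alpha}\right),
```

where $\alpha$ is the sweep rate of the diabatic energies and $\Delta$ the minimum gap at the crossing. Adiabatic following requires $\Delta^{2}/(\hbar\alpha) \gg 1$; superadiabatic (counterdiabatic) driving instead adds a transverse $\sigma_y$ control term that cancels nonadiabatic transitions exactly for a single isolated crossing, which is why the algebraic overlap between crossings studied above is what limits separate control.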
NASA Astrophysics Data System (ADS)
Telesco, C. M.; Sparks, W. B.; Zhao, B.; Varosi, F.; Schofield, S.; Germer, T. A.; Kolokolova, L.; Parenteau, M. N.; Cooper, G.; Grundy, W. M.; Guzmán, R.; Pantin, E.
2016-12-01
Optical spectropolarimetry holds great promise in the search for extraterrestrial life. In particular, the detection of circular polarization can indicate chirality, a signature of biological significance. We describe an on-going effort to implement the full-Stokes (I, Q, U, V), static-optics concept for optical spectropolarimetry described by Sparks et al. [App. Optics, 51, 5495 (2012)]. Our early breadboard embodiments of the concept demonstrate its simplicity and indicate its potential for space missions in which a compact design with no moving parts is crucial to achieve the mission goals. We describe the instrument, called the Integrated Miniature Polarimeter and Spectrograph (IMPS), and consider one example for its deployment: a mission to land on an outer solar system body such as Europa.
Leveraging Mindsets to Promote Academic Achievement: Policy Recommendations.
Rattan, Aneeta; Savani, Krishna; Chugh, Dolly; Dweck, Carol S
2015-11-01
The United States must improve its students' educational achievement. Race, gender, and social class gaps persist, and, overall, U.S. students rank poorly among peers globally. Scientific research shows that students' psychology-their "academic mindsets"-have a critical role in educational achievement. Yet policymakers have not taken full advantage of cost-effective and well-validated mindset interventions. In this article, we present two key academic mindsets. The first, a growth mindset, refers to the belief that intelligence can be developed over time. The second, a belonging mindset, refers to the belief that people like you belong in your school or in a given academic field. Extensive research shows that fostering these mindsets can improve students' motivation; raise grades; and reduce racial, gender, and social class gaps. Of course, mindsets are not a panacea, but with proper implementation they can be an excellent point of entry. We show how policy at all levels (federal, state, and local) can leverage mindsets to lift the nation's educational outcomes. © The Author(s) 2015.
Sturke, Rachel; Harmston, Christine; Simonds, R J; Mofenson, Lynne M; Siberry, George K; Watts, D Heather; McIntyre, James; Anand, Nalini; Guay, Laura; Castor, Delivette; Brouwers, Pim; Nagel, Joan D
2014-11-01
In resource-limited countries, interventions to prevent mother-to-child HIV transmission (PMTCT) have not yet realized their full potential health impact, illustrating the common gap between the scientific proof of an intervention's efficacy and effectiveness and its successful implementation at scale into routine health services. For PMTCT, this gap results, in part, from inadequate adaptation of PMTCT interventions to the realities of the implementation environment, including client and health care worker behaviors and preferences, health care policies and systems, and infrastructure and resource constraints. Elimination of mother-to-child HIV transmission can only be achieved through understanding of key implementation barriers and successful adaptation of scientifically proven interventions to the local environment. Central to such efforts is implementation science (IS), which aims to investigate and address major bottlenecks that impede effective implementation and to test new approaches to identifying, understanding, and overcoming barriers to the adoption, adaptation, integration, scale-up, and sustainability of evidence-based interventions. Advancing IS will require deliberate and strategic efforts to facilitate collaboration, communication, and relationship-building among researchers, implementers, and policy-makers. To speed the translation of effective PMTCT interventions into practice and advance IS more broadly, the US National Institutes of Health, in collaboration with the President's Emergency Plan for AIDS Relief launched the National Institutes of Health/President's Emergency Plan for AIDS Relief PMTCT IS Alliance, comprised of IS researchers, PMTCT program implementers, and policy-makers as an innovative platform for interaction and coordination.
Sobriety Treatment and Recovery Teams: Implementation Fidelity and Related Outcomes.
Huebner, Ruth A; Posze, Lynn; Willauer, Tina M; Hall, Martin T
2015-01-01
Although integrated programs between child welfare and substance abuse treatment are recommended for families with co-occurring child maltreatment and substance use disorders, implementing integrated service delivery strategies with fidelity is a challenging process. This study of the first five years of the Sobriety Treatment and Recovery Team (START) program examines implementation fidelity using a model proposed by Carroll et al. (2007). The study describes the process of strengthening moderators of implementation fidelity, trends in adherence to START service delivery standards, and trends in parent and child outcomes. Qualitative and quantitative measures were used to prospectively study three START sites serving 341 families with 550 parents and 717 children. Achieving implementation fidelity to service delivery standards required a pre-service year and two full years of operation, persistent leadership, and facilitative actions that challenged the existing paradigm. Over four years of service delivery, the time from the child protective services report to completion of five drug treatment sessions was reduced by an average of 75 days. This trend was associated with an increase in parent retention, parental sobriety, and parent retention of child custody. Understanding the implementation processes necessary to establish complex integrated programs may support realistic allocation of resources. Although implementation fidelity is a moderator of program outcome, complex inter-agency interventions may benefit from innovative measures of fidelity that promote improvement without extensive cost and data collection burden. The implementation framework applied in this study was useful in examining implementation processes, fidelity, and related outcomes.
ERIC Educational Resources Information Center
Palmer, Porter W.
2009-01-01
Since Federal regulations have given states the option to implement alternate assessments based on modified academic achievement standards (AA-MAS) as part of their accountability systems for a small group of students with disabilities, a number of states have made decisions about whether or not to develop and implement such an assessment.…
Stereo and IMU-Assisted Visual Odometry for Small Robots
NASA Technical Reports Server (NTRS)
2012-01-01
This software performs two functions: (1) taking stereo image pairs as input, it computes stereo disparity maps from them by cross-correlation to achieve 3D (three-dimensional) perception; (2) taking a sequence of stereo image pairs as input, it tracks features in the image sequence to estimate the motion of the cameras between successive image pairs. A real-time stereo vision system with IMU (inertial measurement unit)-assisted visual odometry was implemented on a single 750 MHz/520 MHz OMAP3530 SoC (system on chip) from TI (Texas Instruments). Frame rates of 46 fps (frames per second) were achieved at QVGA (Quarter Video Graphics Array, i.e., 320 × 240), or 8 fps at VGA (Video Graphics Array, 640 × 480) resolutions, while simultaneously tracking up to 200 features, taking full advantage of the OMAP3530's integer DSP (digital signal processor) and floating-point ARM processors. This is a substantial advancement over previous work, as the stereo implementation produces 146 Mde/s (millions of disparities evaluated per second) in 2.5 W, yielding a stereo energy efficiency of 58.8 Mde/J, which is 3.75× better than prior DSP stereo while providing more functionality.
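The first function — matching patches along scan lines to compute disparity — can be illustrated with a naive block-matching sketch. This uses a SAD (sum of absolute differences) cost for simplicity rather than the flight code's cross-correlation, and all names are illustrative, not the actual DSP implementation:

```python
import numpy as np

def disparity_map(left, right, max_disp=16, block=5):
    """Naive block-matching stereo: for each pixel in the left image,
    find the horizontal shift d such that the right-image window at
    column x - d best matches (minimum SAD) the left-image window at x.
    The resulting disparity is inversely proportional to depth."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y-half:y+half+1, x-half:x+half+1].astype(float)
            best_cost, best_d = np.inf, 0
            for d in range(max_disp):
                cand = right[y-half:y+half+1, x-d-half:x-d+half+1].astype(float)
                sad = np.abs(patch - cand).sum()
                if sad < best_cost:
                    best_cost, best_d = sad, d
            disp[y, x] = best_d
    return disp
```

A production system vectorizes the correlation over whole rows (which is what makes an integer DSP effective here); the triple loop above is only meant to make the matching logic explicit.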
Working towards TB elimination the WHO Regional Strategic Plan (2006-2015).
Nair, Nani; Cooreman, Erwin
2006-03-01
DOTS has expanded rapidly in the South-East Asia Region over the period of the Partnership's first Global Plan (2001-2005), with almost 100% geographical coverage achieved in 2005. All countries have made impressive progress in improving coverage and quality. This progress has been made possible through strong political commitment and large investments in TB control for improved infrastructure, reliable drug supply, increased staffing, improved laboratory services, and intensified training and supervision. Accomplishing the objectives outlined in this document will require sustaining the progress in all countries, and particularly in the five high-burden countries, to achieve major regional and global impact. National TB programmes will need to be supported to maintain or surpass the 70% case detection and 85% treatment success rates. The achievement of the TB-related targets linked to the MDGs will also depend on how effectively initiatives such as DOTS-Plus, PPM DOTS and interventions for TB/HIV, among others, are implemented. National governments and development partners must fulfill their commitments to mobilizing and sustaining adequate resources to support the full range of activities envisaged. The benefits of full and effective implementation of all the planned interventions would be substantial: 20 to 25 million TB cases treated in DOTS programmes and more than 150,000 drug-resistant cases receiving treatment through DOTS-Plus during the period 2006-2015. In addition, at least 250,000 HIV-infected TB patients may also receive anti-retroviral therapy. As a consequence, the prevalence of TB is expected to fall below 175/100,000 and the number of TB deaths is expected to fall to between 100,000 and 150,000 per year. There would also be substantial economic benefits, given that TB disproportionately affects adults in their most productive years. Considering these aspects, TB incidence is expected to decline significantly during this period, so that the Millennium Development Goals would be met by or ahead of 2015.
Redefining Health: Implication for Value-Based Healthcare Reform.
Putera, Ikhwanuliman
2017-03-02
The definition of health comprises three domains, namely physical, mental, and social health, all of which should be prioritized in delivering healthcare. The emergence of chronic diseases in aging populations has been a barrier to the realization of a healthier society. The value-based healthcare concept seems in line with the true health objective: increasing value. Value is created from the health outcomes that matter to patients relative to the cost of achieving those outcomes. These health outcomes should include all domains of health in a full cycle of care. To implement value-based healthcare, transformations need to be made by both health providers and patients: establishing true health outcomes, strengthening primary care, building integrated health systems, implementing appropriate health payment schemes that promote value and reduce moral hazards, enabling health information technology, and creating a policy that fits well with a community.
Energy sweep compensation of induction accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sampayan, S.E.; Caporaso, G.J.; Chen, Y-J
1990-09-12
The ETA-II linear induction accelerator (LIA) is designed to drive a microwave free electron laser (FEL). Beam energy sweep must be limited to ±1% for 50 ns to limit beam corkscrew motion and ensure high-power FEL output over the full duration of the beam flattop. To achieve this energy sweep requirement, we have implemented a pulse distribution system and are planning implementation of a tapered pulse forming line (PFL) in the pulse generators driving the acceleration gaps. The pulse distribution system assures proper phasing of the high-voltage pulse to the electron beam. Additionally, cell-to-cell coupling of beam-induced transients is reduced. The tapered PFL compensates for accelerator cell and loading nonlinearities. Circuit simulations show good agreement with preliminary data and predict that the required energy sweep requirement can be met.
An Ultralow-Power Sleep Spindle Detection System on Chip.
Iranmanesh, Saam; Rodriguez-Villegas, Esther
2017-08-01
This paper describes a full system-on-chip to automatically detect sleep spindle events from scalp EEG signals. These events, which are known to play an important role in memory consolidation during sleep, are also characteristic of a number of neurological diseases. The operation of the system is based on a previously reported algorithm, which used the Teager energy operator together with the spectral edge frequency (SEF50), achieving more than 70% sensitivity and 98% specificity. The algorithm is now converted into a customized analog hardware implementation in order to achieve extremely low levels of power. Experimental results prove that the system, which is fabricated in a 0.18 μm CMOS technology, is able to operate from a 1.25 V power supply consuming only 515 nW, with an accuracy that is comparable to its software counterpart.
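The Teager energy operator mentioned above has a very simple discrete form, which is part of why it lends itself to an ultralow-power circuit. A software sketch (illustrative only — the chip implements this in analog hardware):

```python
import numpy as np

def teager_energy(x):
    """Discrete Teager energy operator:
        psi[n] = x[n]^2 - x[n-1] * x[n+1]
    For a sinusoid A*sin(w*n + phi), psi equals A^2 * sin(w)^2 exactly,
    so it grows with both amplitude and frequency — useful for
    highlighting spindle bursts (~11-16 Hz) against slower EEG background.
    """
    x = np.asarray(x, dtype=float)
    psi = np.empty_like(x)
    psi[1:-1] = x[1:-1]**2 - x[:-2] * x[2:]
    psi[0], psi[-1] = psi[1], psi[-2]   # replicate edges
    return psi
```

A detector would then threshold a smoothed version of `psi` and gate it with the SEF50 criterion; that stage is omitted here.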
Evaluating Math Recovery: Measuring Fidelity of Implementation
ERIC Educational Resources Information Center
Munter, Charles; Garrison, Anne; Cobb, Paul; Cordray, David
2010-01-01
In this paper, the authors describe a case of measuring implementation fidelity within an evaluation study of Math Recovery (MR), a pullout tutoring program aimed at increasing the mathematics achievement of low-performing first graders, thereby closing the school-entry achievement gap by enabling them to achieve at the level of their…
Flash flood warning based on fully dynamic hydrology modelling
NASA Astrophysics Data System (ADS)
Pejanovic, Goran; Petkovic, Slavko; Cvetkovic, Bojan; Nickovic, Slobodan
2016-04-01
Numerical hydrologic modeling has achieved limited success in the past due to, inter alia, a lack of adequate input data. Over the last decade, data availability has improved substantially. High-resolution data on topography, river routing, land cover, and soil features have meanwhile become available for modelling purposes, as well as observations such as radar precipitation information. In our study, we have implemented the HYPROM model (Hydrology Prognostic Model) to predict a flash flood event in a smaller-scale basin in Southern Serbia. HYPROM is based on the full set of governing equations for surface hydrological dynamics, in which the momentum components, along with the equation of mass continuity, are used as full prognostic equations. HYPROM also includes a river routing module serving as a collector for the extra surface water. This approach permits appropriate representation of different hydrology scales, ranging from flash floods to flows of large and slow river basins. The use of full governing equations, if not appropriately parameterized, may lead to numerical instabilities when the surface water in the model vanishes. To resolve these modelling problems, an unconditionally stable numerical scheme and a method for height redistribution that avoids shortwave height noise have been developed in HYPROM, which achieve numerical convergence of u, v and h when surface water disappears. We have applied HYPROM, driven by radar-estimated precipitation, to predict flash flooding over smaller and medium-size river basins. Two torrential rainfall cases have been simulated to check the accuracy of the model: the exceptional flooding of May 2014 in Western Serbia, and the convective flash flood of January 2015 in Southern Serbia. The second episode was successfully predicted by HYPROM, in terms of timing and intensity, six hours before the event occurred. This flash flood warning system is being prepared for operational implementation in the Republic Hydrometeorological Service of Serbia.
A geospatial soil-based DSS to reconcile landscape management and land protection
NASA Astrophysics Data System (ADS)
Manna, Piero; Basile, Angelo; Bonfante, Antonello; D'Antonio, Amedeo; De Michele, Carlo; Iamarino, Michela; Langella, Giuliano; Florindo Mileti, Antonio; Pileri, Paolo; Vingiani, Simona; Terribile, Fabio
2017-04-01
The implementation of the UN Agenda 2030 may represent a great opportunity to place soil science at the heart of many Sustainable Development Goals (e.g. SDGs 2, 3, 13, 15, 15.3, 16.7). On the other side, the high complexity embedded in the factual implementation of the SDGs and many other ambitious objectives (e.g. FAO goals) may cause new frustrations if these policy documents do not bring real progress. The scientific communities are asked to contribute to disentangling this complexity and possibly to identifying a "way to go". This may help the large number of European directives (e.g. WFD, EIA), regulations and communications aiming to achieve a better environment but still facing large difficulties in their full implementation (e.g. COM2015/120; COM2013/683). This contribution aims to provide a different perspective: we argue that the full implementation of the SDGs and integrated land policies requires challenging some key overlooked issues, including full competence (and managing capability) regarding landscape variability, its multi-functionality (e.g. agriculture/environment) and its dynamic nature (many processes, including crop growth and the fate of pollutants, are dynamic); moreover, it requires supporting actions at a very detailed local scale, since many processes and problems are site specific. The landscape and all the above issues have the soil as their pulsing heart. Accordingly, we aim to demonstrate the multiple benefits of using a smart geoSpatial Decision Support System (S-DSS) grounded on soil modelling, called SOILCONSWEB (an EU LIFE+ project and its extensions). It is a freely-accessible web platform based on a Geospatial Cyber-Infrastructure (GCI) and developed in Valle Telesina (South Italy) over an area of 20,000 ha. It supports multilevel decision-making in agriculture and the environment, including interaction with other land uses (such as landscape and urban planning), and thus simultaneously contributes to SDGs 2, 3, 13, 15, 15.3 and 16.7.
Performance of the SIR-B digital image processing subsystem
NASA Technical Reports Server (NTRS)
Curlander, J. C.
1986-01-01
A ground-based system to generate digital SAR image products has been developed and implemented in support of the SIR-B mission. This system is designed to achieve maximum throughput while meeting strict image fidelity criteria. Its capabilities include: automated radiometric and geometric correction of the output imagery; high-precision absolute location without tiepoint registration; filtering of the raw data to remove spurious signals from alien radars; and automated cataloging to maintain a full record of radar and image parameters. The image production facility, in support of the SIR-B science investigators, routinely produces over 80 image frames per week.
Mesoscale research activities with the LAMPS model
NASA Technical Reports Server (NTRS)
Kalb, M. W.
1985-01-01
Researchers achieved full implementation of the LAMPS mesoscale model on the Atmospheric Sciences Division computer and derived balanced and real wind initial states for three case studies: March 6, April 24, April 26, 1982. Numerical simulations were performed for three separate studies: (1) a satellite moisture data impact study using Vertical Atmospheric Sounder (VAS) precipitable water as a constraint on model initial state moisture analyses; (2) an evaluation of mesoscale model precipitation simulation accuracy with and without convective parameterization; and (3) the sensitivity of model precipitation to mesoscale detail of moisture and vertical motion in an initial state.
NASA Astrophysics Data System (ADS)
Pongsophon, Pongprapan; Herman, Benjamin C.
2017-07-01
Given the abundance of literature describing the strong relationship between inquiry-based teaching and student achievement, more should be known about the factors impacting science teachers' classroom inquiry implementation. This study utilises the theory of planned behaviour to propose and validate a causal model of inquiry-based teaching through analysing data relating to high-performing countries retrieved from the 2011 Trends in International Mathematics and Science Study assessments. Data analysis was completed through structural equation modelling using a polychoric correlation matrix for data input and diagonally weighted least squares estimation. Adequate fit of the full model to the empirical data was realised. The model demonstrates that the extent the teachers participated in academic collaborations was positively related to their occupational satisfaction, confidence in teaching inquiry, and classroom inquiry practices. Furthermore, the teachers' confidence with implementing inquiry was positively related to their classroom inquiry implementation and occupational satisfaction. However, perceived student-generated constraints demonstrated a negative relationship with the teachers' confidence with implementing inquiry and occupational satisfaction. Implications from this study include supporting teachers through promoting collaborative opportunities that facilitate inquiry-based practices and occupational satisfaction.
FPGA Implementation of Metastability-Based True Random Number Generator
NASA Astrophysics Data System (ADS)
Hata, Hisashi; Ichikawa, Shuichi
True random number generators (TRNGs) are important as a basis for computer security. Though some TRNGs are composed of analog circuits, the use of digital circuits is desired for the application of TRNGs to logic LSIs. Some digital TRNGs utilize jitter in free-running ring oscillators as a source of entropy, which consumes considerable power. Another type of TRNG exploits the metastability of a latch to generate entropy. Although this kind of TRNG has mostly been implemented with full-custom LSI technology, this study presents an implementation based on common FPGA technology. Our TRNG is comprised of logic gates only, and can be integrated in any kind of logic LSI. The RS latch in our TRNG is implemented as a hard macro to guarantee the quality of randomness by minimizing the signal skew and load imbalance of internal nodes. To improve the quality and throughput, the outputs of 64-256 latches are XOR'ed. The derived design was verified on a Xilinx Virtex-4 FPGA (XC4VFX20), and passed the NIST statistical test suite without post-processing. Our TRNG with 256 latches occupies 580 slices, while achieving 12.5 Mbps throughput.
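The XOR-combining step can be illustrated with a simple behavioral model (software only, not the FPGA netlist; the bias value below is invented for illustration). The point is the piling-up lemma: XORing many independent, individually biased bits yields a nearly unbiased bit:

```python
import random

def xor_combine(latch_bits):
    """XOR the outputs of many independent latches into one bit.
    If each latch bit has bias e (P(1) = 0.5 + e), the XOR of n
    independent bits has bias 2^(n-1) * e^n, which shrinks rapidly
    toward an unbiased coin as n grows."""
    out = 0
    for b in latch_bits:
        out ^= b
    return out

def biased_bit(p_one, rng):
    """Model one metastability-based latch that settles to 1
    with probability p_one (an illustrative, made-up bias)."""
    return 1 if rng.random() < p_one else 0

rng = random.Random(42)
# 64 modeled latches, each noticeably biased toward 1 (P(1) = 0.6)
stream = [xor_combine([biased_bit(0.6, rng) for _ in range(64)])
          for _ in range(20_000)]
freq = sum(stream) / len(stream)   # close to 0.5 after combining
```

In the real design the quality is additionally protected at the physical level (hard-macro placement to balance skew and load), which this model does not capture.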
Marchi, A; Geerts, S; Weemaes, M; Schiettecatte, W; Wim, S; Vanhoof, C; Christine, V
2015-01-01
To date, phosphorus recovery as struvite in wastewater treatment plants has mainly been implemented on water phases resulting from dewatering processes in the sludge line. However, it is possible to recover struvite directly from sludge phases. Besides minimising the return loads of phosphorus from the sludge line to the water line, placing such a process within the sludge line is claimed to offer advantages such as a higher recovery potential, enhanced dewaterability of the treated sludge, and a reduced rate of scaling in pipes and dewatering devices. In the wastewater treatment plant at Leuven (Belgium), a full-scale struvite recovery process from digested sludge was tested for one year. Several monitoring campaigns and experiments provided indications of the recovery efficiency of the process. In the reference situation, the load of phosphorus returning from the sludge line to the water line as centrate accounted for 15% of the P-load of the plant. Data indicated that the process halves this phosphorus load. An improved dewaterability of 1.5% of dry solids content was achieved, provided proper tuning of the installation. Quality analyses showed that the formed struvite was quite pure.
NASA Technical Reports Server (NTRS)
Kamhawi, Hani; Haag, Thomas; Huang, Wensheng; Shastry, Rohit; Pinero, Luis; Peterson, Todd; Mathers, Alex
2012-01-01
NASA Science Mission Directorate's In-Space Propulsion Technology Program is sponsoring the development of a 3.5 kW-class engineering development unit Hall thruster for implementation in NASA science and exploration missions. NASA Glenn and Aerojet are developing a high-fidelity high voltage Hall accelerator that can achieve specific impulse magnitudes greater than 2,700 seconds and xenon throughput capability in excess of 300 kilograms. Performance, plume mapping, thermal characterization, and vibration tests of the high voltage Hall accelerator engineering development unit have been performed. Performance test results indicated that at 3.9 kW the thruster achieved a total thrust efficiency of 58% and a specific impulse of 2,700 s. Thermal characterization tests indicated that the thruster component temperatures were within the prescribed material maximum operating temperature limits during full-power thruster operation. Vibration tests indicated that the thruster survived the 3-axis qualification full-level random vibration test series, and pre- and post-vibration performance mappings were almost identical. Finally, an update on the development progress of a power processing unit and a xenon feed system is provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharada, Shaama Mallikarjun; Bell, Alexis T.; Head-Gordon, Martin
2014-04-28
The cost of calculating nuclear hessians, either analytically or by finite difference methods, during the course of quantum chemical analyses can be prohibitive for systems containing hundreds of atoms. In many applications, though, only a few eigenvalues and eigenvectors, and not the full hessian, are required. For instance, the lowest one or two eigenvalues of the full hessian are sufficient to characterize a stationary point as a minimum or a transition state (TS), respectively. We describe here a method that can eliminate the need for hessian calculations for both the characterization of stationary points as well as searches for saddle points. A finite differences implementation of the Davidson method that uses only first derivatives of the energy to calculate the lowest eigenvalues and eigenvectors of the hessian is discussed. This method can be implemented in conjunction with geometry optimization methods such as partitioned-rational function optimization (P-RFO) to characterize stationary points on the potential energy surface. With equal ease, it can be combined with interpolation methods that determine TS guess structures, such as the freezing string method, to generate approximate hessian matrices in lieu of full hessians as input to P-RFO for TS optimization. This approach is shown to achieve significant cost savings relative to exact hessian calculation when applied to both stationary point characterization as well as TS optimization. The basic reason is that the present approach scales one power of system size lower, since the rate of convergence is approximately independent of the size of the system. Therefore, the finite-difference Davidson method is a viable alternative to full hessian calculation for stationary point characterization and TS search, particularly when analytical hessians are not available or require substantial computational effort.
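The core trick — obtaining the lowest hessian eigenpair from gradients alone — can be sketched by forming hessian-vector products with central finite differences of the gradient and feeding them to an iterative eigensolver. Here SciPy's Lanczos routine `eigsh` stands in for the Davidson scheme of the abstract, and all function names are illustrative:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

def lowest_hessian_eigpair(grad, x0, n, h=1e-4):
    """Lowest hessian eigenvalue/eigenvector using only gradients.

    Hessian-vector products are approximated by a central finite
    difference of the gradient,
        H v ~= (g(x0 + h*u) - g(x0 - h*u)) * |v| / (2*h),  u = v/|v|,
    so no hessian is ever built or stored — only O(2) extra gradient
    evaluations per matrix-vector product.
    """
    def hvp(v):
        v = np.asarray(v).ravel()
        nv = np.linalg.norm(v)
        if nv == 0.0:
            return np.zeros(n)
        u = v / nv
        return nv * (grad(x0 + h * u) - grad(x0 - h * u)) / (2.0 * h)

    H = LinearOperator((n, n), matvec=hvp)
    vals, vecs = eigsh(H, k=1, which='SA')   # smallest algebraic eigenvalue
    return vals[0], vecs[:, 0]
```

A negative lowest eigenvalue flags a transition state (one downhill direction); a positive one flags a minimum — exactly the characterization task described above, at the cost of a handful of gradient calls.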
Passive detection of vehicle loading
NASA Astrophysics Data System (ADS)
McKay, Troy R.; Salvaggio, Carl; Faulring, Jason W.; Salvaggio, Philip S.; McKeown, Donald M.; Garrett, Alfred J.; Coleman, David H.; Koffman, Larry D.
2012-01-01
The Digital Imaging and Remote Sensing Laboratory (DIRS) at the Rochester Institute of Technology, along with the Savannah River National Laboratory, is investigating passive methods to quantify vehicle loading. The research described in this paper investigates multiple vehicle indicators, including brake temperature, tire temperature, engine temperature, acceleration and deceleration rates, engine acoustics, suspension response, tire deformation and vibrational response. Our investigation into these variables includes building and implementing a sensing system for data collection as well as multiple full-scale vehicle tests. The sensing system includes: infrared video cameras, triaxial accelerometers, microphones, video cameras and thermocouples. The full-scale testing includes both a medium-size dump truck and a tractor-trailer truck on closed courses, with loads spanning the full range of the vehicle's capacity. Statistical analysis of the collected data is used to determine the effectiveness of each of the indicators for characterizing the weight of a vehicle. The final sensing system will monitor multiple load indicators and combine the results to achieve a more accurate measurement than any of the indicators could provide alone.
Fitness for duty in the nuclear power industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durbin, N.; Moore, C.; Grant, T.
1991-09-01
This report presents an overview of the NRC licensees' implementation of the FFD program during the first full year of the program's operation and provides new information on a variety of FFD technical issues. The purpose of this document is to contribute to appropriate changes to the rule, to the inspection process, and to other NRC activities. It describes the characteristics of licensee programs, discusses the results of NRC inspections, updates technical information covered in previous reports, and identifies lessons learned during the first year. Overall, the experience of the first full year of licensees' FFD program operations indicates that licensees have functioning fitness-for-duty programs devoted to the NRC rule's performance objectives of achieving drug-free workplaces in which nuclear power plant personnel are not impaired as they perform their duties. 96 refs., 14 tabs.
ERIC Educational Resources Information Center
Garcia, David L.
2008-01-01
This article uses state-level achievement data to examine the academic progress of Arizona American Indian elementary public school students before and since the implementation of the No Child Left Behind (NCLB) Act. In most subjects and grades, American Indian students are making greater progress since the implementation of NCLB. Generally,…
Control Reallocation Strategies for Damage Adaptation in Transport Class Aircraft
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen; Krishnakumar, K.; Limes, Greg; Bryant, Don
2003-01-01
This paper examines the feasibility, potential benefits and implementation issues associated with retrofitting a neural-adaptive flight control system (NFCS) to existing transport aircraft, including both cable/hydraulic and fly-by-wire configurations. NFCS uses a neural network based direct adaptive control approach for applying alternate sources of control authority in the presence of damage or failures in order to achieve desired flight control performance. Neural networks are used to provide consistent handling qualities across flight conditions, adapt to changes in aircraft dynamics and to make the controller easy to apply when implemented on different aircraft. Full-motion piloted simulation studies were performed on two different transport models: the Boeing 747-400 and the Boeing C-17. Subjects included NASA, Air Force and commercial airline pilots. Results demonstrate the potential for improving handling qualities and significantly increased survivability rates under various simulated failure conditions.
Low-level rf control of Spallation Neutron Source: System and characterization
NASA Astrophysics Data System (ADS)
Ma, Hengjie; Champion, Mark; Crofford, Mark; Kasemir, Kay-Uwe; Piller, Maurice; Doolittle, Lawrence; Ratti, Alex
2006-03-01
The low-level rf control system currently commissioned throughout the Spallation Neutron Source (SNS) LINAC evolved from three design iterations over one year of intensive research and development. Its digital hardware implementation is efficient, and it achieves a latency of less than 150 ns, which is key to accomplishing all-digital feedback control over the full bandwidth. The control bandwidth is analyzed in the frequency domain and characterized by testing its transient response. The hardware implementation also includes a time-shared input channel for a superior phase differential measurement between the cavity field and the reference. A companion cosimulation system for the digital hardware was developed to ensure reliable long-term supportability. A large effort has also been made in developing operations software for practical issues such as process automation, cavity filling, beam-loading compensation, and cavity mechanical resonance suppression.
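The feedback loop described above can be illustrated with a minimal discrete proportional-integral regulator driving a static plant toward a setpoint. This is a teaching sketch under assumed gains and a trivial plant model; the SNS controller additionally handles phase, beam loading, latency compensation, and resonance suppression, and none of the names below come from that system.

```python
def pi_loop(setpoint, plant_gain, kp, ki, steps):
    """Discrete PI feedback regulating a (simplistic, static) plant
    output toward `setpoint`. Illustrative of an all-digital feedback
    loop; not the SNS implementation."""
    field, integ, drive = 0.0, 0.0, 0.0
    for _ in range(steps):
        field = plant_gain * drive      # plant responds to last drive
        err = setpoint - field          # feedback error
        integ += err                    # integral term removes offset
        drive = kp * err + ki * integ   # new drive command
    return field
```

With modest gains the loop settles on the setpoint regardless of the plant gain, which is the point of the integral term.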
OpenMC In Situ Source Convergence Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldrich, Garrett Allen; Dutta, Soumya; Woodring, Jonathan Lee
2016-05-07
We designed and implemented an in situ version of particle source convergence for the OpenMC particle transport simulator. OpenMC is a Monte Carlo-based particle simulator for neutron criticality calculations. For the transport simulation to be accurate, source particles must converge on a spatial distribution. Typically, the simulation is iterated for a user-settable, fixed number of steps, and it is assumed that convergence is achieved. We instead implement a method to detect convergence, using a stochastic oscillator to identify convergence of source particles based on their accumulated Shannon entropy. Using our in situ convergence detection, we are able to detect when the proper source distribution has been confirmed and only then begin tallying results for the full simulation. Our method ensures that the simulation is started neither too early, through overly optimistic user parameters, nor too late, through overly conservative ones.
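The entropy diagnostic above can be sketched in a few lines: bin the fission-source sites, compute the Shannon entropy of the binned distribution each batch, and declare convergence when the entropy stops drifting. The plateau test below is a simple stand-in assumption for the paper's stochastic-oscillator criterion, and the function names are illustrative, not OpenMC's API.

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (base 2) of a binned source distribution."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def converged(entropies, window=5, tol=0.01):
    """Plateau test: converged when the last `window` batch entropies
    vary by less than `tol` (a simple stand-in for the paper's
    stochastic-oscillator criterion)."""
    if len(entropies) < window:
        return False
    recent = entropies[-window:]
    return max(recent) - min(recent) < tol
```

In use, active tallying would begin at the first batch for which `converged` returns true.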
Wideband energy harvesting for piezoelectric devices with linear resonant behavior.
Luo, Cheng; Hofmann, Heath F
2011-07-01
In this paper, an active energy harvesting technique for a spring-mass-damper mechanical resonator with piezoelectric electromechanical coupling is investigated. This technique applies a square-wave voltage to the terminals of the device at the same frequency as the mechanical excitation. By controlling the magnitude and phase angle of this voltage, an effective impedance matching can be achieved which maximizes the amount of power extracted from the device. Theoretically, the harvested power can be the maximum possible value, even at off-resonance frequencies. However, in actual implementation, the efficiency of the power electronic circuit limits the amount of power harvested. A power electronic full-bridge converter is built to implement the technique. Experimental results show that the active technique can increase the effective bandwidth by a factor of more than 2, and harvests significantly higher power than rectifier-based circuits at off-resonance frequencies.
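The impedance-matching idea the active technique emulates is the textbook maximum-power-transfer result: delivery into a load is maximized when the load impedance is the complex conjugate of the source impedance. The sketch below states that result numerically; it is a generic circuit calculation, not the paper's converter control law.

```python
def delivered_power(v_amp, z_source, z_load):
    """Average power into z_load from a sinusoidal source of peak
    amplitude v_amp behind source impedance z_source."""
    i_amp = v_amp / abs(z_source + z_load)
    return 0.5 * i_amp ** 2 * z_load.real

def matched_power(v_amp, z_source):
    """Conjugate matching (z_load = conj(z_source)) maximizes delivery,
    giving |V|^2 / (8 * Re(z_source)) for peak amplitude V."""
    return delivered_power(v_amp, z_source, z_source.conjugate())
```

Any mismatched load delivers strictly less power, which is why the converter's ability to synthesize an arbitrary effective impedance matters off resonance.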
Schwamm, Lee H
2014-02-01
"Telehealth" refers to the use of electronic services to support a broad range of remote services, such as patient care, education, and monitoring. Telehealth must be integrated into traditional ambulatory and hospital-based practices if it is to achieve its full potential, including addressing the six domains of care quality defined by the Institute of Medicine: safe, effective, patient-centered, timely, efficient, and equitable. Telehealth is a disruptive technology that appears to threaten traditional health care delivery but has the potential to reform and transform the industry by reducing costs and increasing quality and patient satisfaction. This article outlines seven strategies critical to successful telehealth implementation: understanding patients' and providers' expectations, untethering telehealth from traditional revenue expectations, deconstructing the traditional health care encounter, being open to discovery, being mindful of the importance of space, redesigning care to improve value in health care, and being bold and visionary.
Strong scaling of general-purpose molecular dynamics simulations on GPUs
NASA Astrophysics Data System (ADS)
Glaser, Jens; Nguyen, Trung Dac; Anderson, Joshua A.; Lui, Pak; Spiga, Filippo; Millan, Jaime A.; Morse, David C.; Glotzer, Sharon C.
2015-07-01
We describe a highly optimized implementation of MPI domain decomposition in a GPU-enabled, general-purpose molecular dynamics code, HOOMD-blue (Anderson and Glotzer, 2013). Our approach is inspired by a traditional CPU-based code, LAMMPS (Plimpton, 1995), but is implemented within a code that was designed for execution on GPUs from the start (Anderson et al., 2008). The software supports short-ranged pair force and bond force fields and achieves optimal GPU performance using an autotuning algorithm. We are able to demonstrate equivalent or superior scaling on up to 3375 GPUs in Lennard-Jones and dissipative particle dynamics (DPD) simulations of up to 108 million particles. GPUDirect RDMA capabilities in recent GPU generations provide better performance in full double precision calculations. For a representative polymer physics application, HOOMD-blue 1.0 provides an effective GPU vs. CPU node speed-up of 12.5×.
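The core bookkeeping of a regular spatial domain decomposition is the map from a particle's position to the rank that owns its subdomain. The sketch below shows that map for a 3-D box split into a regular grid of ranks; it illustrates the general scheme, not HOOMD-blue's actual API, and all names are assumptions.

```python
def rank_of(pos, box, dims):
    """Map a particle position to the MPI rank owning its cell in a
    regular 3-D domain decomposition: box of size `box` split into
    dims[0] x dims[1] x dims[2] subdomains, ranks in row-major order."""
    ix = [min(int(p * d / b), d - 1)       # cell index per axis, clamped
          for p, b, d in zip(pos, box, dims)]
    return (ix[0] * dims[1] + ix[1]) * dims[2] + ix[2]
```

After each integration step, particles whose computed rank differs from the current one are migrated to the neighboring rank, and ghost layers are exchanged across subdomain faces.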
Training managers for high productivity: Guidelines and a case history
NASA Technical Reports Server (NTRS)
Ranftl, R. M.
1985-01-01
Hughes Aircraft's 13-year productivity study clearly identifies management as the key link in the entire productivity chain. This finding led to the establishment of a long-term series of seminars on personal, managerial, organizational, and operational productivity for all levels and sectors of line and staff management. To inspire the work force to higher levels of productivity and creativity, management itself must first be inspired. In turn, managers must clearly understand the productive and creative processes, fashion an effective productivity improvement plan with sound strategy and implementation, create an optimal environmental chemistry, and provide the outstanding leadership necessary to propel their organizations to achieve full potential. The primary goals of the seminars are to (1) ignite that spark of inspiration, enabling productive action to follow, (2) provide participants a credible roadmap and effective tools for implementation, and (3) develop a dedicated commitment to leadership and productivity throughout the management team.
ERIC Educational Resources Information Center
Keane, Marilyn N.
2012-01-01
This study examined the relation between implementation of Positive Behavior Intervention and Supports (PBIS) and academic achievement in middle school math as measured by the Maryland State Assessment (MSA). In particular, the correlation of academic achievement in mathematics, grouped by PBIS implementation status to race, socioeconomic status…
ERIC Educational Resources Information Center
Yim, Su Yon; Cho, Young Hoan
2016-01-01
Despite the benefits of peer assessment, many teachers are not willing to implement it, particularly for low-achieving students. This study used the theory of planned behaviour to predict pre-service teachers' intention to use peer assessment for low-achieving students. A total of 229 pre-service teachers in Singapore participated in the survey…
Classroom Teachers' Perceptions of the Implementation and Effects of Full Inclusion.
ERIC Educational Resources Information Center
Sardo-Brown, Deborah; Hinson, Stephanie
1995-01-01
Describes a survey of 51 graduate students/full-time teachers at schools implementing full inclusion programs. Participants expressed their views concerning implementation methods, effects on instructional practices, community reactions, and advantages and disadvantages. Schools need to do a better job of explaining the rationale for full…
45 CFR 162.510 - Full implementation requirements: Covered entities.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Full implementation requirements: Covered entities. 162.510 Section 162.510 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA... Plans § 162.510 Full implementation requirements: Covered entities. (a) A covered entity must use an...
A simulation-based study of HighSpeed TCP and its deployment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Souza, Evandro de
2003-05-01
The current congestion control mechanism used in TCP has difficulty reaching full utilization on high speed links, particularly on wide-area connections. For example, the packet drop rate needed to fill a Gigabit pipe using the present TCP protocol is below the currently achievable fiber optic error rates. HighSpeed TCP was recently proposed as a modification of TCP's congestion control mechanism to allow it to achieve reasonable performance in high speed wide-area links. In this research, simulation results showing the performance of HighSpeed TCP and the impact of its use on the present implementation of TCP are presented. Network conditions including different degrees of congestion, different levels of loss rate, different degrees of bursty traffic and two distinct router queue management policies were simulated. The performance and fairness of HighSpeed TCP were compared to the existing TCP and to solutions for bulk-data transfer using parallel streams.
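HighSpeed TCP's modification is a window-dependent response function: above a threshold window, the additive-increase step a(w) grows and the multiplicative-decrease factor b(w) shrinks, so large windows recover quickly from rare losses. The sketch below follows the published RFC 3649 formulas with its default parameters; it is a calculator for the response function, not the simulation code used in the study.

```python
import math

# Default HighSpeed TCP parameters (RFC 3649)
LOW_WINDOW = 38        # below this, behave like standard TCP
HIGH_WINDOW = 83000    # target window at loss rate HIGH_P
LOW_P = 1e-3           # loss rate associated with LOW_WINDOW
HIGH_P = 1e-7          # loss rate associated with HIGH_WINDOW
HIGH_DECREASE = 0.1    # multiplicative decrease at HIGH_WINDOW

def b(w):
    """Multiplicative-decrease factor: 0.5 at LOW_WINDOW, shrinking
    log-linearly to HIGH_DECREASE at HIGH_WINDOW."""
    if w <= LOW_WINDOW:
        return 0.5
    frac = ((math.log(w) - math.log(LOW_WINDOW)) /
            (math.log(HIGH_WINDOW) - math.log(LOW_WINDOW)))
    return (HIGH_DECREASE - 0.5) * frac + 0.5

def p(w):
    """Loss rate at which the response function yields window w."""
    if w <= LOW_WINDOW:
        return 1.5 / (w * w)  # standard-TCP regime, w ~ 1.22/sqrt(p)
    slope = ((math.log(HIGH_P) - math.log(LOW_P)) /
             (math.log(HIGH_WINDOW) - math.log(LOW_WINDOW)))
    return math.exp(math.log(LOW_P) +
                    slope * (math.log(w) - math.log(LOW_WINDOW)))

def a(w):
    """Additive-increase step per RTT, from RFC 3649:
    a(w) = w^2 * p(w) * 2*b(w) / (2 - b(w))."""
    if w <= LOW_WINDOW:
        return 1.0
    return w * w * p(w) * 2.0 * b(w) / (2.0 - b(w))
```

For example, at w = 83000 segments the increase step comes out near the value of roughly 70 tabulated in the RFC, versus 1 segment per RTT for standard TCP.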
NASA Astrophysics Data System (ADS)
Angerer, Andreas; Astner, Thomas; Wirtitsch, Daniel; Sumiya, Hitoshi; Onoda, Shinobu; Isoya, Junichi; Putz, Stefan; Majer, Johannes
2016-07-01
We design and implement 3D-lumped element microwave cavities that spatially focus magnetic fields to a small mode volume. They allow coherent and uniform coupling to electron spins hosted by nitrogen vacancy centers in diamond. We achieve large homogeneous single spin coupling rates, with an enhancement of more than one order of magnitude compared to standard 3D cavities with a fundamental resonance at 3 GHz. Finite element simulations confirm that the magnetic field distribution is homogeneous throughout the entire sample volume, with a root mean square deviation of 1.54%. With a sample containing 10^17 nitrogen vacancy electron spins, we achieve a collective coupling strength of Ω = 12 MHz, a cooperativity factor C = 27, and clearly enter the strong coupling regime. This allows us to interface a macroscopic spin ensemble with microwave circuits, and the homogeneous Rabi frequency paves the way to manipulating the full ensemble population in a coherent way.
Mechanism Design and Testing of a Self-Deploying Structure Using Flexible Composite Tape Springs
NASA Technical Reports Server (NTRS)
Footdale, Joseph N.; Murphey, Thomas W.
2014-01-01
The detailed mechanical design of a novel deployable support structure that positions and tensions a membrane optic for space imaging applications is presented. This is a complex three-dimensional deployment using freely deploying rollable composite tape spring booms that become load-bearing structural members at full deployment. The deployment tests successfully demonstrate a new architecture based on rolled and freely deployed composite tape spring members that achieves simultaneous deployment without mechanical synchronization. The proper design of the flexible component mounting interfaces and constraint systems, which were critical in achieving a functioning unit, is described. These flexible composite components have much potential for advancing the state of the art in deployable structures, but have yet to be widely adopted. This paper demonstrates the feasibility and advantages of implementing flexible composite components, including design details on how to integrate them with required traditional mechanisms.
Achieving QoS for TCP Traffic in Satellite Networks with Differentiated Services
NASA Technical Reports Server (NTRS)
Durresi, Arjan; Kota, Sastri; Goyal, Mukul; Jain, Raj; Bharani, Venkata
2001-01-01
Satellite networks play an indispensable role in providing global Internet access and electronic connectivity. To achieve such global communications, provisioning of quality of service (QoS) within the advanced satellite systems is the main requirement. One of the key mechanisms for implementing quality of service is traffic management. Traffic management becomes a crucial factor in satellite networks because of the limited availability of their resources. Currently, the Internet Protocol (IP) has only minimal traffic management capabilities and provides best-effort service. In this paper, we present a broadband satellite network QoS model and simulated performance results. In particular, we discuss the performance of TCP flow aggregates in the presence of competing UDP flow aggregates in the same assured forwarding class. We identify several factors that affect performance in these mixed environments and quantify their effects using a full factorial design of experiment methodology.
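The traffic-management building block behind assured forwarding is a token-bucket meter: traffic within the contracted profile is marked in-profile and protected, while excess traffic is marked for preferential dropping under congestion. The sketch below is a generic single-rate, two-color meter; it illustrates the mechanism only and is not taken from the paper's simulation.

```python
def police(packet_sizes, rate, burst):
    """Token-bucket meter: one token per byte, refilled at `rate`
    bytes per tick and capped at `burst`. Packets covered by tokens
    are marked 'in' (in-profile); the rest are marked 'out'.
    Assumes one packet arrival per tick, for simplicity."""
    tokens, marks = burst, []
    for size in packet_sizes:
        tokens = min(burst, tokens + rate)  # refill, capped at burst
        if size <= tokens:
            tokens -= size
            marks.append("in")
        else:
            marks.append("out")
    return marks
```

Under congestion, a router implementing assured forwarding drops "out" packets first, which is what lets conforming TCP aggregates keep their assured share against competing UDP traffic.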
Watkins, Kim; Wood, Helen; Schneider, Carl R; Clifford, Rhonda
2015-10-29
The clinical role of community pharmacists is expanding, as is the use of clinical guidelines in this setting. However, it is unclear which strategies are successful in implementing clinical guidelines and what outcomes can be achieved. The aim of this systematic review is to synthesise the literature on the implementation of clinical guidelines to community pharmacy. The objectives are to describe the implementation strategies used, describe the resulting outcomes and to assess the effectiveness of the strategies. A systematic search was performed in six electronic databases (Medline, EMBASE, CINAHL, Web of Science, Informit, Cochrane Library) for relevant articles. Studies were included if they reported on clinical guidelines implementation strategies in the community pharmacy setting. Two researchers independently completed the full search strategy, data abstraction and quality assessments. A third researcher acted as a moderator. Quality assessments were completed with three validated tools. A narrative synthesis was performed to analyse results. A total of 1937 articles were retrieved and the titles and abstracts were screened. Full-text screening was completed for 36 articles, resulting in 19 articles (reporting on 22 studies) included for review. Implementation strategies were categorised according to a modified version of the EPOC taxonomy. Educational interventions were the most commonly utilised strategy (n = 20), and computerised decision support systems demonstrated the greatest effect (n = 4). Most studies were multifaceted and used more than one implementation strategy (n = 18). Overall outcomes were moderately positive (n = 17) but focused on process (n = 22) rather than patient (n = 3) or economic outcomes (n = 3). Most studies (n = 20) were rated as being of low methodological quality and having low or very low quality of evidence for outcomes.
Studies in this review did not generally have a well thought-out rationale for the choice of implementation strategy. Most utilised educational strategies, but the greatest effect on outcomes was demonstrated using computerised clinical decision support systems. Poor methodology, in the majority of the research, provided insufficient evidence to be conclusive about the best implementation strategies or the benefit of clinical guidelines in this setting. However, the generally positive outcomes across studies and strategies indicate that implementing clinical guidelines to community pharmacy might be beneficial. Improved methodological rigour in future research is required to strengthen the evidence for this hypothesis. PROSPERO 2012: CRD42012003019 .
Automatic Whistler Detector and Analyzer system: Implementation of the analyzer algorithm
NASA Astrophysics Data System (ADS)
Lichtenberger, JáNos; Ferencz, Csaba; Hamar, Daniel; Steinbach, Peter; Rodger, Craig J.; Clilverd, Mark A.; Collier, Andrew B.
2010-12-01
The full potential of whistlers for monitoring plasmaspheric electron density variations has not yet been realized. The primary reason is the vast human effort required for the analysis of whistler traces. Recently, the first part of a complete whistler analysis procedure was successfully automated, i.e., the automatic detection of whistler traces from the raw broadband VLF signal. This study describes a new algorithm developed to determine plasmaspheric electron density from whistler traces, based on a Virtual (Whistler) Trace Transformation using a 2-D fast Fourier transform. This algorithm can be automated and thus forms the final step of a complete Automatic Whistler Detector and Analyzer (AWDA) system. In this second AWDA paper, the practical implementation of the Automatic Whistler Analyzer (AWA) algorithm is discussed and a feasible solution is presented. The implementation is able to track the variations of the plasmasphere in quasi real time on a PC cluster with 100 CPU cores. The electron densities obtained by the AWA method can be used in investigations such as plasmasphere dynamics, ionosphere-plasmasphere coupling, or space weather models.
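The physical quantity extracted from a whistler trace is its dispersion: in the classical Eckersley approximation the arrival time of each frequency follows t = t0 + D/sqrt(f), and the dispersion D encodes the electron content along the propagation path. The sketch below fits that law to trace points by least squares; it is the textbook relation, not the paper's Virtual Trace Transformation.

```python
import math

def fit_dispersion(freqs_hz, times_s):
    """Least-squares fit of the Eckersley whistler law
    t = t0 + D/sqrt(f); returns the dispersion D (s*Hz^0.5),
    the quantity from which electron density is inferred."""
    xs = [1.0 / math.sqrt(f) for f in freqs_hz]   # regressor 1/sqrt(f)
    n = len(xs)
    mx = sum(xs) / n
    mt = sum(times_s) / n
    return (sum((x - mx) * (t - mt) for x, t in zip(xs, times_s)) /
            sum((x - mx) ** 2 for x in xs))
```

Given a detected trace (frequency-time pairs from the spectrogram), this one number is the starting point for the density inversion.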
High-efficiency aperiodic two-dimensional high-contrast-grating hologram
NASA Astrophysics Data System (ADS)
Qiao, Pengfei; Zhu, Li; Chang-Hasnain, Connie J.
2016-03-01
High efficiency phase holograms are designed and implemented using aperiodic two-dimensional (2D) high-contrast gratings (HCGs). With our design algorithm and an in-house developed rigorous coupled-wave analysis (RCWA) package for periodic 2D HCGs, the structural parameters are obtained to achieve a full 360-degree phase-tuning range of the reflected or transmitted wave, while maintaining the power efficiency above 90%. For given far-field patterns or 3D objects to reconstruct, we can generate the near-field phase distribution through an iterative process. The aperiodic HCG phase plates we design for holograms are pixelated, and the local geometric parameters for each pixel to achieve desired phase alternation are extracted from our periodic HCG designs. Our aperiodic HCG holograms are simulated using the 3D finite-difference time-domain method. The simulation results confirm that the desired far-field patterns are successfully produced under illumination at the designed wavelength. The HCG holograms are implemented on the quartz wafers, using amorphous silicon as the high-index material. We propose HCG designs at both visible and infrared wavelengths, and our simulation confirms the reconstruction of 3D objects. The high-contrast gratings allow us to realize low-cost, compact, flat, and integrable holograms with sub-micrometer thicknesses.
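The "iterative process" for generating the near-field phase distribution from a target far-field pattern is commonly realized with a Gerchberg-Saxton-style alternation between the two planes; the paper may use a different scheme, so treat this as an assumption. The sketch below is a 1-D version with a naive DFT standing in for the FFT, purely for brevity.

```python
import cmath
import random

def dft(x, inverse=False):
    """Naive O(n^2) discrete Fourier transform (stands in for an FFT)."""
    n = len(x)
    s = 1 if inverse else -1
    out = [sum(x[m] * cmath.exp(s * 2j * cmath.pi * k * m / n)
               for m in range(n)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def gerchberg_saxton(target_amp, n_iter=100, seed=1):
    """Find a unit-amplitude near-field phase whose far field (DFT)
    approximates target_amp, by alternating magnitude constraints in
    the two planes. Generic sketch, not the authors' algorithm."""
    rng = random.Random(seed)
    near = [cmath.exp(2j * cmath.pi * rng.random()) for _ in target_amp]
    for _ in range(n_iter):
        far = dft(near)
        far = [a * cmath.exp(1j * cmath.phase(f))      # impose target magnitude
               for a, f in zip(target_amp, far)]
        near = dft(far, inverse=True)
        near = [cmath.exp(1j * cmath.phase(v)) for v in near]  # unit amplitude
    return [cmath.phase(v) for v in near]
```

In the 2-D HCG setting, the resulting per-pixel phases would then be translated into local grating geometries via the precomputed periodic-HCG phase lookup.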
Multichannel FPGA based MVT system for high precision time (20 ps RMS) and charge measurement
NASA Astrophysics Data System (ADS)
Pałka, M.; Strzempek, P.; Korcyl, G.; Bednarski, T.; Niedźwiecki, Sz.; Białas, P.; Czerwiński, E.; Dulski, K.; Gajos, A.; Głowacz, B.; Gorgol, M.; Jasińska, B.; Kamińska, D.; Kajetanowicz, M.; Kowalski, P.; Kozik, T.; Krzemień, W.; Kubicz, E.; Mohhamed, M.; Raczyński, L.; Rudy, Z.; Rundel, O.; Salabura, P.; Sharma, N. G.; Silarski, M.; Smyrski, J.; Strzelecki, A.; Wieczorek, A.; Wiślicki, W.; Zieliński, M.; Zgardzińska, B.; Moskal, P.
2017-08-01
This article presents an FPGA-based Multi-Voltage Threshold (MVT) system that samples fast signals (1-2 ns rising and falling edges) in both the voltage and time domains. It achieves a time-measurement precision of 20 ps RMS and reconstructs signal charge, using a simple approach, with less than 10% deviation from the true value. Utilization of the differential inputs of an FPGA chip as comparators, together with an implementation of a TDC inside the FPGA, allowed us to achieve a compact multi-channel system characterized by low power consumption and low production costs. This paper describes the realization and functioning of a system comprising a 192-channel TDC board and four mezzanine cards which split incoming signals and discriminate them. The boards have been used to validate a newly developed time-of-flight positron emission tomography (TOF-PET) system based on plastic scintillators. The achieved full-system time resolution of σ(TOF) ≈ 68 ps is a factor of two better than that of current TOF-PET systems.
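The MVT idea can be illustrated with a crude charge reconstruction: each threshold voltage yields a rising-edge and a falling-edge crossing time, so the stack of threshold slices outlines the pulse and its area (proportional to charge) can be integrated trapezoidally. This is an illustrative reconstruction under assumed inputs, not the authors' calibrated method.

```python
def mvt_charge(thresholds, rise, fall):
    """Approximate pulse area (~charge) from multi-voltage-threshold
    samples. `thresholds` are increasing voltages; rise[i]/fall[i] are
    the crossing times of threshold i on the rising/falling edge.
    Trapezoidal integration over the threshold slices (sketch)."""
    widths = [f - r for r, f in zip(rise, fall)]   # pulse width at each level
    area = thresholds[0] * widths[0]               # rectangle below lowest level
    for i in range(len(thresholds) - 1):
        dv = thresholds[i + 1] - thresholds[i]
        area += 0.5 * (widths[i] + widths[i + 1]) * dv
    return area
```

With only a handful of thresholds the estimate is coarse (the pulse tip above the top threshold is ignored), which is why a calibrated model of the pulse shape is needed to reach the quoted accuracy.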
NASA Technical Reports Server (NTRS)
Miller, Christopher J.; Goodrick, Dan
2017-01-01
The problem of control command and maneuver induced structural loads is an important aspect of any control system design. The aircraft structure and the control architecture must be designed to achieve desired piloted control responses while limiting the imparted structural loads. The classical approach is to utilize high structural margins, restrict control surface commands to a limited set of analyzed combinations, and train pilots to follow procedural maneuvering limitations. With recent advances in structural sensing and the continued desire to improve safety and vehicle fuel efficiency, it is both possible and desirable to develop control architectures that enable lighter vehicle weights while maintaining and improving protection against structural damage. An optimal control technique has been explored and shown to achieve desirable vehicle control performance while limiting sensed structural loads to specified values. This technique has been implemented and flown on the National Aeronautics and Space Administration Full-scale Advanced Systems Testbed aircraft. The flight tests illustrate that the approach achieves the desired performance and show promising potential benefits. The flights also uncovered some important issues that will need to be addressed for production application.
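The load-limiting idea can be reduced to a scalar caricature: if a command's predicted structural load would exceed the limit, scale the command back to the largest value that keeps the predicted load inside the envelope. This is a one-line sketch of the concept under an assumed linear load model; the flight-tested system solves a full optimal control problem over many surfaces and load sensors.

```python
def limited_command(cmd, load, load_limit, sensitivity):
    """Clip a control command so the predicted load
    (current load + sensitivity * cmd) stays within +/- load_limit.
    Scalar illustration only; names and the linear model are assumed."""
    predicted = load + sensitivity * cmd
    if abs(predicted) <= load_limit:
        return cmd                      # command is safe as-is
    bound = load_limit if predicted > 0 else -load_limit
    return (bound - load) / sensitivity # largest command within the envelope
```

The multi-surface version trades off tracking error against sensed loads inside the optimizer rather than clipping after the fact.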
Special Report. States Doubt Clean Air Achievement
ERIC Educational Resources Information Center
Environmental Science and Technology, 1974
1974-01-01
This special report reviews air quality control plans formulated by each state. Comparisons of these plans and discussions on the degree of implementation achieved by state governments are presented. Problems surrounding the establishment and implementation of EPA approved plans are discussed. (JP)
ERIC Educational Resources Information Center
Balfanz, Robert; Mac Iver, Douglas J.; Byrnes, Vaughan
2006-01-01
This article reports on the first 4 years of an effort to develop comprehensive and sustainable mathematics education reforms in high poverty middle schools. In four related analyses, we examine the levels of implementation achieved and impact of the reforms on various measures of achievement in the first 3 schools to implement the Talent…
1:1 iPad Implementation: A Study on Efficacy and Achievement in Reading
ERIC Educational Resources Information Center
Orman, John Paul
2017-01-01
The purpose of this study was to determine the impact of 1:1 iPad implementation on students' reading achievement from 2013-2017 and on student efficacy in using the technology to learn at one independent middle school in Grades 5-8. To determine the impact of the 1:1 iPad initiative on reading achievement, this mixed methods study examined five years of…
Accelerating Full Configuration Interaction Calculations for Nuclear Structure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Chao; Sternberg, Philip; Maris, Pieter
2008-04-14
One of the emerging computational approaches in nuclear physics is the full configuration interaction (FCI) method for solving the many-body nuclear Hamiltonian in a sufficiently large single-particle basis space to obtain exact answers - either directly or by extrapolation. The lowest eigenvalues and corresponding eigenvectors for very large, sparse and unstructured nuclear Hamiltonian matrices are obtained and used to evaluate additional experimental quantities. These matrices pose a significant challenge to the design and implementation of efficient and scalable algorithms for obtaining solutions on massively parallel computer systems. In this paper, we describe the computational strategies employed in a state-of-the-art FCI code, MFDn (Many Fermion Dynamics - nuclear), as well as techniques we recently developed to enhance the computational efficiency of MFDn. We demonstrate the current capability of MFDn and report the latest performance improvements we have achieved. We also outline our future research directions.
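The computational kernel at the heart of such eigensolvers is a sparse matrix-vector product applied repeatedly by an iterative method. The sketch below shows a CSR matvec plus a deliberately simple shifted power iteration for the lowest eigenvalue; production codes such as MFDn use Lanczos-type iterations instead, so this is a teaching sketch of the structure, not their algorithm.

```python
import math

def csr_matvec(data, indices, indptr, x):
    """y = A @ x for a sparse matrix in CSR storage: row i uses
    entries data[indptr[i]:indptr[i+1]] at columns indices[...]."""
    n = len(indptr) - 1
    y = [0.0] * n
    for i in range(n):
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

def lowest_eigenvalue(data, indices, indptr, shift, iters=500):
    """Estimate the smallest eigenvalue of a symmetric A via power
    iteration on (shift*I - A), with `shift` chosen above the
    spectrum so the smallest eigenvalue of A becomes dominant."""
    n = len(indptr) - 1
    x = [1.0] * n
    norm = 0.0
    for _ in range(iters):
        ax = csr_matvec(data, indices, indptr, x)
        y = [shift * xi - ai for xi, ai in zip(x, ax)]
        norm = math.sqrt(sum(v * v for v in y))   # -> shift - lambda_min(A)
        x = [v / norm for v in y]
    return shift - norm
```

The parallelization challenge described in the paper is precisely distributing this matvec (and the vector orthogonalization of Lanczos) across many thousands of processors.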
Sharma, A K; Guildal, T; Thomsen, H R; Jacobsen, B N
2011-01-01
The aim of this project was to investigate the potential of reducing the number of mixers in the biological treatment process, thereby achieving energy and economic savings and contributing to a cleaner environment. The project was carried out at the Avedoere wastewater treatment plant, where a full-scale investigation was conducted to study the effect of reduced mixing on flow velocity, suspended solids sedimentation, concentration gradients of oxygen and suspended solids with depth, and treatment efficiency. The only negative effect observed was on flow velocity; however, the velocity remained above the critical velocity. The plant has been operating with 50% of its designed number of mixers since September 2007, and long-term results also confirm that reduced mixing did not have any negative effect on treatment efficiency. The estimated yearly electricity saving is 0.75 GWh/year.
Re-engineering the process of medical imaging physics and technology education and training.
Sprawls, Perry
2005-09-01
The extensive availability of digital technology provides an opportunity for enhancing both the effectiveness and efficiency of virtually all functions in the process of medical imaging physics and technology education and training. This includes degree granting academic programs within institutions and a wide spectrum of continuing education lifelong learning activities. Full achievement of the advantages of technology-enhanced education (e-learning, etc.) requires an analysis of specific educational activities with respect to desired outcomes and learning objectives. This is followed by the development of strategies and resources that are based on established educational principles. The impact of contemporary technology comes from its ability to place learners into enriched learning environments. The full advantage of a re-engineered and implemented educational process involves changing attitudes and functions of learning facilitators (teachers) and resource allocation and sharing both within and among institutions.
Detector Development for the MARE Neutrino Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galeazzi, M.; Bogorin, D.; Molina, R.
2009-12-16
The MARE experiment is designed to measure the mass of the neutrino with sub-eV sensitivity by measuring the beta decay of ¹⁸⁷Re with cryogenic microcalorimeters. A preliminary analysis shows that, to achieve the necessary statistics, between 10,000 and 50,000 detectors are likely necessary. We have fabricated and characterized iridium transition edge sensors with high reproducibility and uniformity for such a large-scale experiment. We have also started a full-scale simulation of the experimental setup for MARE, including thermalization in the absorber, detector response, and optimum filter analysis, to understand the issues related to reaching a sub-eV sensitivity and to optimize the design of the MARE experiment. We present our characterization of the Ir devices, including reproducibility, uniformity, and sensitivity, and we discuss the implementation and capabilities of our full-scale simulation.
NASA Astrophysics Data System (ADS)
Han, Chulhee; Kim, Wan Ho; Choi, Seung-Bok
2016-04-01
This paper proposes a new type of direct-drive valve (DDV) suspension system for vehicles, controlled by a piezostack actuator associated with a displacement amplifier. To achieve this goal, a new type of controllable piezostack DDV damper is designed and its damping-force performance is evaluated. Next, a full vehicle suspension system consisting of sprung mass, spring, tire and the piezostack DDV damper is constructed. After deriving the governing equations of motion for the proposed piezostack DDV suspension system, a skyhook controller is implemented for the full-vehicle model. An analytical model of the whole suspension system is then derived, and performance characteristics are analyzed through numerical simulation. Finally, vibration control responses of the vehicle suspension system, such as vertical acceleration, are evaluated under both bump and sine road conditions.
Optical design of the lightning imager for MTG
NASA Astrophysics Data System (ADS)
Lorenzini, S.; Bardazzi, R.; Di Giampietro, M.; Feresin, F.; Taccola, M.; Cuevas, L. P.
2017-11-01
The Lightning Imager for Meteosat Third Generation is an optical payload with on-board data processing for the detection of lightning. The instrument will provide global monitoring of lightning events over the full Earth disk from geostationary orbit and will operate in day and night conditions. The requirements of a large field of view together with high detection efficiency for small and weak optical pulses superimposed on a much brighter and spatially and temporally variable background (full operation during day and night conditions, seasonal variations and different albedos between clouds, oceans and land) drive the design of the optical instrument. The main challenge is to distinguish a true lightning event from false events generated by random noise (e.g. background shot noise), sun-glint diffusion or signal variations originated by microvibrations. This can be achieved thanks to 'multi-dimensional' filtering, working simultaneously in the spectral, spatial and temporal domains. The spectral filtering is achieved with a very narrowband filter centred on the bright lightning O2 triplet line (777.4 nm +/- 0.17 nm). The spatial filtering is achieved with a ground sampling distance significantly smaller (between 4 and 5 km at the sub-satellite point) than the dimensions of a typical lightning pulse. The temporal filtering is achieved by sampling the Earth disk continuously with a period close to 1 ms. This paper presents the status of the optical design, addressing the trade-off between different configurations and detailing the design and analyses of the current baseline. Emphasis is given to the design drivers and the solutions implemented, in particular concerning the spectral filtering and the optimisation of the signal-to-noise ratio.
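The temporal leg of such a filter can be caricatured per pixel: maintain a running estimate of the slowly varying background and its noise scale, and flag only samples that jump well above both. The exponential-moving-average scheme and all parameters below are illustrative assumptions, not the instrument's on-board processing.

```python
def detect_transients(samples, k=5.0, alpha=0.1):
    """Flag sample indices exceeding a running background estimate by
    k times a running deviation scale. Sketch of per-pixel temporal
    filtering; the real system also filters spectrally and spatially."""
    bg, dev, hits = samples[0], 1.0, []
    for t, s in enumerate(samples):
        if s - bg > k * dev:
            hits.append(t)              # candidate lightning pulse
        else:
            bg += alpha * (s - bg)      # track background on quiet samples
            dev += alpha * (abs(s - bg) - dev)
    return hits
```

Candidates surviving this stage would then be cross-checked against the narrowband spectral selection and the spatial footprint expected of a real pulse.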
Investigate zero-stress replicated optics
NASA Technical Reports Server (NTRS)
Engelhaupt, Darell; Rood, Robert
1993-01-01
The contracted activities for the procurement 'Investigate Zero-Stress Replicated Optics', in support of the AXAF-S x-ray spectrometer mirrors, have been completed. To date, four large Wolter I grazing-incidence x-ray optical shells have been electroformed from nickel. The mirrors were fabricated by using each of two nickel-alloy-plated aluminum substrates twice. A wide variety of testing has been completed by NASA MSFC and UAH, including heat treatment control tests, subscale plating and fixture testing, alloy control of the electroless nickel, adhesion and release testing of the gold to electroless nickel, electroforming instrumentation and software, and fabrication of subscale models. The full-scale shells are one millimeter of nickel electrodeposited over a thin gold layer, which in turn carries the optical surface on the inside. The optical surface replicates the surface prepared on the substrate. Appendix I briefly outlines the fabrication process. Major objectives shared by UAH and MSFC included the design of facilities, equipment, and tooling and the procurement of materials and equipment. Process development followed, with the fabrication of small-scale pilot units. Procurement commenced immediately, and equipment and materials were ordered to support fabrication of first-surface full-scale substrates (mandrels) and the second-surface electroformed optical components. All principal objectives have been achieved. Inspection of the mirrors in visible and x-ray modes validates that the required performance and quality can be achieved by an electroforming replication process. A distinct progressive improvement was achieved with each of the four mirrors produced. The final mirror exceeded the original goals and set an improved standard for flight hardware. The future goal of 30 arc second resolution at 8 keV x-rays appears achievable by this process when proper cleanliness and process control are maintained.
Modeling health gains and cost savings for ten dietary salt reduction targets.
Wilson, Nick; Nghiem, Nhung; Eyles, Helen; Mhurchu, Cliona Ni; Shields, Emma; Cobiac, Linda J; Cleghorn, Christine L; Blakely, Tony
2016-04-26
Dietary salt reduction is included in the top five priority actions for non-communicable disease control internationally. We therefore aimed to identify the health gains and cost impacts of achieving a national target for sodium reduction, along with component targets in different food groups. We used an established dietary sodium intervention model to study 10 interventions to achieve sodium reduction targets. The 2011 New Zealand (NZ) adult population (2.3 million aged 35+ years) was simulated over the remainder of their lifetime in a Markov model with a 3% discount rate. Achieving an overall 35% reduction in dietary salt intake via implementation of mandatory maximum levels of sodium in packaged foods, along with reduced sodium from fast food/restaurant food and discretionary intake (the "full target"), was estimated to gain 235,000 QALYs over the lifetime of the cohort (95% uncertainty interval [UI]: 176,000 to 298,000). Gains for the individual target components ranged from 122,000 QALYs (packaged foods target) down to 6100 QALYs (snack foods target, representing a 34-48% sodium reduction in such products). All ten target interventions studied were cost-saving, with the greatest costs saved by the mandatory "full target" at NZ$1260 million (US$820 million). Health gains per adult were relatively greater for men and for Māori (the indigenous population). This work provides modeling-level evidence that achieving dietary sodium reduction targets (including specific food category targets) could generate large health gains and cost savings for a national health sector. Demographic groups with the highest cardiovascular disease rates stand to gain the most, assisting in reducing health inequalities between sex and ethnic groups.
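The lifetime QALY accounting with a 3% discount rate, as used in the Markov model above, can be sketched in a few lines; the per-year gains below are hypothetical placeholders, not outputs of the study's model:

```python
# Illustrative sketch: discounting annual QALY gains at 3% per year, as in
# the Markov model described above. The per-year gains are hypothetical.
def discounted_qalys(annual_gains, rate=0.03):
    """Sum yearly QALY gains, discounting each year at the given rate."""
    return sum(g / (1.0 + rate) ** t for t, g in enumerate(annual_gains))

# A flat hypothetical gain of 1.0 QALY/year over 20 years:
total = discounted_qalys([1.0] * 20)
print(round(total, 2))  # 15.32 -- discounting shrinks the undiscounted 20
```

Discounting is why distant health gains (and cost savings) count for less than near-term ones in such lifetime models.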
Factors Associated With Full Implementation of Scope of Practice.
Ganz, Freda DeKeyser; Toren, Orly; Fadlon, Yafit
2016-05-01
To describe whether nurses fully implement their scope of practice, nurses' perceptions of future practice implementation, and the association of scope-of-practice implementation with professional autonomy and self-efficacy. A descriptive correlational study was conducted using a convenience sample of 145 registered nurses with post-basic certification from two Israeli university hospitals, from May 2012 to September 2013. Five questionnaires were distributed: (a) Demographic and Work Characteristics, (b) Implementation of Scope of Practice, (c) Attitudes Towards Future Practice, (d) Practice Behavior Scale, and (e) Practice Self-Efficacy. Descriptive statistics for all demographic and questionnaire data were analyzed. Two regression models were developed, in which current and future implementation were the criterion variables and demographic and work characteristics, professional autonomy, and self-efficacy were the predictors. High levels of professional autonomy, self-efficacy, and positive attitudes towards future practice were found, in contrast to low or moderate levels of current implementation of the full extent of scope of practice. The primary reasons associated with low implementation were lack of relevance to practice and lack of permission to perform the practice. Significant associations were found between professional autonomy, self-efficacy, and attitudes towards future practice, but not with current implementation. Nurses wanted to practice to the full extent of their scope of practice and felt able to do so, but were hindered by administrative rather than personal barriers. Even though staff nurses with post-basic certification had high levels of professional autonomy and self-efficacy, many were not implementing the full extent of their scope of practice. Consistent with findings from around the world, external factors such as administrative and policy barriers were found to thwart implementation of nurses' full scope of practice.
Therefore, practicing nurses should be aware of these barriers and work towards reducing them. © 2016 Sigma Theta Tau International.
Co-simulation of a complete rectenna with a circular slot loop antenna in CPW technology
NASA Astrophysics Data System (ADS)
Rivière, Jérôme; Douyère, Alexandre; Cazour, Jonathan; Alicalapa, Frédéric; Luk, Jean-Daniel Lan Sun
2017-05-01
This study starts with the design of a planar and compact CPW antenna fabricated on an Arlon AD1000 substrate (εr = 10.35). The antenna is a coplanar waveguide (CPW) fed circular slot loop antenna matched to the standard 50 Ω impedance by two stubs. The goal is to implement this antenna with a CPW RF/DC rectifier to build an optimized low-power-level rectenna. The rectenna design is constrained to allow easy and fast fabrication of an array with high reproducibility. The full rectenna is simulated and achieves 10% efficiency at -20 dBm.
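As a quick sanity check on the quoted operating point (10% RF-to-DC efficiency at an input of -20 dBm), power levels in dBm convert to watts as follows; the helper function name is ours:

```python
def dbm_to_watts(p_dbm):
    """Convert an RF power level in dBm to watts (0 dBm = 1 mW)."""
    return 1e-3 * 10 ** (p_dbm / 10.0)

p_in = dbm_to_watts(-20)   # -20 dBm = 10 microwatts of incident RF power
p_dc = 0.10 * p_in         # 10% RF-to-DC conversion efficiency
print(p_in, p_dc)          # about 1e-05 W in, 1e-06 W of DC out
```

At these microwatt levels, even small diode losses dominate, which is why low-power rectenna efficiency figures are much lower than those quoted at higher input powers.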
Protecting global soil resources for future generations
NASA Astrophysics Data System (ADS)
Montanarella, Luca
2017-04-01
The latest Status of the World's Soil Resources report has highlighted that soils are increasingly under pressure from numerous human-induced degradation processes in most parts of the world. The limits of our planetary boundaries concerning vital soil resources have been reached, and without reversing this negative trend there will be a serious lack of necessary soil resources for future generations. It has therefore been of the highest importance to include soils within some of the Sustainable Development Goals (SDGs) recently approved by the United Nations. Sustainable development cannot be achieved without protecting the limited, non-renewable soil resources of our planet. There is a need to limit ongoing soil degradation processes and to implement extensive soil restoration activities in order to strive towards a land degradation neutral (LDN) world, as called for by SDG 15. Sustainable soil management needs to be placed at the core of any LDN strategy, and it is therefore of the highest importance that the recently approved Voluntary Guidelines for Sustainable Soil Management (VGSSM) of FAO be fully implemented at national and local scales. Sustainable soil management is relevant not only for the protection of fertile soils for food production, but also for mitigating and adapting to climate change and for preserving the large soil biodiversity pool. The VGSSM are therefore relevant not only to FAO, but also to the climate change convention (UNFCCC) and the biodiversity convention (CBD). An integrated assessment of current land degradation processes and available land restoration practices is needed in order to fully evaluate the potential for effectively achieving LDN by 2030.
The ongoing Land Degradation and Restoration Assessment (LDRA) of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) will provide the necessary scientific basis for full implementation of the measures required to achieve the land- and soil-relevant SDGs by 2030.
[Implementation of precision control to achieve the goal of schistosomiasis elimination in China].
Zhou, Xiao-nong
2016-02-01
The integrated strategy for schistosomiasis control with a focus on infectious source control, implemented since 2004, accelerated progress towards schistosomiasis control in China. By the end of 2015, transmission control of the disease had been achieved across the country, fulfilling on schedule the overall objective of the Mid- and Long-term National Plan for Prevention and Control of Schistosomiasis (2004-2015). In 2014, China proposed the new goal of schistosomiasis elimination by 2025. To achieve this new goal on schedule, we must address the key remaining issues and implement precision control measures with more precise identification of control targets, so that the potential factors leading to resurgence of schistosomiasis transmission can be completely eradicated and elimination achieved on schedule. Precision schistosomiasis control, a theoretical innovation applying the conception of precision medicine to schistosomiasis control, will provide new insights into control efforts. This paper describes the definition, interventions, and role of precision schistosomiasis control in the elimination of schistosomiasis in China, and demonstrates that sustainable improvement of professionals and integrated control capability at the grass-roots level is a prerequisite for implementing schistosomiasis control; that precision schistosomiasis control is key to further implementation of the integrated strategy with its focus on infectious source control; and that precision schistosomiasis control is a guarantee of curing schistosomiasis patients and implementing the schistosomiasis control program and interventions.
Hazelton, Patrick T.; Steward, Wayne T.; Collins, Shane P.; Gaffney, Stuart; Morin, Stephen F.; Arnold, Emily A.
2014-01-01
Background In preparation for full Affordable Care Act implementation, California has instituted two healthcare initiatives that provide comprehensive coverage for previously uninsured or underinsured individuals. For many people living with HIV, this has required transition either from the HIV-specific coverage of the Ryan White program to the more comprehensive coverage provided by the county-run Low-Income Health Programs or from Medicaid fee-for-service to Medicaid managed care. Patient advocates have expressed concern that these transitions may present implementation challenges that will need to be addressed if ambitious HIV prevention and treatment goals are to be achieved. Methods 30 semi-structured, in-depth interviews were conducted between October, 2012, and February, 2013, with policymakers and providers in 10 urban, suburban, and rural California counties. Interview topics included: continuity of patient care, capacity to handle payer source transitions, and preparations for healthcare reform implementation. Study team members reviewed interview transcripts to produce emergent themes, develop a codebook, build inter-rater reliability, and conduct analyses. Results Respondents supported the goals of the ACA, but reported clinic and policy-level challenges to maintaining patient continuity of care during the payer source transitions. They also identified strategies for addressing these challenges. Areas of focus included: gaps in communication to reach patients and develop partnerships between providers and policymakers, perceived inadequacy in new provider networks for delivering quality HIV care, the potential for clinics to become financially insolvent due to lower reimbursement rates, and increased administrative burdens for clinic staff and patients. Conclusions California's new healthcare initiatives represent ambitious attempts to expand and improve health coverage for low-income individuals. 
The state's challenges in maintaining quality care and treatment for people living with HIV experiencing these transitions demonstrate the importance of setting effective policies in anticipation of full ACA implementation in 2014. PMID:24599337
Franklin, Mariza Ramalho; Fernandes, Horst Monken
2013-05-01
Environmental remediation of radioactive contamination is about achieving an appropriate reduction of exposures to ionizing radiation. This goal can be achieved by isolating or removing the contamination source(s) or by breaking the exposure pathways. Ideally, environmental remediation is part of the planning phase of any industrial operation with the potential to cause environmental contamination; this concept is even more important in mining operations because of the significant impacts produced. This approach was not considered in several operations developed in the past, so many legacy sites face the challenge of implementing appropriate remediation plans. One of the first barriers to remediation work is the lack of financial resources, as environmental issues were treated in the past as marginal costs and were not included in the overall budget of the company. This paper analyses the situation of the former uranium production site of Poços de Caldas in Brazil. It is demonstrated that, in addition to the lack of resources, other barriers will play key roles in preventing the implementation of remediation programs: the lack of information on site characteristics, of an appropriate regulatory framework, of funding mechanisms, of stakeholder involvement, of policy and strategy, of technical experience, and of mechanisms for securing adequate technical expertise. All these barriers are discussed and some solutions are suggested. It is expected that lessons learned from the Poços de Caldas legacy site may stimulate the development of more sustainable options in future uranium production centers. Copyright © 2011 Elsevier Ltd. All rights reserved.
PRAXIS: a near infrared spectrograph optimised for OH suppression
NASA Astrophysics Data System (ADS)
Ellis, S. C.; Bauer, S.; Bland-Hawthorn, J.; Case, S.; Content, R.; Fechner, T.; Giannone, D.; Haynes, R.; Hernandez, E.; Horton, A. J.; Klauser, U.; Lawrence, J. S.; Leon-Saval, S. G.; Lindley, E.; Löhmannsröben, H.-G.; Min, S.-S.; Pai, N.; Roth, M.; Shortridge, K.; Staszak, Nicholas F.; Tims, Julia; Xavier, Pascal; Zhelem, Ross
2016-08-01
Atmospheric emission from OH molecules is a long standing problem for near-infrared astronomy. PRAXIS is a unique spectrograph, currently in the build-phase, which is fed by a fibre array that removes the OH background. The OH suppression is achieved with fibre Bragg gratings, which were tested successfully on the GNOSIS instrument. PRAXIS will use the same fibre Bragg gratings as GNOSIS in the first implementation, and new, less expensive and more efficient, multicore fibre Bragg gratings in the second implementation. The OH lines are suppressed by a factor of 1000, and the expected increase in the signal-to-noise in the interline regions compared to GNOSIS is a factor of 9 with the GNOSIS gratings and a factor of 17 with the new gratings. PRAXIS will enable the full exploitation of OH suppression for the first time, which was not achieved by GNOSIS due to high thermal emission, low spectrograph transmission, and detector noise. PRAXIS will have extremely low thermal emission, through the cooling of all significantly emitting parts, including the fore-optics, the fibre Bragg gratings, a long length of fibre, and a fibre slit, and an optical design that minimises leaks of thermal emission from outside the spectrograph. PRAXIS will achieve low detector noise through the use of a Hawaii-2RG detector, and a high throughput through an efficient VPH based spectrograph. The scientific aims of the instrument are to determine the absolute level of the interline continuum and to enable observations of individual objects via an IFU. PRAXIS will first be installed on the AAT, then later on an 8m class telescope.
The National Shipbuilding Research Program. Shipyard MACT Implementation Plan and Compliance Tools
1996-06-01
Report documentation page, June 1996. Recoverable contents: Section One: Interpretation of the Shipyard Maximum Achievable Control Technology; Section Two: Model Shipyard Implementation Plan; Section Three: Thinning Ratio Calculation Sheets for Options 2 & 3; and EPA's Maximum Achievable Control Technology Rule for Shipyards: A Plain English …
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-26
…] Guidance for Industry: Implementation of an Acceptable Full-Length and Abbreviated Donor History … Full-Length and Abbreviated Donor History Questionnaires and Accompanying Materials for Use in … full-length and abbreviated donor history questionnaires and accompanying materials, version 1.2 dated …
Is searching full text more effective than searching abstracts?
Lin, Jimmy
2009-01-01
Background With the growing availability of full-text articles online, scientists and other consumers of the life sciences literature now have the ability to go beyond searching bibliographic records (title, abstract, metadata) to directly access full-text content. Motivated by this emerging trend, I posed the following question: is searching full text more effective than searching abstracts? This question is answered by comparing text retrieval algorithms on MEDLINE® abstracts, full-text articles, and spans (paragraphs) within full-text articles using data from the TREC 2007 genomics track evaluation. Two retrieval models are examined: BM25 and the ranking algorithm implemented in the open-source Lucene search engine. Results Experiments show that treating an entire article as an indexing unit does not consistently yield higher effectiveness compared to abstract-only search. However, retrieval based on spans, or paragraph-sized segments of full-text articles, consistently outperforms abstract-only search. Results suggest that the highest overall effectiveness may be achieved by combining evidence from spans and full articles. Conclusion Users searching full text are more likely to find relevant articles than those searching only abstracts. This finding affirms the value of full-text collections for text retrieval and provides a starting point for future work in exploring algorithms that take advantage of rapidly growing digital archives. Experimental results also highlight the need to develop distributed text retrieval algorithms, since full-text articles are significantly longer than abstracts and may require the computational resources of multiple machines in a cluster. The MapReduce programming model provides a convenient framework for organizing such computations. PMID:19192280
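For readers unfamiliar with the first retrieval model, a minimal generic BM25 scorer can be sketched as follows; this is the textbook formula, not Lucene's implementation, and the toy corpus is invented for illustration:

```python
import math
from collections import Counter

def bm25_score(query, doc, corpus, k1=1.2, b=0.75):
    """Score one tokenized document against a query with standard BM25."""
    avgdl = sum(len(d) for d in corpus) / len(corpus)
    n_docs = len(corpus)
    tf = Counter(doc)
    score = 0.0
    for term in query:
        df = sum(1 for d in corpus if term in d)          # document frequency
        idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1.0)
        f = tf[term]                                       # term frequency
        score += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(doc) / avgdl))
    return score

# Toy "span"-sized documents (invented), scored against a two-term query:
corpus = [["gene", "expression", "profile"],
          ["protein", "binding", "assay"],
          ["gene", "binding", "site"]]
scores = [bm25_score(["gene", "binding"], d, corpus) for d in corpus]
best = max(range(len(corpus)), key=lambda i: scores[i])
print(best)  # 2 -- the only document containing both query terms
```

Scoring paragraph-sized spans rather than whole articles keeps the length-normalization term (the `b` component) meaningful, which is one intuition behind the span-retrieval result above.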
Full resolution hologram-like autostereoscopic display
NASA Technical Reports Server (NTRS)
Eichenlaub, Jesse B.; Hutchins, Jamie
1995-01-01
Under this program, Dimension Technologies Inc. (DTI) developed a prototype display that uses a proprietary illumination technique to create autostereoscopic hologram-like full resolution images on an LCD operating at 180 fps. The resulting 3D image possesses a resolution equal to that of the LCD along with properties normally associated with holograms, including change of perspective with observer position and lack of viewing position restrictions. Furthermore, this autostereoscopic technique eliminates the need to wear special glasses to achieve the parallax effect. Under the program a prototype display was developed which demonstrates the hologram-like full resolution concept. To implement such a system, DTI explored various concept designs and enabling technologies required to support those designs. Specifically required were: a parallax illumination system with sufficient brightness and control; an LCD with rapid address and pixel response; and an interface to an image generation system for creation of computer graphics. Of the possible parallax illumination system designs, we chose a design which utilizes an array of fluorescent lamps. This system creates six sets of illumination areas to be imaged behind an LCD. This controlled illumination array is interfaced to a lenticular lens assembly which images the light segments into thin vertical light lines to achieve the parallax effect. This light line formation is the foundation of DTI's autostereoscopic technique. The David Sarnoff Research Center (Sarnoff) was subcontracted to develop an LCD that would operate with a fast scan rate and pixel response. Sarnoff chose a surface mode cell technique and produced the world's first large area pi-cell active matrix TFT LCD. The device provided adequate performance to evaluate five different perspective stereo viewing zones. A Silicon Graphics' Iris Indigo system was used for image generation which allowed for static and dynamic multiple perspective image rendering. 
During the development of the prototype display, we identified many critical issues associated with implementing such a technology. Testing and evaluation enabled us to prove that this illumination technique provides autostereoscopic 3D multi perspective images with a wide range of view, smooth transition, and flickerless operation given suitable enabling technologies.
Implementation of a Health Promotion Programme: A Ten-Year Retrospective Study
ERIC Educational Resources Information Center
Darlington, Emily Joan; Simar, Carine; Jourdan, Didier
2017-01-01
Purpose: Implementing health promotion programmes in schools is key to improving children's health and well-being but difficulties in achieving expected results are often reported in the research literature. Discrepancies between expected and achieved outcomes can originate from differences in contexts. Understanding how interactions between…
Black, Robert E; Taylor, Carl E; Arole, Shobha; Bang, Abhay; Bhutta, Zulfiqar A; Chowdhury, A Mushtaque R; Kirkwood, Betty R; Kureshy, Nazo; Lanata, Claudio F; Phillips, James F; Taylor, Mary; Victora, Cesar G; Zhu, Zonghan; Perry, Henry B
2017-06-01
The contributions that community-based primary health care (CBPHC) and engaging with communities as valued partners can make to the improvement of maternal, neonatal and child health (MNCH) are not widely appreciated. This unfortunate reality is one of the reasons why so many priority countries failed to achieve the health-related Millennium Development Goals by 2015. This article provides a summary of a series of articles about the effectiveness of CBPHC in improving MNCH and offers recommendations from an Expert Panel for strengthening CBPHC that were formulated in 2008 and have been updated on the basis of more recent evidence. The Expert Panel was convened to guide the review of the effectiveness of CBPHC; it met in 2008 in New York City with senior UNICEF staff. In 2016, following the completion of the review, the Panel considered the review's findings and made recommendations. The review consisted of an analysis of 661 unique reports, including 583 peer-reviewed journal articles, 12 books/monographs, 4 book chapters, and 72 reports from the gray literature. The analysis consisted of 700 assessments, since 39 reports were analyzed twice (once for an assessment of improvements in neonatal and/or child health and once for an assessment in maternal health). The Expert Panel recommends that CBPHC should be a priority for strengthening health systems, accelerating progress in achieving universal health coverage, and ending preventable child and maternal deaths. The Panel also recommends that expenditures for CBPHC be monitored against expenditures for primary health care facilities and hospitals and reflect the importance of CBPHC for averting mortality.
Governments, government health programs, and NGOs should develop health systems that respect and value communities as full partners and work collaboratively with them in building and strengthening CBPHC programs - through engagement with planning, implementation (including the full use of community-level workers), and evaluation. CBPHC programs need to reach every community and household in order to achieve universal coverage of key evidence-based interventions that can be implemented in the community outside of health facilities and assure that those most in need are reached. Stronger CBPHC programs that foster community engagement/empowerment with the implementation of evidence-based interventions will be essential for achieving universal coverage of health services by 2030 (as called for by the Sustainable Development Goals recently adopted by the United Nations), ending preventable child and maternal deaths by 2030 (as called for by the World Health Organization, UNICEF, and many countries around the world), and eventually achieving Health for All as envisioned at the International Conference on Primary Health Care in 1978. Stronger CBPHC programs can also create entry points and synergies for expanding the coverage of family planning services as well as for accelerating progress in the detection and treatment of HIV/AIDS, tuberculosis, malaria, hypertension, and other chronic diseases. Continued strengthening of CBPHC programs based on rigorous ongoing operations research and evaluation will be required, and this evidence will be needed to guide national and international policies and programs.
Jarrett, P Gary
2006-01-01
The primary purpose of this study is to undertake a diagnostic investigation of the international health care logistical environment, to determine whether regulatory policies or industry procedures have hindered the implementation of just-in-time (JIT) systems, and to recommend operational improvements achievable by implementing JIT systems. An extensive literature review was conducted, and the analysis systematically compared the anticipated benefits of a health care JIT implementation with the cost and benefit outcomes validated in the manufacturing, service, and retail industries. Chiefly, it was found that the health service market must be restructured to encourage greater price competition among providers, and that a new standardization process should eliminate duplication of products and realize substantial savings.
Large-scale seismic waveform quality metric calculation using Hadoop
NASA Astrophysics Data System (ADS)
Magana-Zook, S.; Gaylord, J. M.; Knapp, D. R.; Dodge, D. A.; Ruppert, S. D.
2016-09-01
In this work we investigated the suitability of Hadoop MapReduce and Apache Spark for large-scale computation of seismic waveform quality metrics by comparing their performance with that of a traditional distributed implementation. The Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) provided 43 terabytes of broadband waveform data of which 5.1 TB of data were processed with the traditional architecture, and the full 43 TB were processed using MapReduce and Spark. Maximum performance of 0.56 terabytes per hour was achieved using all 5 nodes of the traditional implementation. We noted that I/O dominated processing, and that I/O performance was deteriorating with the addition of the 5th node. Data collected from this experiment provided the baseline against which the Hadoop results were compared. Next, we processed the full 43 TB dataset using both MapReduce and Apache Spark on our 18-node Hadoop cluster. These experiments were conducted multiple times with various subsets of the data so that we could build models to predict performance as a function of dataset size. We found that both MapReduce and Spark significantly outperformed the traditional reference implementation. At a dataset size of 5.1 terabytes, both Spark and MapReduce were about 15 times faster than the reference implementation. Furthermore, our performance models predict that for a dataset of 350 terabytes, Spark running on a 100-node cluster would be about 265 times faster than the reference implementation. We do not expect that the reference implementation deployed on a 100-node cluster would perform significantly better than on the 5-node cluster because the I/O performance cannot be made to scale. 
Finally, we note that although Big Data technologies clearly provide a way to process seismic waveform datasets in a high-performance and scalable manner, the technology is still rapidly changing, requires a high degree of investment in personnel, and will likely require significant changes in other parts of our infrastructure. Nevertheless, we anticipate that as the technology matures and third-party tool vendors make it easier to manage and operate clusters, Hadoop (or a successor) will play a large role in our seismic data processing.
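The reported speedups can be framed with a simple throughput model in which the cluster's processing rate scales linearly with node count while the reference implementation's I/O-bound rate does not; the rates below echo figures from the abstract, but the model itself is our simplification, not the authors' fitted performance model:

```python
# Hypothetical linear-throughput model: a cluster processes data at a fixed
# rate per node; the reference system's I/O-bound rate cannot scale.
def speedup(dataset_tb, cluster_nodes, tb_per_node_hour, reference_tb_hour):
    cluster_hours = dataset_tb / (cluster_nodes * tb_per_node_hour)
    reference_hours = dataset_tb / reference_tb_hour
    return reference_hours / cluster_hours

# With these placeholder rates (0.5 TB/node/hour assumed; 0.56 TB/hour is
# the reference figure from the abstract), speedup is independent of
# dataset size and grows linearly with node count:
print(speedup(43.0, 18, 0.5, 0.56))   # ~16x for an 18-node cluster
```

Under this simplification, the quoted 265x prediction for a 100-node cluster implies a higher effective per-node rate than the placeholder used here, consistent with Spark's in-memory advantages at scale.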
Force and Moment Approach for Achievable Dynamics Using Nonlinear Dynamic Inversion
NASA Technical Reports Server (NTRS)
Ostroff, Aaron J.; Bacon, Barton J.
1999-01-01
This paper describes a general form of nonlinear dynamic inversion control for use in a generic nonlinear simulation to evaluate candidate augmented aircraft dynamics. The implementation is specifically tailored to the task of quickly assessing an aircraft's control power requirements and defining the achievable dynamic set. The achievable set is evaluated while undergoing complex mission maneuvers, and perfect tracking will be accomplished when the desired dynamics are achievable. Variables are extracted directly from the simulation model each iteration, so robustness is not an issue. Included in this paper is a description of the implementation of the forces and moments from simulation variables, the calculation of control effectiveness coefficients, methods for implementing different types of aerodynamic and thrust vectoring controls, adjustments for control effector failures, and the allocation approach used. A few examples illustrate the perfect tracking results obtained.
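The allocation step described above, solving for effector commands from desired dynamics and control-effectiveness coefficients extracted from the simulation, can be sketched in a least-squares form; the effectiveness matrix and commanded accelerations below are hypothetical, not values from the paper:

```python
# Minimal sketch of the allocation step in nonlinear dynamic inversion:
# given desired angular accelerations and a control-effectiveness matrix
# (both taken from the simulation each iteration), solve for effector
# deflections in a least-squares sense. All numbers here are hypothetical.
import numpy as np

B = np.array([[2.0, 0.5, 0.0],    # roll effectiveness of three effectors
              [0.0, 1.5, 0.3],    # pitch
              [0.1, 0.0, 1.0]])   # yaw
desired_accel = np.array([1.0, -0.5, 0.2])   # commanded, rad/s^2

u, *_ = np.linalg.lstsq(B, desired_accel, rcond=None)
achieved = B @ u
print(np.allclose(achieved, desired_accel))  # True: within control power
```

When the commanded dynamics exceed control power (or an effector fails and its column is zeroed), the least-squares residual is nonzero, which is exactly how such a scheme exposes the boundary of the achievable dynamic set.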
NASA Astrophysics Data System (ADS)
Rusijono; Khotimah, K.
2018-01-01
The purpose of this research was to investigate the effect of implementing an assessment model based on character building on students' discipline and achievement. The model includes three components: student behaviour, effort, and achievement. It was implemented in the science philosophy and educational assessment courses of the Graduate Program of the Educational Technology Department, Educational Faculty, Universitas Negeri Surabaya. The research used a pre-test/post-test control group design. Data were collected through observation and testing: observation captured student discipline during instruction, while tests measured student achievement. A t-test was applied in the data analysis. The results showed that the assessment model based on character building improved both discipline and student achievement.
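The t-test comparison between treatment and control groups described above can be sketched as follows; the achievement scores are invented for illustration, and Welch's unequal-variance form is one reasonable choice (the study does not specify which variant was used):

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    def mean(x):
        return sum(x) / len(x)
    def var(x):  # unbiased sample variance
        m = mean(x)
        return sum((v - m) ** 2 for v in x) / (len(x) - 1)
    se = math.sqrt(var(sample_a) / len(sample_a) + var(sample_b) / len(sample_b))
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical post-test achievement scores for the two groups:
treatment = [82, 85, 78, 90, 88, 84]
control = [75, 70, 80, 72, 78, 74]
t = welch_t(treatment, control)
print(round(t, 2))  # 4.18: a large statistic favouring the treatment group
```

In practice the statistic would be compared against the t distribution (e.g. via `scipy.stats.ttest_ind`) to obtain a p-value.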
ERIC Educational Resources Information Center
Reid, Sandy D.
2010-01-01
The purpose of this study was to determine (a) if the reading program adopted by Sally D. Meadows enhanced the achievement of students placed in the Early Intervention Program (EIP); (b) if the students' reading achievement scores increased more after the second year of implementation than they did after the first year of implementation; and (c)…
An Orbit And Dispersion Correction Scheme for the PEP II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Y.; Donald, M.; Shoaee, H.
2011-09-01
To achieve optimum luminosity in a storage ring it is vital to control the residual vertical dispersion. In the original PEP storage ring, a scheme to control the residual dispersion function was implemented using the ring orbit as the controlling element; the 'best' orbit does not necessarily give the lowest vertical dispersion. A similar scheme has been implemented in both the on-line control code and in the simulation code LEGO. The method involves finding the response matrices (the sensitivity of the orbit/dispersion at each beam-position monitor (BPM) to each orbit corrector) and solving in a least-squares sense for minimum orbit, dispersion function, or both. The optimum solution is usually a subset of the full least-squares solution. A scheme for simultaneously correcting the orbit and dispersion has been implemented in the simulation code and on-line control system for PEP-II. The scheme is based on the eigenvector decomposition method. An important ingredient of the scheme is choosing the optimum eigenvectors that minimize the orbit, dispersion, and corrector strength. Simulations indicate this to be a very effective way to control the vertical residual dispersion.
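The truncated least-squares solve described above can be sketched with an SVD, where dropping the weakest singular modes keeps corrector strengths small. The matrix sizes, random seed, and number of retained modes below are illustrative assumptions, not PEP-II parameters.

```python
import numpy as np

def correct_orbit(response, orbit, n_modes):
    """Solve response @ kicks ~ -orbit in a least-squares sense,
    keeping only the strongest n_modes singular vectors so that
    corrector strengths stay small (truncated eigenvector solution)."""
    U, s, Vt = np.linalg.svd(response, full_matrices=False)
    # Project the measured orbit onto the retained modes only.
    coeffs = (U.T @ (-orbit))[:n_modes] / s[:n_modes]
    return Vt[:n_modes].T @ coeffs

# Toy example: 20 BPMs, 8 correctors, orbit produced by unknown kicks.
rng = np.random.default_rng(0)
R = rng.normal(size=(20, 8))       # response matrix (BPM x corrector)
orbit = R @ rng.normal(size=8)     # measured orbit
kicks = correct_orbit(R, orbit, n_modes=6)
residual = np.linalg.norm(orbit + R @ kicks)
```

Retaining fewer modes than correctors trades a slightly larger residual orbit for smaller, better-conditioned corrector settings, which is the point of choosing the optimum eigenvector subset.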
Solving Nonlinear Euler Equations with Arbitrary Accuracy
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.
2005-01-01
A computer program that efficiently solves the time-dependent, nonlinear Euler equations in two dimensions to an arbitrarily high order of accuracy has been developed. The program implements a modified form of a prior arbitrary-accuracy simulation algorithm that is a member of the class of algorithms known in the art as modified expansion solution approximation (MESA) schemes. Whereas millions of lines of code were needed to implement the prior MESA algorithm, it is possible to implement the present MESA algorithm by use of one or a few pages of Fortran code, the exact amount depending on the specific application. The ability to solve the Euler equations to arbitrarily high accuracy is especially beneficial in simulations of aeroacoustic effects in settings in which fully nonlinear behavior is expected - for example, at stagnation points of fan blades, where linearizing assumptions break down. At these locations, it is necessary to solve the full nonlinear Euler equations, and inasmuch as the acoustical energy is of the order of 4 to 5 orders of magnitude below that of the mean flow, it is necessary to achieve an overall fractional error of less than 10⁻⁶ in order to faithfully simulate entropy, vortical, and acoustical waves.
Implementation of the PANDA Planar-GEM tracking detector in Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Divani Veis, Nazila; Ehret, Andre; Firoozabadi, Mohammad M.; Karabowicz, Radoslaw; Maas, Frank; Saito, Nami; Saito, Takehiko R.; Voss, Bernd; PANDA Gem-Tracker Subgroup
2018-02-01
The PANDA experiment at FAIR will be performed to investigate different aspects of hadron physics using anti-proton beams interacting with a fixed nuclear target. The experimental setup consists of a complex series of detector components covering a large solid angle. A detector with a gaseous active medium equipped with the gas electron multiplier (GEM) technique will be employed to measure tracks of charged particles in the forward direction in order to achieve a high momentum resolution. In this work, a full setup of the GEM tracking detector has been implemented in the PANDA Monte Carlo simulation package (PandaRoot) based on the current technical and conceptual design, and the expected performance of the PANDA GEM-tracking detector has been investigated. Furthermore, material-budget studies in terms of the radiation length of the PANDA GEM-tracking detector have been made in order to investigate the effect of the detector materials and its associated structures on particle measurements.
2011-09-01
The transfer of new technologies (e.g., evidence-based practices) into substance abuse treatment organizations often occurs long after they have been developed and shown to be effective. Transfer is slowed, in part, due to a lack of clear understanding about all that is needed to achieve full implementation of these technologies. Such misunderstanding is exacerbated by inconsistent terminology and overlapping models of an innovation, including its development and validation, dissemination to the public, and implementation or use in the field. For this reason, a workgroup of the Addiction Technology Transfer Center (ATTC) Network developed a field-driven conceptual model of the innovation process that more precisely defines relevant terms and concepts and integrates them into a comprehensive taxonomy. The proposed definitions and conceptual framework will allow for improved understanding and consensus regarding the distinct meaning and conceptual relationships between dimensions of the technology transfer process and accelerate the use of evidence-based practices. Copyright © 2011 Elsevier Inc. All rights reserved.
Memristive Ion Channel-Doped Biomembranes as Synaptic Mimics.
Najem, Joseph S; Taylor, Graham J; Weiss, Ryan J; Hasan, Md Sakib; Rose, Garrett; Schuman, Catherine D; Belianinov, Alex; Collier, C Patrick; Sarles, Stephen A
2018-05-22
Solid-state neuromorphic systems based on transistors or memristors have yet to achieve the interconnectivity, performance, and energy efficiency of the brain due to excessive noise, undesirable material properties, and nonbiological switching mechanisms. Here we demonstrate that an alamethicin-doped, synthetic biomembrane exhibits memristive behavior, emulates key synaptic functions including paired-pulse facilitation and depression, and enables learning and computing. Unlike state-of-the-art devices, our two-terminal, biomolecular memristor features similar structure (biomembrane), switching mechanism (ion channels), and ionic transport modality as biological synapses while operating at considerably lower power. The reversible and volatile voltage-driven insertion of alamethicin peptides into an insulating lipid bilayer creates conductive pathways that exhibit pinched current-voltage hysteresis at potentials above their insertion threshold. Moreover, the synapse-like dynamic properties of the biomolecular memristor allow for simplified learning circuit implementations. Low-power memristive devices based on stimuli-responsive biomolecules represent a major advance toward implementation of full synaptic functionality in neuromorphic hardware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
N /A
This Final ''Hanford Comprehensive Land-Use Plan Environmental Impact Statement'' (HCP EIS) is being used by the Department of Energy (DOE) and its nine cooperating and consulting agencies to develop a comprehensive land-use plan (CLUP) for the Hanford Site. The DOE will use the Final HCP EIS as a basis for a Record of Decision (ROD) on a CLUP for the Hanford Site. While development of the CLUP will be complete with release of the HCP EIS ROD, full implementation of the CLUP is expected to take at least 50 years. Implementation of the CLUP would begin a more detailed planning process for land-use and facility-use decisions at the Hanford Site. The DOE would use the CLUP to screen proposals. Eventually, management of Hanford Site areas would move toward the CLUP land-use goals. This CLUP process could take more than 50 years to fully achieve the land-use goals.
Reference drug programs: effectiveness and policy implications.
Schneeweiss, Sebastian
2007-04-01
In the current economic environment, health care systems are constantly struggling to contain rapidly rising costs. Drug costs are targeted by a wide variety of measures. Many jurisdictions have implemented reference drug programs (RDPs) or similar therapeutic substitution programs. This paper summarizes the mechanism and rationale of RDPs and presents evidence of their economic effectiveness and clinical safety. RDPs for pharmaceutical reimbursement are based on the assumption that drugs within specified medication groups are therapeutically equivalent and clinically interchangeable and that a common reimbursement level can thus be established. If the evidence documents that a higher price for a given drug does not buy greater effectiveness or reduced toxicity, then under RDP such extra costs are not covered. RDPs or therapeutic substitutions based on therapeutic equivalence are seen as logical extensions of generic substitution that is based on bioequivalence of drugs. If the goal is to achieve full drug coverage for as many patients as possible in the most efficient manner, then RDPs in combination with prior authorization programs are safer and more effective than simplistic fiscal drug policies, including fixed co-payments, co-insurances, or deductibles. RDPs will reduce spending in the less innovative but largest market, while fully covering all patients. Prior authorization will ensure that patients with a specified indication will benefit from the most innovative therapies with full coverage. In practice, however, not all patients and drugs will fit exactly into one of the two categories. Therefore, a process of medically indicated exemptions that will consider full coverage should accompany an RDP.
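The reimbursement mechanism behind an RDP is easy to sketch: the plan pays up to a common reference price within a therapeutic group, and the patient pays any excess for a costlier equivalent. The drug names and per-dose prices below are invented for illustration.

```python
# Hypothetical per-dose prices within one therapeutically
# equivalent medication group (illustrative only).
prices = {"drug_a": 1.20, "drug_b": 0.85, "drug_c": 2.40}

# A common choice of reference price is the lowest price in the group.
reference_price = min(prices.values())

def patient_copay(drug):
    """Under an RDP the plan reimburses up to the reference price;
    the patient pays the difference for a more expensive equivalent."""
    return max(0.0, prices[drug] - reference_price)
```

A patient choosing the reference-priced product pays nothing extra, while choosing the most expensive equivalent costs the difference out of pocket.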
Osimani, Andrea; Aquilanti, Lucia; Tavoletti, Stefano; Clementi, Francesca
2013-01-01
Food safety is essential in mass catering. In Europe, Regulation (EC) No. 852/2004 requires food business operators to put in place, implement and maintain permanent procedures based on Hazard Analysis and Critical Control Point (HACCP) principles. Each HACCP plan is specifically implemented for the processing plant and processing methods and requires a systematic collection of data on the incidence, elimination, prevention, and reduction of risks. In this five-year-study, the effectiveness of the HACCP plan of a University canteen was verified through periodic internal auditing and microbiological monitoring of meals, small equipment, cooking tools, working surfaces, as well as hands and white coats of the canteen staff. The data obtained revealed no safety risks for the consumers, since Escherichia coli, Salmonella spp. and Listeria monocytogenes were never detected; however, a quite discontinuous microbiological quality of meals was revealed. The fluctuations in the microbial loads of mesophilic aerobes, coliforms, Staphylococcus aureus, Bacillus cereus, and sulphite-reducing clostridia were mainly ascribed to inadequate handling or processing procedures, thus suggesting the need for an enhancement of staff training activities and for a reorganization of tasks. Due to the wide variety of the fields covered by internal auditing, the full conformance to all the requirements was never achieved, though high scores, determined by assigning one point to each answer which matched with the requirements, were achieved in all the years. PMID:23594937
The Impact of Principal Training in Diffusion of Innovation Theory on Fidelity of Implementation
ERIC Educational Resources Information Center
Petruzzelli, Anthony
2010-01-01
Districts and schools are constantly trying to find ways to increase student achievement. Research has shown a significant correlation between principal leadership skills and increased student achievement. Research has also shown a correlation between fidelity of implementation of new innovations and positive outcomes. This purpose of this study…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-25
... proposed post-control BART limit of 0.012 lb/MMBtu on Units 1-3. C. Modeling and Demonstrating Reasonable... a different alternative emissions control strategy would achieve more progress than EPA's BART... Background for Proposing To Approve an Alternative Emissions Control Strategy as Achieving Better Progress...
The Students in Front of Us: Reform for the Current Generation of Urban High School Students
ERIC Educational Resources Information Center
Burks, Joe; Hochbein, Craig
2015-01-01
The implementation of education policies requiring the turnaround of persistently low-achieving schools has demanded reforms that will not only improve achievement, but also deliver results in a short period of time. To meet such demands, Jefferson County Public Schools educators implemented Project Proficiency (PP). Results from…
Progress in the First Five Years: An Evaluation of Achieving the Dream Colleges in Washington State
ERIC Educational Resources Information Center
Jenkins, Davis; Wachen, John; Kerrigan, Monica Reid; Mayer, Alexander K.
2012-01-01
In 2006, six community and technical colleges in Washington State joined the innovative national reform initiative called Achieving the Dream (ATD). This report describes the progress each college made in implementing ATD's "culture of evidence" principles for institutional improvement, examines strategies implemented by the colleges to…
Open-loop-feedback control of serum drug concentrations: pharmacokinetic approaches to drug therapy.
Jelliffe, R W
1983-01-01
Recent developments to optimize open-loop-feedback control of drug dosage regimens, generally applicable to pharmacokinetically oriented therapy with many drugs, involve computation of patient-individualized strategies for obtaining desired serum drug concentrations. Analyses of past therapy are performed by least squares, extended least squares, and maximum a posteriori probability Bayesian methods of fitting pharmacokinetic models to serum level data. Future possibilities for truly optimal open-loop-feedback therapy with full Bayesian methods, and conceivably for optimal closed-loop therapy in such data-poor clinical situations, are also discussed. Implementation of these various therapeutic strategies, using automated, locally controlled infusion devices, has also been achieved in prototype form.
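A maximum a posteriori (MAP) Bayesian fit of the kind described can be sketched for a one-compartment model, C(t) = (dose/V)·exp(-k·t). The model, lognormal priors, and brute-force grid search below are illustrative simplifications, not the cited programs' algorithms.

```python
import math

def map_fit(times, levels, dose, sigma, prior_mu, prior_sd):
    """MAP Bayesian fit of a one-compartment model
    C(t) = (dose / V) * exp(-k * t) by brute-force grid search.
    prior_mu / prior_sd define lognormal priors on (k, V)."""
    best, best_cost = None, float("inf")
    for k in [i * 0.01 for i in range(1, 100)]:        # elimination rate, 1/h
        for V in [j * 0.5 for j in range(10, 100)]:    # volume of distribution, L
            # Weighted sum of squared residuals against the serum levels...
            sse = sum(((c - (dose / V) * math.exp(-k * t)) / sigma) ** 2
                      for t, c in zip(times, levels))
            # ...plus the penalty from the population priors (the MAP term).
            prior = (((math.log(k) - math.log(prior_mu[0])) / prior_sd[0]) ** 2
                     + ((math.log(V) - math.log(prior_mu[1])) / prior_sd[1]) ** 2)
            if sse + prior < best_cost:
                best, best_cost = (k, V), sse + prior
    return best

# Noise-free serum levels simulated with dose = 500 mg, k = 0.20/h, V = 25 L.
times = [1, 2, 4, 8]
levels = [(500 / 25) * math.exp(-0.20 * t) for t in times]
k, V = map_fit(times, levels, dose=500, sigma=1.0,
               prior_mu=(0.25, 30.0), prior_sd=(0.5, 0.5))
```

With clean data the fit recovers the true parameters; with sparse or noisy serum levels the prior term pulls the estimate toward the population values, which is exactly the open-loop-feedback behaviour the abstract describes.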
An acceleration system for Laplacian image fusion based on SoC
NASA Astrophysics Data System (ADS)
Gao, Liwen; Zhao, Hongtu; Qu, Xiujie; Wei, Tianbo; Du, Peng
2018-04-01
Based on an analysis of the Laplacian image fusion algorithm, this paper proposes a partially pipelined, modular processing architecture, and an SoC-based acceleration system is implemented accordingly. Full pipelining is used within each module, and modules connected in series form the partial pipeline with a unified data format, which eases management and reuse. Integrated with an ARM processor, DMA, and an embedded bare-metal program, the system computes a four-layer Laplacian pyramid on a Zynq-7000 board. Experiments show that, with small resource consumption, a pair of 256×256 images can be fused within 1 ms while maintaining good fusion quality.
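The algorithm the hardware accelerates can be sketched in software. This minimal numpy version uses a block-average pyramid in place of the usual Gaussian-kernel filtering and a max-magnitude fusion rule; both are simplifying assumptions, not the paper's exact design.

```python
import numpy as np

def down(img):  # 2x2 block average (stand-in for Gaussian filter + decimate)
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def up(img):    # nearest-neighbour upsample
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, levels):
    pyr = []
    for _ in range(levels):
        small = down(img)
        pyr.append(img - up(small))   # detail retained at this level
        img = small
    pyr.append(img)                   # coarsest residual
    return pyr

def fuse(a, b, levels=4):
    """Fuse two images by keeping, per pixel and per level, the
    Laplacian coefficient with the larger magnitude, then collapsing
    the fused pyramid back to a full-resolution image."""
    pa, pb = laplacian_pyramid(a, levels), laplacian_pyramid(b, levels)
    fused = [np.where(np.abs(x) >= np.abs(y), x, y) for x, y in zip(pa, pb)]
    out = fused[-1]
    for lap in reversed(fused[:-1]):
        out = up(out) + lap
    return out

a = np.random.rand(256, 256)
result = fuse(a, a)
```

Fusing an image with itself reconstructs it exactly, which is a convenient sanity check that the pyramid decomposition and collapse round-trip without loss.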
Okamoto, Takumi; Koide, Tetsushi; Sugi, Koki; Shimizu, Tatsuya; Anh-Tuan Hoang; Tamaki, Toru; Raytchev, Bisser; Kaneda, Kazufumi; Kominami, Yoko; Yoshida, Shigeto; Mieno, Hiroshi; Tanaka, Shinji
2015-08-01
With the increase in colorectal cancer patients in recent years, the need for quantitative evaluation of colorectal cancer has grown, and computer-aided diagnosis (CAD) systems that support doctors' diagnoses are essential. In this paper, a hardware design of the type-identification module in a CAD system for colorectal endoscopic images with narrow-band imaging (NBI) magnification is proposed for real-time processing of full-high-definition images (1920 × 1080 pixels). A pyramid-style image segmentation with SVMs for multi-size scan windows, which can be implemented on an FPGA with a small circuit area and achieve high accuracy, is proposed for actual, complex colorectal endoscopic images.
Performance of concatenated Reed-Solomon trellis-coded modulation over Rician fading channels
NASA Technical Reports Server (NTRS)
Moher, Michael L.; Lodge, John H.
1990-01-01
A concatenated coding scheme for providing very reliable data over mobile-satellite channels at power levels similar to those used for vocoded speech is described. The outer code is a shortened Reed-Solomon code which provides error detection as well as error correction capabilities. The inner code is a 1-D 8-state trellis code applied independently to both the inphase and quadrature channels. To achieve the full error correction potential of this inner code, the code symbols are multiplexed with a pilot sequence which is used to provide dynamic channel estimation and coherent detection. The implementation structure of this scheme is discussed and its performance is estimated.
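The pilot-multiplexing idea can be sketched as follows: known pilot symbols are interleaved with data, the receiver measures the channel at the pilots, interpolates between them, and then detects coherently. The pilot spacing, channel model, and QPSK mapping are illustrative assumptions rather than the paper's exact parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n, spacing = 256, 8

# QPSK data with a known pilot symbol inserted every `spacing` symbols.
data = (2 * rng.integers(0, 2, n) - 1) + 1j * (2 * rng.integers(0, 2, n) - 1)
tx = data.copy()
tx[::spacing] = 1 + 1j                   # pilot sequence (known to the receiver)

# Slowly varying flat-fading gain (illustrative stand-in for Rician fading).
phase = np.linspace(0, np.pi / 3, n)
h = (1.0 + 0.1 * np.cos(phase)) * np.exp(1j * phase)
rx = h * tx                              # noiseless received signal

# Dynamic channel estimation: measure h at the pilots, interpolate between.
pilot_idx = np.arange(0, n, spacing)
h_pilots = rx[pilot_idx] / tx[pilot_idx]
h_hat = (np.interp(np.arange(n), pilot_idx, h_pilots.real)
         + 1j * np.interp(np.arange(n), pilot_idx, h_pilots.imag))

# Coherent detection: equalise, then slice to the nearest QPSK point.
eq = rx / h_hat
detected = np.sign(eq.real) + 1j * np.sign(eq.imag)
mask = np.arange(n) % spacing != 0       # score the data positions only
symbol_errors = int(np.count_nonzero(detected[mask] != data[mask]))
```

Because the channel varies slowly relative to the pilot spacing, the interpolated estimate tracks it closely and coherent detection recovers the data without error in this noiseless case.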
The liberal party and the achievement of national Medicare.
Bryden, P E
2009-01-01
The process that led to the implementation of a full national health insurance system in Canada was as complicated and contested as the battles that were fought over Medicare in Saskatchewan. The federal Liberal party had to first adopt health insurance as a serious component of its electoral platform, devise a strategy for dealing with provinces which had constitutional jurisdiction over health, and finally wrestle with those within the party--and within the cabinet--who continued to question whether Canada was financially prepared to administer such a costly program. The strategies were devised and the battles were fought privately, but had an important effect on the timing and shape of a national health insurance system.
Optimal estimation of entanglement in optical qubit systems
NASA Astrophysics Data System (ADS)
Brida, Giorgio; Degiovanni, Ivo P.; Florio, Angela; Genovese, Marco; Giorda, Paolo; Meda, Alice; Paris, Matteo G. A.; Shurupov, Alexander P.
2011-05-01
We address the experimental determination of entanglement for systems made of a pair of polarization qubits. We exploit quantum estimation theory to derive optimal estimators, which are then implemented to achieve the ultimate bound on precision. In particular, we present a set of experiments aimed at measuring the amount of entanglement for states belonging to different families of pure and mixed two-qubit two-photon states. Our scheme is based on visibility measurements of quantum correlations and achieves the ultimate precision allowed by quantum mechanics in the limit of Poissonian distribution of coincidence counts. Although optimal estimation of entanglement does not require the full tomography of the states, we have also performed state reconstruction using two different sets of tomographic projectors and explicitly shown that they provide a less precise determination of entanglement. The use of optimal estimators also allows us to compare and statistically assess the different noise models used to describe decoherence effects occurring in the generation of entanglement.
A streamlined artificial variable free version of simplex method.
Inayatullah, Syed; Touheed, Nasir; Imtiaz, Muhammad
2015-01-01
This paper proposes a streamlined form of the simplex method which provides some notable benefits over the traditional simplex method. For instance, it does not need any artificial variables or artificial constraints, and it can start with any feasible or infeasible basis of an LP. The method follows the same pivoting sequence as simplex phase 1 but without any explicit description of artificial variables, which also makes it space efficient. Later in the paper, a dual version of the new method is presented, which provides a way to easily implement phase 1 of the traditional dual simplex method. For a problem whose initial basis is both primal and dual infeasible, our methods give the user full freedom to start with either the primal or the dual artificial-free version without any reformulation of the LP structure. Last but not least, it provides a teaching aid for teachers who want to teach feasibility achievement as a separate topic before teaching optimality achievement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angerer, Andreas, E-mail: andreas.angerer@tuwien.ac.at; Astner, Thomas; Wirtitsch, Daniel
We design and implement 3D lumped-element microwave cavities that spatially focus magnetic fields to a small mode volume. They allow coherent and uniform coupling to electron spins hosted by nitrogen-vacancy centers in diamond. We achieve large homogeneous single-spin coupling rates, with an enhancement of more than one order of magnitude compared to standard 3D cavities with a fundamental resonance at 3 GHz. Finite-element simulations confirm that the magnetic field distribution is homogeneous throughout the entire sample volume, with a root-mean-square deviation of 1.54%. With a sample containing 10¹⁷ nitrogen-vacancy electron spins, we achieve a collective coupling strength of Ω = 12 MHz, a cooperativity factor C = 27, and clearly enter the strong coupling regime. This allows us to interface a macroscopic spin ensemble with microwave circuits, and the homogeneous Rabi frequency paves the way to manipulating the full ensemble population in a coherent way.
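The quoted figures imply a single-spin coupling rate via the usual √N collective-enhancement scaling for spin ensembles (Ω = g√N). This back-of-envelope check uses only numbers from the abstract; the scaling law itself is a standard assumption, not a result stated there.

```python
import math

N = 1e17        # nitrogen-vacancy electron spins in the sample
omega = 12e6    # collective coupling strength, in Hz

# Collective coupling grows as the square root of the ensemble size,
# so the implied single-spin coupling rate is:
g_single = omega / math.sqrt(N)   # on the order of tens of millihertz
```

The tiny single-spin rate illustrates why a macroscopic ensemble is needed to reach the strong-coupling regime with a cavity of this kind.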
Interdisciplinary applications and interpretations of ERTS data within the Susquehanna River basin
NASA Technical Reports Server (NTRS)
Mcmurtry, G. J.; Petersen, G. W. (Principal Investigator)
1975-01-01
The author has identified the following significant results. The full potential of high quality data is achieved only with the application of efficient and effective interpretation techniques. An excellent operating system for handling, processing, and interpreting ERTS-1 and other MSS data was achieved. Programs for processing digital data are implemented on a large nondedicated general purpose computer. Significant results were attained in mapping land use, agricultural croplands, forest resources, and vegetative cover. Categories of land use classified and mapped depend upon the geographic location, the detail required, and the types of land use of interest. Physiographic and structural provinces are spectacularly displayed on ERTS-1 MSS image mosaics. Geologic bedrock structures show up well and formation contacts can sometimes be traced for hundreds of kilometers. Large circular structures and regional features, previously obscured by the detail of higher resolution data, can be seen. Environmental monitoring was performed in three areas: coal strip mining, coal refuse problems, and damage to vegetation caused by insects and pollution.
Very high frame rate volumetric integration of depth images on mobile devices.
Kähler, Olaf; Adrian Prisacariu, Victor; Yuheng Ren, Carl; Sun, Xin; Torr, Philip; Murray, David
2015-11-01
Volumetric methods provide efficient, flexible and simple ways of integrating multiple depth images into a full 3D model. They provide dense and photorealistic 3D reconstructions, and parallelised implementations on GPUs achieve real-time performance on modern graphics hardware. Running such methods on mobile devices, providing users with freedom of movement and instantaneous reconstruction feedback, nevertheless remains challenging. In this paper we present a range of modifications to existing volumetric integration methods based on voxel block hashing, considerably improving their performance and making them applicable to tablet computer applications. We present (i) optimisations for the basic data structure, and its allocation and integration; (ii) a highly optimised raycasting pipeline; and (iii) extensions to the camera tracker to incorporate IMU data. In total, our system thus achieves frame rates of up to 47 Hz on a Nvidia Shield Tablet and 910 Hz on a Nvidia GTX Titan X GPU, or even beyond 1.1 kHz without visualisation.
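The volumetric integration step itself can be sketched in a simplified form. This dense-grid, orthographic version is an illustrative stand-in for the voxel-block-hashing, perspective-camera pipeline the paper optimises: each voxel stores a truncated signed distance to the observed surface, updated as a running weighted average over fused frames.

```python
import numpy as np

def integrate_depth(tsdf, weights, depth, voxel_size, trunc):
    """Fuse one depth image into a dense TSDF volume (identity camera
    pose, orthographic projection): a simplified voxel-grid version of
    volumetric integration; real systems use hashed voxel blocks and a
    perspective camera model."""
    nx, ny, nz = tsdf.shape
    for ix in range(nx):
        for iy in range(ny):
            d = depth[ix, iy]                  # observed surface depth
            for iz in range(nz):
                sdf = d - iz * voxel_size      # signed distance to the surface
                if sdf < -trunc:
                    continue                   # far behind the surface: unseen
                tsdf_val = min(1.0, sdf / trunc)
                # Running weighted average over all frames fused so far.
                w = weights[ix, iy, iz]
                tsdf[ix, iy, iz] = (tsdf[ix, iy, iz] * w + tsdf_val) / (w + 1)
                weights[ix, iy, iz] = w + 1

# Toy volume: 8x8x8 voxels (0.1 m each), flat wall at depth 0.4 m.
vol = np.zeros((8, 8, 8))
wts = np.zeros((8, 8, 8))
depth = np.full((8, 8), 0.4)
integrate_depth(vol, wts, depth, voxel_size=0.1, trunc=0.3)
```

After integration the zero crossing of the TSDF sits at the wall depth, which is where a raycasting step would extract the reconstructed surface.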
Automatic Overset Grid Generation with Heuristic Feedback Control
NASA Technical Reports Server (NTRS)
Robinson, Peter I.
2001-01-01
An advancing front grid generation system for structured Overset grids is presented which automatically modifies Overset structured surface grids and control lines until user-specified grid qualities are achieved. The system is demonstrated on two examples: the first refines a space shuttle fuselage control line until global truncation error is achieved; the second advances, from control lines, the space shuttle orbiter fuselage top and fuselage side surface grids until proper overlap is achieved. Surface grids are generated in minutes for complex geometries. The system is implemented as a heuristic feedback control (HFC) expert system which iteratively modifies the input specifications for Overset control line and surface grids. It is developed as an extension of modern control theory, production rules systems and subsumption architectures. The methodology provides benefits over the full knowledge lifecycle of an expert system for knowledge acquisition, knowledge representation, and knowledge execution. The vector/matrix framework of modern control theory systematically acquires and represents expert system knowledge. Missing matrix elements imply missing expert knowledge. The execution of the expert system knowledge is performed through symbolic execution of the matrix algebra equations of modern control theory. The dot product operation of matrix algebra is generalized for heuristic symbolic terms. Constant time execution is guaranteed.
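The iterate-until-quality-met control loop described above can be sketched generically: measure a grid-quality metric, and if it misses the target, modify the input specification and regenerate. The error model below (truncation error shrinking quadratically with grid spacing) is an invented stand-in for the paper's grid-quality measures.

```python
def refine_until(error_fn, spacing, tol, shrink=0.5, max_iter=50):
    """Heuristic feedback control loop: evaluate a quality metric for
    the current input specification and, while it misses the target,
    apply a corrective action (here, shrink the spacing) and retry."""
    for _ in range(max_iter):
        err = error_fn(spacing)
        if err <= tol:
            return spacing, err
        spacing *= shrink           # heuristic corrective action
    raise RuntimeError("tolerance not reached within max_iter")

# Second-order scheme: truncation error ~ C * h^2 (illustrative model).
spacing, err = refine_until(lambda h: 4.0 * h * h, spacing=1.0, tol=1e-3)
```

The loop plays the role of the expert-system feedback: the user states the desired quality, and the system keeps modifying its own input specification until that quality is achieved.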
Integration at the round table: marine spatial planning in multi-stakeholder settings.
Olsen, Erik; Fluharty, David; Hoel, Alf Håkon; Hostens, Kristian; Maes, Frank; Pecceu, Ellen
2014-01-01
Marine spatial planning (MSP) is often considered a pragmatic approach to implementing ecosystem-based management in order to manage marine space in a sustainable way. This requires the involvement of multiple actors and stakeholders at various governmental and societal levels. Several factors affect how well the integrated management of marine waters will be achieved, such as different governance settings (division of power between central and local governments), economic activities (and related priorities), external drivers, spatial scales, incentives and objectives, varying approaches to legislation and political will. We compared MSP in Belgium, Norway and the US to illustrate how the integration of stakeholders and governmental levels differs among these countries along the factors mentioned above. Horizontal integration (between sectors) is successful in all three countries, achieved through the use of neutral 'round-table' meeting places for all actors. Vertical integration between government levels varies, with Belgium and Norway having achieved full integration while the US lacks integration of the legislature due to sharp disagreements among stakeholders and unsuccessful partisan leadership. Success factors include political will and leadership, process transparency and stakeholder participation, and should be considered in all MSP development processes.
A process to help assure successful commercial space ventures
NASA Astrophysics Data System (ADS)
Mihara, Sam K.
1999-01-01
The purpose of this paper is to describe a process for successful space business ventures: a methodology used by highly successful commercial ventures, but relatively new to space business enterprises. What do highly successful commercial business ventures have in common? How do these companies differ from most commercial space ventures? The answer is the implementation of a state-of-the-art customer satisfaction process. Take the case of the latest winners of the Malcolm Baldrige National Quality Award. What did they do that helped to achieve this performance? The answer is that they implemented an effective process that measures and achieves the highest possible level of customer satisfaction. The same process can be implemented by space enterprises to achieve comparable commercial results. This paper describes the six-step process, including examples of each step. It concludes with the strong recommendation that this process be implemented to assure success in the commercial space world.
A Wireless Headstage for Combined Optogenetics and Multichannel Electrophysiological Recording.
Gagnon-Turcotte, Gabriel; LeChasseur, Yoan; Bories, Cyril; Messaddeq, Younes; De Koninck, Yves; Gosselin, Benoit
2017-02-01
This paper presents a wireless headstage with real-time spike detection and data compression for combined optogenetics and multichannel electrophysiological recording. The proposed headstage, which is intended to perform both optical stimulation and electrophysiological recordings simultaneously in freely moving transgenic rodents, is entirely built with commercial off-the-shelf components, and includes 32 recording channels and 32 optical stimulation channels. It can detect, compress and transmit full action potential waveforms over 32 channels in parallel and in real time using an embedded digital signal processor based on a low-power field-programmable gate array (FPGA) and a MicroBlaze microprocessor softcore. This processor implements a complete digital spike detector featuring a novel adaptive threshold based on a sigma-delta control loop, and a wavelet data compression module using a new dynamic coefficient re-quantization technique that achieves large compression ratios with higher signal quality. Simultaneous optical stimulation and recording were performed in vivo using an optrode featuring 8 microelectrodes and 1 implantable fiber coupled to a 465-nm LED, in the somatosensory cortex and hippocampus of a transgenic mouse expressing channelrhodopsin (Thy1::ChR2-YFP line 4) under anesthetized conditions. Experimental results show that the proposed headstage can trigger neuron activity while collecting, detecting and compressing single-cell microvolt-amplitude activity from multiple channels in parallel, achieving overall compression ratios above 500. This is the first reported high-channel-count wireless optogenetic device providing simultaneous optical stimulation and recording. Measured characteristics show that the proposed headstage can achieve a true positive detection rate of up to 100% for signal-to-noise ratios (SNR) down to 15 dB, and up to 97.28% at an SNR as low as 5 dB.
The implemented prototype features a lifespan of up to 105 minutes, and uses a lightweight (2.8 g) and compact [Formula: see text] rigid-flex printed circuit board.
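The adaptive threshold "based on a sigma-delta control loop" can be sketched roughly as a one-bit feedback loop that nudges the detection threshold so the firing rate tracks a target rate. This is a minimal software illustration, not the headstage's FPGA detector; every parameter value and name below is an assumption.

```python
# Hedged sketch: a detection threshold driven by sigma-delta-style one-bit
# feedback. Each sample, the threshold moves up after a detection and drifts
# down otherwise, so the detection rate tracks `target_rate`.

def detect_spikes(samples, target_rate=0.01, step=0.5, init_threshold=50.0):
    """Return indices where |sample| crosses the adaptive threshold."""
    threshold = init_threshold
    detections = []
    for i, x in enumerate(samples):
        fired = abs(x) > threshold
        if fired:
            detections.append(i)
        # One-bit feedback: (fired - target_rate) plays the role of the
        # sigma-delta quantization error being integrated into the threshold.
        threshold += step * ((1.0 if fired else 0.0) - target_rate)
        threshold = max(threshold, 1.0)  # keep the threshold positive
    return detections

# Synthetic trace: flat baseline with two large "spikes".
trace = [0.0] * 100
trace[10] = 200.0
trace[60] = 200.0
print(detect_spikes(trace))  # [10, 60]
```

The appeal of this scheme in hardware is that the update needs only a comparison and an add per sample, which maps naturally onto a low-power FPGA pipeline.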
An Indigenous Framework for Science, Technology, Engineering and Mathematics
NASA Astrophysics Data System (ADS)
Monette, G.
2003-12-01
The American Indian Higher Education Consortium, composed of 35 American Indian tribally-controlled Colleges and Universities in the U.S. and Canada, is leading a comprehensive effort to improve American Indian student achievement in STEM. A key component of this effort is the synthesis of indigenous ways of knowing and western education systems. This presentation will provide an overview of culturally responsive, place-based teaching, learning, and research and will discuss potential opportunities and strategies for helping to ensure that education systems and research programs reflect our diversity and respect our cultures. One example to be discussed is the NSF-funded "Tribal College Rural Systemic Initiative." Founded on the belief that all students can learn and should be given the opportunity to reach their full potential, Tribal Colleges are leading this effort to achieve successful and sustainable improvement of science, math, and technology education at the K-14 level in rural, economically disadvantaged, geographically challenged areas. Working with parents, tribal governments, schools and the private sector, the colleges are helping to implement math and science standards-based curriculum for students and standards-based assessment for schools; provide math and science standards-based professional development for teachers, administrators, and community leaders; and integrate local Native culture into math and science standards-based curriculum. The close working relationship between the Tribal Colleges and K-12 is paying off. 
According to the National Science Foundation, successful systemic reform has resulted in enhanced student achievement and participation in science and math; reductions in the achievement disparities among students that can be attributed to socioeconomic status, race, ethnicity, gender, or learning styles; implementation of a comprehensive, standards-based curriculum aligned with instruction and assessment; development of a coherent, consistent set of policies that supports high-quality math and science education for each student; convergence of science and math resources; and broad-based support from parents and the community.
Geophysics field school: A team-based learning experience for students and faculty
NASA Astrophysics Data System (ADS)
Karchewski, B.; Innanen, K. A.; Lauer, R. M.; Pidlisecky, A.
2016-12-01
The core challenge facing a modern science educator is to deliver a curriculum that reaches broadly and deeply into the technical domain, while also helping students to develop fundamental scientific skills such as inquiry, critical thinking and technical communication. That is, our aim is for students to achieve significant learning at all levels summarized by Bloom's Taxonomy of Educational Objectives. It is not always clear how to achieve the full spectrum of goals, with much debate over which component is more important in a science education. Team-based and experiential learning are research-supported approaches that aim to reach across the spectrum by placing students in a setting where they solve practical problems in teams of peers. This learning mode modifies the role of the instructor to a guide or facilitator, and students take a leadership role in their own education. We present a case study of our team's implementation of team-based learning in a geophysics field school, an inherently experiential learning environment. The core philosophies behind our implementation are to present clearly defined learning outcomes, to recognize that students differ in their learning modalities and to strive to engage students through a range of evidence-based learning experiences. We discuss the techniques employed to create functional teams, the key learning activities involved in a typical day of field school and data demonstrating the learning activities that showed the strongest correlation to overall performance in the course. In the process, we also realized that our team-based approach to course design and implementation also enhanced our skillsets as educators, and our institution recently recognized our efforts with a team teaching award. Therefore, we conclude with some of our observations of best practices for team teaching in a field setting to initiate discussions with colleagues engaged in similar activities.
Weaver, Robert G; Moore, Justin B; Turner-McGrievy, Brie; Saunders, Ruth; Beighle, Aaron; Khan, M Mahmud; Chandler, Jessica; Brazendale, Keith; Randell, Allison; Webster, Collin; Beets, Michael W
2017-08-01
The YMCA of USA has adopted Healthy Eating and Physical Activity (HEPA) Standards for its afterschool programs (ASPs). Little is known about the strategies YMCA ASPs are implementing to achieve the Standards or about these strategies' effectiveness. The aims were to (1) identify strategies implemented in YMCA ASPs and (2) evaluate the relationship between strategy implementation and meeting the Standards. HEPA was measured via accelerometer (moderate-to-vigorous physical activity [MVPA]) and direct observation (snacks served) in 20 ASPs. Strategies were identified and mapped onto a capacity-building framework (Strategies To Enhance Practice [STEPs]). Mixed-effects regression estimated increases in HEPA outcomes as implementation increased. Model-implied estimates were calculated for high (i.e., highest implementation score achieved), moderate (median implementation score across programs), and low (lowest implementation score achieved) implementation for both HEPA outcomes separately. Programs implemented a variety of strategies identified in STEPs. For every 1-point increase in implementation score, 1.45% (95% confidence interval = 0.33% to 2.55%, p ≤ .001) more girls accumulated 30 min/day of MVPA, and fruits and/or vegetables were served on 0.11 more days (95% confidence interval = 0.11-0.45, p ≤ .01). Relationships between implementation and other HEPA outcomes did not reach statistical significance. Still, regression estimates indicated that desserts are served on 1.94 fewer days (i.e., 0.40 vs. 2.34) in the highest-implementing program than in the lowest-implementing program, and water is served 0.73 more days (i.e., 2.37 vs. 1.64). Adopting HEPA Standards at the national level does not lead to changes in routine practice in all programs. Practical strategies that programs could adopt to more fully comply with the HEPA Standards are identified.
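The "model-implied estimates" step reduces to evaluating the fitted linear predictor at the low, median, and high implementation scores. The sketch below uses the reported slope of 1.45 percentage points per implementation point; the intercept and the example scores are invented purely for illustration.

```python
# Sketch of computing model-implied estimates from a fitted slope. The slope
# (+1.45 percentage points of girls meeting 30 min/day MVPA per 1-point
# increase in implementation score) is from the abstract; the intercept and
# the low/moderate/high scores are illustrative assumptions.

def model_implied(intercept, slope, scores):
    return {label: intercept + slope * s for label, s in scores.items()}

scores = {"low": 4.0, "moderate": 9.0, "high": 14.0}  # hypothetical scores
estimates = model_implied(intercept=20.0, slope=1.45, scores=scores)
for label, pct in estimates.items():
    print(f"{label}: {pct:.1f}% of girls accumulating 30 min/day MVPA")
```

In the study the fixed-effect slope would come from a fitted mixed-effects model (e.g., with program-level random effects) rather than being supplied by hand.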
Nishioka, Shinta; Okamoto, Takatsugu; Takayama, Masako; Urushihara, Maki; Watanabe, Misuzu; Kiriya, Yumiko; Shintani, Keiko; Nakagomi, Hiromi; Kageyama, Noriko
2017-08-01
Whether malnutrition risk correlates with recovery of swallowing function of convalescent stroke patients is unknown. This study was conducted to clarify whether malnutrition risks predict achievement of full oral intake in convalescent stroke patients undergoing enteral nutrition. We conducted a secondary analysis of 466 convalescent stroke patients, aged 65 years or over, who were undergoing enteral nutrition. Patients were extracted from the "Algorithm for Post-stroke Patients to improve oral intake Level; APPLE" study database compiled at the Kaifukuki (convalescent) rehabilitation wards. Malnutrition risk was determined by the Geriatric Nutritional Risk Index as follows: severe (<82), moderate (82 to <92), mild (92 to <98), and no malnutrition risks (≥98). Swallowing function was assessed by Fujishima's swallowing grade (FSG) on admission and discharge. The primary outcome was achievement of full oral intake, indicated by FSG ≥ 7. Binary logistic regression analysis was performed to identify predictive factors, including malnutrition risk, for achieving full oral intake. Estimated hazard risk was computed by Cox's hazard model. Of the 466 individuals, 264 were ultimately included in this study. Participants with severe malnutrition risk showed a significantly lower proportion of achievement of full oral intake than lower severity groups (P = 0.001). After adjusting for potential confounders, binary logistic regression analysis showed that patients with severe malnutrition risk were less likely to achieve full oral intake (adjusted odds ratio: 0.232, 95% confidence interval [95% CI]: 0.047-1.141). Cox's proportional hazard model revealed that severe malnutrition risk was an independent predictor of full oral intake (adjusted hazard ratio: 0.374, 95% CI: 0.166-0.842). 
Compared to patients who did not achieve full oral intake, patients who achieved full oral intake had significantly higher energy intake, but there was no difference in protein intake and weight change. Severe malnutrition risk independently predicts the achievement of full oral intake in convalescent stroke patients undergoing enteral nutrition. Copyright © 2016 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
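The GNRI risk bands used in the study translate directly into a categorization function. The thresholds are taken from the abstract (severe < 82, moderate 82 to < 92, mild 92 to < 98, no malnutrition risk ≥ 98); the function name is ours, and the GNRI value itself is assumed to be computed elsewhere.

```python
# Categorization of Geriatric Nutritional Risk Index (GNRI) values into the
# four malnutrition-risk bands reported in the study.

def gnri_category(gnri: float) -> str:
    if gnri < 82:
        return "severe"
    if gnri < 92:
        return "moderate"
    if gnri < 98:
        return "mild"
    return "no risk"

print(gnri_category(80.5))  # severe
print(gnri_category(95.0))  # mild
```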
Yan, Hao; Wang, Xiaoyu; Shi, Feng; Bai, Ti; Folkerts, Michael; Cervino, Laura; Jiang, Steve B.; Jia, Xun
2014-01-01
Purpose: Compressed sensing (CS)-based iterative reconstruction (IR) techniques are able to reconstruct cone-beam CT (CBCT) images from undersampled noisy data, allowing for imaging dose reduction. However, there are a few practical concerns preventing the clinical implementation of these techniques. On the image quality side, data truncation along the superior–inferior direction under the cone-beam geometry produces severe cone artifacts in the reconstructed images. Ring artifacts are also seen in the half-fan scan mode. On the reconstruction efficiency side, the long computation time hinders clinical use in image-guided radiation therapy (IGRT). Methods: Image quality improvement methods are proposed to mitigate the cone and ring image artifacts in IR. The basic idea is to use weighting factors in the IR data fidelity term to improve projection data consistency with the reconstructed volume. In order to improve the computational efficiency, a multiple graphics processing units (GPUs)-based CS-IR system was developed. The parallelization scheme, detailed analyses of computation time at each step, their relationship with image resolution, and the acceleration factors were studied. The whole system was evaluated in various phantom and patient cases. Results: Ring artifacts can be mitigated by properly designing a weighting factor as a function of the spatial location on the detector. As for the cone artifact, without applying a correction method, it contaminated 13 out of 80 slices in a head-neck case (full-fan). Contamination was even more severe in a pelvis case under half-fan mode, where 36 out of 80 slices were affected, leading to poorer soft tissue delineation and reduced superior–inferior coverage. The proposed method effectively corrects those contaminated slices with mean intensity differences compared to FDK results decreasing from ∼497 and ∼293 HU to ∼39 and ∼27 HU for the full-fan and half-fan cases, respectively. 
In terms of efficiency boost, an overall 3.1 × speedup factor has been achieved with four GPU cards compared to a single GPU-based reconstruction. The total computation time is ∼30 s for typical clinical cases. Conclusions: The authors have developed a low-dose CBCT IR system for IGRT. By incorporating data consistency-based weighting factors in the IR model, cone/ring artifacts can be mitigated. A boost in computational efficiency is achieved by multi-GPU implementation. PMID:25370645
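The data-consistency weighting can be pictured as a weighted least-squares fidelity term || W (A x - b) ||^2, with W down-weighting detector pixels whose rays are inconsistent with the reconstructed volume. The sketch below uses a toy system matrix and plain gradient descent; it only illustrates the general form, not the authors' CBCT forward model, weighting design, regularizer, or GPU solver.

```python
# Toy illustration of a weighted data-fidelity term in iterative
# reconstruction: minimize 0.5 * || diag(w) (A x - b) ||^2 by gradient
# descent. A, b, and w are stand-ins, not a real CBCT geometry.
import numpy as np

def weighted_fidelity_grad(A, x, b, w):
    """Gradient of 0.5 * || diag(w) (A x - b) ||^2 with respect to x."""
    r = w * (A @ x - b)          # weighted residual
    return A.T @ (w * r)         # equals A^T W^2 (A x - b)

def reconstruct(A, b, w, steps=500, lr=0.1):
    # lr must stay below 2 / lambda_max(A^T W^2 A) for this toy solver to
    # converge; the default suits the small example below.
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x -= lr * weighted_fidelity_grad(A, x, b, w)
    return x

# Tiny consistent system: with uniform weights the solver recovers x_true.
A = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [1., 1., 1.]])
x_true = np.array([1., 2., 3.])
b = A @ x_true
x = reconstruct(A, b, w=np.ones(4))
print(np.round(x, 4))
```

Lowering an entry of `w` toward zero removes the corresponding measurement's influence, which is the mechanism by which inconsistent rays stop imprinting ring or cone artifacts on the volume.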
ERIC Educational Resources Information Center
Pinger, Petra; Rakoczy, Katrin; Besser, Michael; Klieme, Eckhard
2018-01-01
The aim of this study was to contribute to the understanding of the effectiveness of formative assessment interventions by analysing how the "quality of programme delivery" affects students' mathematics achievement and interest. Teachers (n = 17) implemented formative assessment in their ninth-grade mathematics classes and provided their…
ERIC Educational Resources Information Center
Gates, Susan M.; Hamilton, Laura S.; Martorell, Paco; Burkhauser, Susan; Heaton, Paul; Pierson, Ashley; Baird, Matthew; Vuollo, Mirka; Li, Jennifer J.; Lavery, Diana Catherine; Harvey, Melody; Gu, Kun
2014-01-01
New Leaders is dedicated to promoting student achievement by developing outstanding school leaders to serve in urban schools. RAND Corporation researchers conducted a formative and summative external evaluation of the New Leaders program, its theory of action, and its implementation from 2006 through 2013. This document presents technical…
ERIC Educational Resources Information Center
Pastchal-Temple, Andrea Sheree
2012-01-01
Many school districts are using research-based strategies to increase student achievement. The "No Child Left Behind Act" of 2001 was created and implemented to assist all students becoming proficient in reading and mathematics by 2014. One strategy many school districts implemented includes an after-school program. One school district…
ERIC Educational Resources Information Center
Widiana, I. Wayan; Jampel, I. Nyoman
2016-01-01
This classroom action research aimed to improve the students' creative thinking and achievement in learning science. It conducted through the implementation of multiple intelligences with mind mapping approach and describing the students' responses. The subjects of this research were the fifth grade students of SD 8 Tianyar Barat, Kubu, and…
ERIC Educational Resources Information Center
Vogel, Linda R.
2010-01-01
Standards-based education (SBE) has been the dominant educational reform movement since the early 1980s, reinforced by federal and state accountability systems. This book examines the efforts of educational leaders in implementing SBE to improve student achievement in a variety of demographic contexts but with common challenges. Four stages of SBE…
ERIC Educational Resources Information Center
Garrett, Stacie
2017-01-01
Differentiated instruction (DI) is a curriculum framework that focuses on the individual student. Students achieve because teachers develop lessons to the students' readiness levels, interests, and learning styles. Students in early childhood through college have shown increased achievement when DI is implemented. The purpose of this quantitative,…
ERIC Educational Resources Information Center
McCormick, Meghan P.; Cappella, Elise; O'Connor, Erin E.; McClowry, Sandee G.
2015-01-01
Given established links between social-emotional skills and academic achievement, there is growing support for implementing universal social/behavioral interventions in early schooling (Jones & Bouffard, 2012). Advocates have been particularly interested in implementing such programming in low income urban schools where students are likely to…
NASA Astrophysics Data System (ADS)
Sultan, A. Z.; Hamzah, N.; Rusdi, M.
2018-01-01
A simulation-based concept attainment method was implemented to increase students' interest in the Engineering Mechanics course in the second semester of academic year 2016/2017 in the Manufacturing Engineering Program, Department of Mechanical Engineering, PNUP. The results show an increase in students' interest in the lecture material, which is packaged as interactive simulation CDs and as teaching materials in printed and electronic book form. Implementation of the simulation-based concept attainment method also produced a significant increase in student participation in presentations and discussions and in the submission of individual assignments: average student participation reached 89%, compared with an average of only 76% before the method was applied. Under the previous learning method, fewer than 5% of students achieved an A grade on examinations and more than 8% received a D grade; after implementation of the new method, more than 30% achieved an A grade and fewer than 1% received a D grade.
Papa, Jillian; Agostinelli, Joan; Rodriguez, Gertrudes; Robinson, Deborah
2017-01-01
Introduction Obesity is a major health concern in every US age group. Approximately one in 4 children in Arizona’s Special Supplemental Nutrition Program for Women, Infants, and Children is overweight or obese. The Arizona Department of Health Services developed the Empower program to promote healthy environments in licensed child care facilities. The program consists of 10 standards, including one standard for each of these 5 areas: physical activity and screen time, breastfeeding, fruit juice and water, family-style meals, and staff training. The objective of this evaluation was to determine the level of implementation of these 5 Empower standards. Methods A self-assessment survey was completed from July 2013 through June 2015 by 1,850 facilities to evaluate the level of implementation of 5 Empower standards. We calculated the percentage of facilities that reported the degree to which they implemented each standard and identified common themes in comments recorded in the survey. Results All facilities reported either full or partial implementation of the 5 standards. Of 1,678 facilities, 21.7% (n = 364) reported full implementation of all standards, and 78.3% (n = 1,314) reported at least partial implementation. Staff training, which has only one component, had the highest level of implementation: 77.4% (n = 1,299) reported full implementation. Only 44.0% (n = 738) reported full implementation of the standard on a breastfeeding-friendly environment. Conclusion Arizona child care facilities have begun to implement the Empower program, but facilities will need more education, technical assistance, and support in some areas to fully implement the program. PMID:28880840
Building robust conservation plans.
Visconti, Piero; Joppa, Lucas
2015-04-01
Systematic conservation planning optimizes trade-offs between biodiversity conservation and human activities by accounting for socioeconomic costs while aiming to achieve prescribed conservation objectives. However, the most cost-efficient conservation plan can be very dissimilar to any other plan achieving the set of conservation objectives. This is problematic under conditions of implementation uncertainty (e.g., if all or part of the plan becomes unattainable). Through simulations of parallel implementation of conservation plans and habitat loss, we determined the conditions under which optimal plans have limited chances of implementation and under which implementation attempts would fail to meet objectives. We then devised a new, flexible method for identifying conservation priorities and scheduling conservation actions. This method entails generating a number of alternative plans, calculating the similarity in site composition among all plans, and selecting the plan with the highest density of neighboring plans in similarity space. We compared our method with the classic method that maximizes cost efficiency on synthetic and real data sets. When implementation was uncertain (a common reality), our method provided a higher likelihood of achieving conservation targets. We found that χ, a measure of the shortfall in objectives achieved by a conservation plan if the plan could not be implemented entirely, was the main factor determining the relative performance of a flexibility-enhanced approach to conservation prioritization. Our findings should help planning authorities prioritize conservation efforts in the face of uncertainty about the future condition and availability of sites. © 2014 Society for Conservation Biology.
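The selection rule described above (generate alternatives, measure similarity in site composition, keep the plan with the densest neighbourhood in similarity space) can be sketched in a few lines. The Jaccard index and the mean-similarity density below are our assumptions about reasonable choices, not necessarily the metrics the authors used, and the plans are toy data.

```python
# Hedged sketch of flexibility-enhanced plan selection: among candidate plans
# (sets of selected sites), pick the one with the highest "density" of similar
# neighbours, approximated here as the summed Jaccard similarity to all others.

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

def most_central_plan(plans):
    """Index of the plan with the highest total similarity to the others."""
    def density(i):
        return sum(jaccard(plans[i], p) for j, p in enumerate(plans) if j != i)
    return max(range(len(plans)), key=density)

plans = [
    {"A", "B", "C"},   # overlaps heavily with the next two alternatives
    {"A", "B", "D"},
    {"A", "C", "D"},
    {"E", "F", "G"},   # perhaps cheapest, but a dissimilar outlier
]
print(most_central_plan(plans))  # 0, not the outlier plan 3
```

The point of the example: if plan 3 were the cost-optimal choice and became unattainable, nothing nearby could substitute for it, whereas the central plan has close alternatives.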
Patients first! Engaging the hearts and minds of nurses with a patient-centered practice model.
Small, Deborah C; Small, Robert M
2011-05-31
Like every healthcare system today, the Cleveland Clinic health system is a combination of medical hospitals, institutes, and services in which the implementation of uniform care methodologies faces significant barriers. The guiding principle of the Cleveland Clinic, 'Patients First,' focuses on the principle of patient- and family-centered care (PFCC) but deliberately lacks details due to the wide scope of care delivered by the organization. The Stanley Shalom Zielony Institute of Nursing Excellence (the Nursing Institute) at the Cleveland Clinic was charged with standardizing nursing practice across a system with 11,000 registered nurses and 800 advanced practice nurses. The challenge involved providing firm direction on delivering PFCC that was appropriate for all clinical disciplines and could be implemented quickly across existing practices and technologies. Successful implementation required full engagement in the concept of PFCC by what the Institute for Healthcare Improvement has termed the 'hearts and minds' of nurses. To achieve these ends, development of a systemwide nursing practice model was initiated. In this article the authors identify the essence of PFCC, consider barriers to PFCC, review their process of developing PFCC, and describe how the Cleveland Clinic health system has implemented a PFCC nursing practice model. In doing so the authors explore how the concept of 'Passion for Nursing' was used to stimulate nurse engagement in PFCC.
Efficient Implementation of MrBayes on Multi-GPU
Zhou, Jianfu; Liu, Xiaoguang; Wang, Gang
2013-01-01
MrBayes, using Metropolis-coupled Markov chain Monte Carlo (MCMCMC or (MC)3), is a popular program for Bayesian inference. As a leading method of using DNA data to infer phylogeny, the (MC)3 Bayesian algorithm and its improved and parallel versions are now not fast enough for biologists to analyze massive real-world DNA data. Recently, the graphics processing unit (GPU) has shown its power as a coprocessor (or rather, an accelerator) in many fields. This article describes an efficient implementation, a(MC)3 (aMCMCMC), of MrBayes (MC)3 on compute unified device architecture (CUDA). By dynamically adjusting the task granularity to adapt to input data size and hardware configuration, it makes full use of GPU cores with different data sets. An adaptive method is also developed to split and combine DNA sequences to make full use of a large number of GPU cards. Furthermore, a new “node-by-node” task scheduling strategy is developed to improve concurrency, and several optimizing methods are used to reduce extra overhead. Experimental results show that a(MC)3 achieves up to 63× speedup over serial MrBayes on a single machine with one GPU card, up to 170× speedup with four GPU cards, and up to 478× speedup with a 32-node GPU cluster. a(MC)3 is dramatically faster than all the previous (MC)3 algorithms and scales well to large GPU clusters. PMID:23493260
Efficient implementation of MrBayes on multi-GPU.
Bao, Jie; Xia, Hongju; Zhou, Jianfu; Liu, Xiaoguang; Wang, Gang
2013-06-01
MrBayes, using Metropolis-coupled Markov chain Monte Carlo (MCMCMC, or (MC)³), is a popular program for Bayesian inference. As a leading method of using DNA data to infer phylogeny, the (MC)³ Bayesian algorithm and its improved and parallel versions are still not fast enough for biologists to analyze massive real-world DNA data. Recently, the graphics processing unit (GPU) has shown its power as a coprocessor (or rather, an accelerator) in many fields. This article describes an efficient implementation, a(MC)³ (aMCMCMC), of MrBayes (MC)³ on the compute unified device architecture (CUDA). By dynamically adjusting the task granularity to adapt to the input data size and hardware configuration, it makes full use of GPU cores with different data sets. An adaptive method is also developed to split and combine DNA sequences to make full use of a large number of GPU cards. Furthermore, a new "node-by-node" task scheduling strategy is developed to improve concurrency, and several optimization methods are used to reduce extra overhead. Experimental results show that a(MC)³ achieves up to 63× speedup over serial MrBayes on a single machine with one GPU card, up to 170× speedup with four GPU cards, and up to 478× speedup with a 32-node GPU cluster. a(MC)³ is dramatically faster than all previous (MC)³ algorithms and scales well to large GPU clusters.
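The Metropolis-coupled scheme described above runs several "heated" chains in parallel and periodically proposes state swaps between them. A minimal sketch of the standard (MC)³ swap-acceptance rule follows; it illustrates the textbook algorithm, not code from the a(MC)³ implementation, and the function name is illustrative:

```python
import math

def swap_accept_prob(loglik_i, loglik_j, beta_i, beta_j):
    """Chain-swap acceptance in Metropolis-coupled MCMC ((MC)^3).

    Chain k samples a 'heated' posterior proportional to exp(beta_k * loglik);
    a proposed swap of states between chains i and j is accepted with
    probability min(1, [p_i(x_j) p_j(x_i)] / [p_i(x_i) p_j(x_j)]).
    """
    log_ratio = (beta_i - beta_j) * (loglik_j - loglik_i)
    return math.exp(min(0.0, log_ratio))
```

Swaps that move a better-scoring state into the colder chain are always accepted; the reverse is accepted with exponentially decaying probability, which is what lets hot chains ferry the cold chain across likelihood valleys.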
Calibration of a complex activated sludge model for the full-scale wastewater treatment plant.
Liwarska-Bizukojc, Ewa; Olejnik, Dorota; Biernacki, Rafal; Ledakowicz, Stanislaw
2011-08-01
In this study, the results of calibrating the complex activated sludge model implemented in BioWin software for a full-scale wastewater treatment plant are presented. As part of the calibration, a sensitivity analysis of the model parameters and of the carbonaceous substrate fractions was performed. In both the steady-state and dynamic calibrations, good agreement between the measured and simulated values of the output variables was achieved. Based on calculations of the normalized sensitivity coefficient (S(i,j)), 17 (steady-state) or 19 (dynamic conditions) kinetic and stoichiometric parameters proved sensitive. Most of them are associated with the growth and decay of ordinary heterotrophic organisms and phosphorus-accumulating organisms. The rankings of the ten most sensitive parameters, established from calculations of the mean square sensitivity measure (δ(msqr)j), indicate that the parameter sensitivities agree irrespective of whether steady-state or dynamic calibration was performed.
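The two sensitivity measures named above can be illustrated with a small one-at-a-time finite-difference sketch. The toy model and function names are hypothetical; BioWin computes these quantities internally for the full ASM parameter set:

```python
def normalized_sensitivity(model, theta, j, rel_step=0.01):
    # S_{i,j} ≈ (Δy_i/Δθ_j)·(θ_j/y_i): relative change of output i per
    # relative change of parameter j, from a one-at-a-time perturbation.
    y0 = model(theta)
    perturbed = list(theta)
    perturbed[j] *= 1.0 + rel_step
    y1 = model(perturbed)
    return [(y1[i] - y0[i]) / (rel_step * y0[i]) for i in range(len(y0))]

def mean_square_sensitivity(s_column):
    # δ(msqr)j = sqrt(mean over outputs i of S_{i,j}^2), used to rank
    # parameters by their overall influence on the outputs.
    return (sum(s * s for s in s_column) / len(s_column)) ** 0.5

# Toy two-output model: y = [theta0^2, theta0*theta1]
toy = lambda t: [t[0] ** 2, t[0] * t[1]]
S0 = normalized_sensitivity(toy, [2.0, 3.0], 0)  # analytically [2, 1]
```

Ranking parameters by the mean square measure over all outputs is what makes the steady-state and dynamic rankings directly comparable.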
An organized approach to the control of hazards to health at work.
Molyneux, M K; Wilson, H G
1990-04-01
Shell U.K. has an approach which facilitates the implementation of its occupational hygiene programme in its many locations. The main elements of the system are Company Policy, Standards, Methods and Management. The Policy sets the scene and is rigorous in its aims. The new COSHH legislation has emphasized particular duties which have influenced the approach. The Company Occupational Health Guidelines [Guidelines on Health at Work for Shell in the U.K. Shell U.K. Ltd, London (1989)] set the standards for control of exposure, among other things, and the Company adopts appropriate methods to achieve them. Of particular note is the Company's COSHH Programme [Implementation of the Shell U.K. Policy on the Control of Substances Hazardous to Health. Shell U.K. Ltd, London (1989)] which applies to all hazards to health (including physical and biological agents) in the workplace. Its introduction has been given full corporate support and is in the process of implementation. Appropriate procedures have been introduced for assessments of risk and for work histories. Guidance has been given on competence, reflecting a philosophy based on a team approach using local resources to the full, supported by corporate resources as required. The awards of the British Examining and Registration Board in Occupational Hygiene (1987) are used as the professional standard. Because of difficulties in obtaining basic hazard data, an internal core hazard data system (CHADS) [Core Hazard Data System. Shell U.K Ltd, London (1989)] has been introduced. The whole programme is managed through Occupational Hygiene Focal Points (OHFP) which represent local activities but also participate in corporate strategy. Through them the multidisciplinary approach is promoted, working in conjunction with local and sector Medical Advisers. Work done by the central Occupational Hygiene Unit is recorded and the reports are used for time management and recovery of costs.
In its entirety, the approach is being used successfully to implement a comprehensive occupational hygiene programme in a diversified and dispersed industrial organization.
NASA Astrophysics Data System (ADS)
Dewi, L. P.; Djohar, A.
2018-04-01
This study examines the implementation of the 2013 Curriculum in the Chemistry subject and aims to determine the effect of teacher performance on chemistry learning achievement. The research design involves teacher performance as the independent variable and chemistry learning achievement, covering the knowledge and skill domains, as the dependent variable. The subjects of this research are chemistry teachers and high school students in Bandung City. The research data were obtained from a questionnaire on teacher performance assessed by students and from chemistry learning achievement in the students' reports. Data were analyzed using a MANOVA test. The multivariate significance test shows a significant effect of teacher performance on chemistry learning achievement in the knowledge and skill domains, with a medium effect size.
NASA Astrophysics Data System (ADS)
Bagci, Fulya; Akaoglu, Baris
2018-05-01
In this study, a classical analogue of electromagnetically induced transparency (EIT) that is completely independent of the polarization direction of the incident waves is numerically and experimentally demonstrated. The unit cell of the employed planar symmetric metamaterial structure consists of one square ring resonator and four split ring resonators (SRRs). Two different designs are implemented in order to achieve a narrow-band and wide-band EIT-like response. In the unit cell design, a square ring resonator is shown to serve as a bright resonator, whereas the SRRs behave as a quasi-dark resonator, for the narrow-band (0.55 GHz full-width at half-maximum bandwidth around 5 GHz) and wide-band (1.35 GHz full-width at half-maximum bandwidth around 5.7 GHz) EIT-like metamaterials. The observed EIT-like transmission phenomenon is theoretically explained by a coupled-oscillator model. Within the transmission window, steep changes of the phase result in high group delays and the delay-bandwidth products reach 0.45 for the wide-band EIT-like metamaterial. Furthermore, it has been demonstrated that the bandwidth and group delay of the EIT-like band can be controlled by changing the incidence angle of electromagnetic waves. These features enable the proposed metamaterials to achieve potential applications in filtering, switching, data storing, and sensing.
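The coupled-oscillator model invoked above can be sketched numerically: a driven "bright" resonator coupled to a weakly damped "dark" one develops a sharp dip in its response at the dark resonance, which opens the transparency window. All parameter values below are illustrative, not fitted to the metamaterial in the paper:

```python
def bright_amplitude(omega, wb, gb, wd, gd, k):
    # Steady state of the coupled pair (unit harmonic drive on the bright mode):
    #   x_b'' + gb*x_b' + wb^2*x_b + k*x_d = exp(-i*omega*t)
    #   x_d'' + gd*x_d' + wd^2*x_d + k*x_b = 0
    # With x ~ exp(-i*omega*t): x_b = D_d / (D_b*D_d - k^2),
    # where D = w0^2 - omega^2 - 1j*g*omega for each resonator.
    Db = wb**2 - omega**2 - 1j * gb * omega
    Dd = wd**2 - omega**2 - 1j * gd * omega
    return Dd / (Db * Dd - k**2)

# At the dark resonance, destructive interference suppresses the bright mode:
on_res = abs(bright_amplitude(1.00, 1.0, 0.10, 1.0, 0.001, 0.1))   # small
off_res = abs(bright_amplitude(0.95, 1.0, 0.10, 1.0, 0.001, 0.1))  # large
```

The steep phase variation of x_b across this narrow window is what produces the high group delays reported for the EIT-like band.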
Childhood vaccination: achievements and challenges.
Ndumbe, P
1996-09-01
As the goal of eradicating smallpox was being met, the World Health Organization created its Expanded Programme on Immunisation (EPI) in 1974 and reached its initial goal of achieving full vaccination of 80% of the world's children by 1990. This effort was aided by the creation of "cold chain" delivery systems and resulted in the annual saving of 3.5 million children in less-developed countries. Current EPI vaccination goals include 1) eradication of poliomyelitis by the year 2000, 2) elimination of neonatal tetanus by the year 1995, 3) control of measles and hepatitis B, and 4) immunization of 90% of the world's children 1 year or younger by the year 2000. Goals of the Children's Vaccine Initiative (formed in 1991) include 1) provision of an adequate supply of affordable, safe, and effective vaccines; 2) production of improved and new vaccines; and 3) simplification of the logistics of vaccine delivery. Future challenges are to sustain high vaccination coverage, reach the unreached, achieve proper storage of vaccines and reduce waste, integrate new vaccines into national programs, and achieve vaccine self-sufficiency. The fact that these challenges will be difficult to achieve is illustrated by the situation in Africa where the high immunization levels achieved in 1990 have dropped dramatically. Those who must act to implement immunization programs are health personnel, families, governments, and development partners. In order to achieve equity in health, every child must be reached, governments must be made accountable for programs, health workers must convince families of the importance of vaccination, delivery systems must be in place to take advantage of the new vaccines being delivered, and a multisectoral approach must be taken to assure sustainability.
Operationalizing the Student Electronic Portfolio for Doctoral Nursing Education.
Willmarth-Stec, Melissa; Beery, Teresa
2015-01-01
There is an increasing trend toward use of the electronic portfolio (e-portfolio) in Doctor of Nursing Practice programs. E-portfolios can provide documentation of competencies and achievement of program outcomes while showcasing a holistic view of student achievement. Implementation of the e-portfolio requires careful decision making concerning software selection, set-up, portfolio components, and evaluation. The purpose of this article is to describe the implementation of an e-portfolio in a Doctor of Nursing Practice program and to share lessons learned during the implementation stage.
A Guide to Structured Illumination TIRF Microscopy at High Speed with Multiple Colors
Young, Laurence J.; Ströhl, Florian; Kaminski, Clemens F.
2016-01-01
Optical super-resolution imaging with structured illumination microscopy (SIM) is a key technology for the visualization of processes at the molecular level in the chemical and biomedical sciences. Although commercial SIM systems are available, systems that are custom designed in the laboratory can outperform commercial systems, the latter typically designed for ease of use and general purpose applications, both in terms of imaging fidelity and speed. This article presents an in-depth guide to building a SIM system that uses total internal reflection (TIR) illumination and is capable of imaging at up to 10 Hz in three colors at a resolution reaching 100 nm. Due to the combination of SIM and TIRF, the system provides better image contrast than rival technologies. To achieve these specifications, several optical elements are used to enable automated control over the polarization state and spatial structure of the illumination light for all available excitation wavelengths. Full details on hardware implementation and control are given to achieve synchronization between excitation light pattern generation, wavelength, polarization state, and camera control with an emphasis on achieving maximum acquisition frame rate. A step-by-step protocol for system alignment and calibration is presented and the achievable resolution improvement is validated on ideal test samples. The capability for video-rate super-resolution imaging is demonstrated with living cells. PMID:27285848
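The roughly twofold resolution gain of linear SIM over the widefield diffraction limit can be sketched with the Abbe relation; the values below are generic illustrations (488 nm excitation, NA 1.49 TIRF objective are assumed numbers, not specifications from the article):

```python
def sim_resolution_nm(wavelength_nm, na):
    # Abbe limit d = lambda / (2*NA) for widefield imaging; linear SIM
    # roughly halves it, because the structured illumination pattern adds
    # spatial frequencies up to the objective's own cutoff.
    widefield = wavelength_nm / (2.0 * na)
    return widefield, widefield / 2.0

wf, sim = sim_resolution_nm(488.0, 1.49)  # ~164 nm widefield, ~82 nm SIM
```

This is consistent in order of magnitude with the ~100 nm resolution quoted above; the exact achievable figure depends on pattern contrast, wavelength, and reconstruction settings.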
ERIC Educational Resources Information Center
Patrick, Susan McClendon
2013-01-01
The purpose of this study was to conduct a meta-analysis of dissertation research that examined the implementation of professional learning communities (PLCs) and student achievement in preK-12 schools. An exhaustive search for such unpublished studies was conducted using the following criteria: 1) the studies were available on dissertation…
ERIC Educational Resources Information Center
Türk, Cumhur; Kalkan, Hüseyin; Iskeleli', Nazan Ocak; Kiroglu, Kasim
2016-01-01
The purpose of this study is to examine the effects of an astronomy summer project implemented in different learning activities on elementary school students, pre-service elementary teachers and in-service teachers' astronomy achievement and their attitudes to astronomy field. This study is the result of a five-day, three-stage, science school,…
ERIC Educational Resources Information Center
Coffey, Debra J.
2013-01-01
This dissertation uses data from the evaluation of a Striving Readers project to examine the associations between levels of implementation of different components of Scholastic's "READ 180" and student achievement as measured on the Iowa Test of Basic Skills (ITBS) reading assessment. The approach was hierarchical linear modeling using…
Decreased Surgical Site Infection Rate in Hysterectomy: Effect of a Gynecology-Specific Bundle.
Andiman, Sarah E; Xu, Xiao; Boyce, John M; Ludwig, Elizabeth M; Rillstone, Heidi R W; Desai, Vrunda B; Fan, Linda L
2018-06-01
We implemented a hysterectomy-specific surgical site infection prevention bundle after a higher-than-expected surgical site infection rate was identified at our institution. We evaluate how this bundle affected the surgical site infection rate, length of hospital stay, and 30-day postoperative readmission rate. This is a quality improvement study featuring retrospective analysis of a prospectively implemented, multidisciplinary team-designed surgical site infection prevention bundle that consisted of chlorhexidine-impregnated preoperative wipes, standardized aseptic surgical preparation, standardized antibiotic dosing, perioperative normothermia, surgical dressing maintenance, and direct feedback to clinicians when the protocol was breached. There were 2,099 hysterectomies completed during the 33-month study period. There were 61 surgical site infections (4.51%) in the pre-full bundle implementation period and 14 (1.87%) in the post-full bundle implementation period; we found a sustained reduction in the proportion of patients experiencing surgical site infection during the last 8 months of the study period. After adjusting for clinical characteristics, patients who underwent surgery after full implementation were less likely to develop a surgical site infection (adjusted odds ratio [OR] 0.46, P=.01) than those undergoing surgery before full implementation. Multivariable regression analysis showed no statistically significant difference in postoperative days of hospital stay (adjusted mean ratio 0.95, P=.09) or rate of readmission for surgical site infection-specific indication (adjusted OR 2.65, P=.08) between the before and after full-bundle implementation periods. The multidisciplinary implementation of a gynecologic perioperative surgical site infection prevention bundle was associated with a significant reduction in surgical site infection rate in patients undergoing hysterectomy.
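The crude (unadjusted) effect behind the reported adjusted odds ratio can be reconstructed from the abstract's rates; the denominators below are back-calculated from the reported percentages (61/4.51% ≈ 1,353 pre-bundle cases; 14/1.87% ≈ 749 post-bundle cases) and are illustrative, not the study's exact group sizes:

```python
def odds_ratio(events_a, total_a, events_b, total_b):
    # Unadjusted odds ratio: odds of the event in group A over group B.
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# Post- vs pre-implementation SSI odds (counts approximated from the rates):
or_post_vs_pre = odds_ratio(14, 749, 61, 1353)  # ~0.40, near the adjusted 0.46
```

The adjusted OR of 0.46 reported in the study additionally controls for clinical characteristics, so it differs from this crude estimate.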
Automatic Near-Real-Time Image Processing Chain for Very High Resolution Optical Satellite Data
NASA Astrophysics Data System (ADS)
Ostir, K.; Cotar, K.; Marsetic, A.; Pehani, P.; Perse, M.; Zaksek, K.; Zaletelj, J.; Rodic, T.
2015-04-01
In response to the increasing need for automatic and fast satellite image processing, SPACE-SI has developed and implemented a fully automatic image processing chain, STORM, that performs all processing steps from sensor-corrected optical images (level 1) to web-delivered map-ready images and products without operator intervention. Initial development was tailored to high-resolution RapidEye images, and all crucial and most challenging parts of the planned full processing chain were developed: a module for automatic image orthorectification based on a physical sensor model and supported by an algorithm for automatic detection of ground control points (GCPs); an atmospheric correction module; a topographic correction module that combines a physical approach with the Minnaert method and utilizes an anisotropic illumination model; and modules for generating high-level products. Various parts of the chain were also implemented for WorldView-2, THEOS, Pleiades, SPOT 6, Landsat 5-8, and PROBA-V. Support for a full-frame sensor currently under development by SPACE-SI is planned. This paper focuses on the adaptation of the STORM processing chain to very high resolution multispectral images. The development concentrated on the sub-module for automatic detection of GCPs. The initially implemented two-step algorithm, which worked only with rasterized vector roads and delivered GCPs with sub-pixel accuracy for the RapidEye images, was improved with the introduction of a third step: super-fine positioning of each GCP based on a reference raster chip. The added step exploits the high spatial resolution of the reference raster to improve the final matching results and to achieve pixel accuracy on very high resolution optical satellite data as well.
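Chip-based GCP refinement of the kind described above is a template-matching problem. A minimal 1-D sketch of normalized cross-correlation follows (the real sub-module works on 2-D raster chips and refines the peak to sub-pixel accuracy; function names are illustrative):

```python
def ncc(a, b):
    # Normalized cross-correlation of two equally sized samples (in [-1, 1]).
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def best_shift(chip, line):
    # Slide the reference chip along a search line and return the shift
    # with the highest correlation score.
    w = len(chip)
    return max(range(len(line) - w + 1),
               key=lambda s: ncc(chip, line[s:s + w]))

# A bright feature in the search line is found at offset 2:
shift = best_shift([0, 1, 2, 1, 0], [0, 0, 0, 1, 2, 1, 0, 0])
```

Fitting a parabola through the correlation peak and its neighbours is one common way to push such integer-pixel matches to sub-pixel precision.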
Tomić, Sinisa; Sucić, Anita Filipović; Martinac, Adrijana Ilić
2010-01-01
European legislation for medicines places the emphasis on an assessment of quality, safety and efficacy during the procedure for the granting of marketing authorisations for medicines, in order to protect patient health. The integrated European regulatory system involves the participation of a network of experts from the agencies of the member states that takes part in the European procedures for the authorisation of medicines. On the way to full membership in the EU, candidate countries and potential candidates have to transpose and implement the European directives for medicinal products; they must also strengthen their scientific and administrative capacities. Croatia acquired good experience in implementing the simplified marketing authorisation procedure for medicines authorised in the EU pursuant to the New Collaboration Agreement between Drug Regulatory Authorities in Central and East European Countries (nCADREAC), which helps it to exchange information and prepare for the implementation of European procedures. However, there are still some provisions to transpose before actual full membership, and also dossier upgrading, in which the marketing authorisation holder has to harmonise its documentation about a medicinal product with the requirements of the directives, if a product already on the market was not previously approved in line with current European legislation. Collaboration with the European Medicines Agency (EMA) through an Instrument for Pre-Accession (IPA) provides candidate countries and potential candidates the opportunity for education and training in some regulatory activities as well as the participation of their representatives as observers in some EMA committees and working groups. Some characteristics of the national regulatory frameworks of the countries of South East Europe in their efforts to achieve harmonisation with EU legislation are presented in this paper. Copyright 2010 Elsevier Inc. All rights reserved.
25 CFR 1000.408 - Can a Tribe/Consortium use Federal supply sources in the performance of an AFA?
Code of Federal Regulations, 2010 CFR
2010-04-01
... INDIAN SELF-DETERMINATION AND EDUCATION ACT Miscellaneous Provisions Supply Sources § 1000.408 Can a... Tribe/Consortium to resolve any barriers to full implementation that may arise. While implementation of... assist the Tribes/Consortia to resolve any barriers to full implementation that may arise to the fullest...
1998-05-22
This Notice of Proposed Rulemaking (NPRM) is an important step in the Commission's effort to increase the accessibility of telecommunications services and equipment to Americans with disabilities. The NPRM proposes a framework for implementing section 255 of the Communications Act of 1934 (Act), which requires telecommunications equipment manufacturers and service providers to ensure that their equipment and services are accessible to persons with disabilities, to the extent it is readily achievable to do so. In addition, if accessibility is not readily achievable, section 255 requires manufacturers and service providers to ensure compatibility with existing peripheral devices or specialized customer premises equipment commonly used by individuals with disabilities to achieve access, to the extent it is readily achievable to do so. The NPRM first explores the Commission's legal authority to establish rules implementing section 255. The NPRM then seeks comment on the interpretation of specific statutory terms that are relevant to the proceeding. Finally, the NPRM seeks comment on proposals to implement and enforce the requirement that telecommunications equipment and services be made accessible to the extent readily achievable. The actions proposed in the NPRM are needed to ensure that people with disabilities are not left behind in the telecommunications revolution and consequently isolated from contemporary life.
Black, Robert E; Taylor, Carl E; Arole, Shobha; Bang, Abhay; Bhutta, Zulfiqar A; Chowdhury, A Mushtaque R; Kirkwood, Betty R; Kureshy, Nazo; Lanata, Claudio F; Phillips, James F; Taylor, Mary; Victora, Cesar G; Zhu, Zonghan; Perry, Henry B
2017-01-01
Background The contributions that community-based primary health care (CBPHC) and engaging with communities as valued partners can make to the improvement of maternal, neonatal and child health (MNCH) are not widely appreciated. This unfortunate reality is one of the reasons why so many priority countries failed to achieve the health-related Millennium Development Goals by 2015. This article provides a summary of a series of articles about the effectiveness of CBPHC in improving MNCH and offers recommendations from an Expert Panel for strengthening CBPHC that were formulated in 2008 and have been updated on the basis of more recent evidence. Methods An Expert Panel convened to guide the review of the effectiveness of CBPHC. The Expert Panel met in 2008 in New York City with senior UNICEF staff. In 2016, following the completion of the review, the Panel considered the review's findings and made recommendations. The review consisted of an analysis of 661 unique reports, including 583 peer-reviewed journal articles, 12 books/monographs, 4 book chapters, and 72 reports from the gray literature. The analysis consisted of 700 assessments, since 39 reports were analyzed twice (once for an assessment of improvements in neonatal and/or child health and once for an assessment in maternal health). Results The Expert Panel recommends that CBPHC should be a priority for strengthening health systems, accelerating progress in achieving universal health coverage, and ending preventable child and maternal deaths. The Panel also recommends that expenditures for CBPHC be monitored against expenditures for primary health care facilities and hospitals and reflect the importance of CBPHC for averting mortality.
Governments, government health programs, and NGOs should develop health systems that respect and value communities as full partners and work collaboratively with them in building and strengthening CBPHC programs, through engagement with planning, implementation (including the full use of community-level workers), and evaluation. CBPHC programs need to reach every community and household in order to achieve universal coverage of key evidence-based interventions that can be implemented in the community outside of health facilities and assure that those most in need are reached. Conclusions Stronger CBPHC programs that foster community engagement/empowerment with the implementation of evidence-based interventions will be essential for achieving universal coverage of health services by 2030 (as called for by the Sustainable Development Goals recently adopted by the United Nations), ending preventable child and maternal deaths by 2030 (as called for by the World Health Organization, UNICEF, and many countries around the world), and eventually achieving Health for All as envisioned at the International Conference on Primary Health Care in 1978. Stronger CBPHC programs can also create entry points and synergies for expanding the coverage of family planning services as well as for accelerating progress in the detection and treatment of HIV/AIDS, tuberculosis, malaria, hypertension, and other chronic diseases. Continued strengthening of CBPHC programs based on rigorous ongoing operations research and evaluation will be required, and this evidence will be needed to guide national and international policies and programs. PMID:28685046
NASA Astrophysics Data System (ADS)
Mekarina, M.; Ningsih, Y. P.
2017-09-01
This classroom action research is motivated by the fact that students' motivation and achievement in mathematics learning are low. One contributing factor is instruction that does not give students the flexibility to use the potential of the brain optimally. The aim of this research was to improve student motivation and achievement in mathematics learning by implementing a brain-based learning approach. The subjects of this research were grade XI students in a senior high school. The research consisted of two cycles. Data on student achievement were collected through tests, and on student motivation through a questionnaire. The analysis showed that implementing the brain-based learning approach can improve students' achievement and motivation in mathematics learning.
O'Campo, Patricia; Zerger, Suzanne; Gozdzik, Agnes; Jeyaratnam, Jeyagobi; Stergiopoulos, Vicky
2015-05-01
The importance of program implementation in achieving desired outcomes is well-documented, but there remains a need for concrete guidance on how to achieve fidelity to evidence-based models within dynamic local contexts. Housing First (HF), an evidence-based model for people experiencing homelessness and mental illness, provides an important test-case for such guidance; it targets a uniquely underserved subpopulation with complex needs, and is delivered by practitioners with varying knowledge and skill levels. Scientific evidence affirms HF's effectiveness, but its rapid dissemination has outpaced the ability to monitor not only whether it is being implemented with fidelity, but also how this can be achieved within variable local contexts and challenges. This qualitative study contributes to this need by capturing insights from practitioners on implementation challenges and specific strategies developed to overcome them. Findings reinforce the importance of developing HF-specific implementation guidelines, and of engaging relevant stakeholders throughout all phases of that development.
Goldstein, Mandy; Murray, Stuart B; Griffiths, Scott; Rayner, Kathryn; Podkowka, Jessica; Bateman, Joel E; Wallis, Andrew; Thornton, Christopher E
2016-11-01
Anorexia nervosa (AN) is a severe psychiatric illness with little evidence supporting treatment in adults. Among adolescents with AN, family-based treatment (FBT) is considered the first-line outpatient approach, with a growing evidence base. However, research on FBT has stemmed from specialist services in research/public health settings. This study investigated the effectiveness of FBT in a case series of adolescents with AN treated in a private practice setting. Thirty-four adolescents with full or partial AN, diagnosed according to DSM-IV criteria, participated and were assessed at pretreatment and post-treatment. Assessments included change in percent expected body weight, mood, and eating pathology. Significant weight gain was observed from pretreatment to post-treatment: 45.9% of the sample demonstrated full weight restoration, and a further 43.2% achieved partial weight-based remission. Missing data precluded an examination of change in mood and eating disorder psychopathology. Effective dissemination across different service types is important to the wider availability of evidence-based treatments. These weight restoration data lend preliminary support to the implementation of FBT in real-world treatment settings. © 2016 Wiley Periodicals, Inc. (Int J Eat Disord 2016; 49:1023-1026).
Dynamic SVL and body bias for low leakage power and high performance in CMOS digital circuits
NASA Astrophysics Data System (ADS)
Deshmukh, Jyoti; Khare, Kavita
2012-12-01
In this article, a new complementary metal-oxide-semiconductor (CMOS) design scheme called dynamic self-controllable voltage level (DSVL) is proposed. In the proposed scheme, leakage power is controlled by dynamically disconnecting the supply to inactive blocks and adjusting the body bias to further limit leakage while maintaining performance. Leakage power measurements at 1.8 V and 75°C demonstrate power reductions of 59.4% for a 1-bit full adder and 43.0% for a chain of four inverters using the SVL circuit as a power switch. Furthermore, leakage power reductions of 94.7% for the 1-bit full adder and 91.8% for the chain of four inverters are achieved using dynamic body bias. A forward body bias of 0.45 V applied in active mode improves the maximum operating frequency by 16% for the 1-bit full adder and by 5.55% for the chain of inverters. Analysis shows that additional benefits of the DSVL scheme with body bias include high performance, low leakage power consumption in sleep mode, single-threshold implementation, and state retention even in standby mode.
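The trade-off DSVL exploits, body bias shifting the threshold voltage and the threshold setting subthreshold leakage, can be sketched with the standard MOS body-effect and subthreshold-leakage relations. The parameter values below are generic textbook numbers, not those of the fabricated circuits:

```python
import math

def vth_body_effect(vth0, gamma, two_phi_f, vsb):
    # Vth = Vth0 + gamma*(sqrt(2*phiF + Vsb) - sqrt(2*phiF)).
    # Reverse body bias (Vsb > 0) raises Vth and cuts leakage in sleep mode;
    # forward bias lowers Vth for speed, as with the 0.45 V active-mode bias.
    return vth0 + gamma * (math.sqrt(two_phi_f + vsb) - math.sqrt(two_phi_f))

def leakage_ratio(delta_vth, n=1.5, v_t=0.0259):
    # Subthreshold leakage scales as exp(-Vth / (n*VT)); a threshold
    # increase of delta_vth multiplies leakage by this factor (< 1).
    return math.exp(-delta_vth / (n * v_t))

# A ~90 mV reverse-bias Vth increase cuts leakage by roughly 10x:
ratio = leakage_ratio(0.09)
```

The exponential dependence is why even modest body-bias-induced threshold shifts yield the order-of-magnitude leakage reductions reported above.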
McLaren, Susan; Woods, Leslie; Boudioni, Markella; Lemma, Ferew; Tavabie, Abdol
2008-01-01
To identify and explore leadership roles and responsibilities for implementing the workforce development strategy; to identify approaches used to implement and disseminate the strategy; and to identify and explore challenges and achievements in the first 18 months following implementation. A formative evaluation with qualitative methods was used. Documentary analysis, interviews (n = 29) and two focus groups (n = 12) were conducted with a purposive sample of individuals responsible for strategy implementation. Data were transcribed and analysed thematically using framework analysis. Regional health area in Kent, Surrey and Sussex: 24 primary care trusts (PCTs) and 900 general practices. Primary care workforce tutors, lifelong learning advisors, GP tutors, patch associate GP deans and chairs of PCT education committees all had vital leadership roles, some existing and others newly developed. Approaches used to implement the strategy encompassed working within and across organisational boundaries, communication and dissemination of information. Challenges encountered by implementers were resistance to change - evident in some negative attitudes to uptake of training and development opportunities - and role diversity and influence. Achievements included successes in embedding appraisal and protected learning time, and changes in educational practices and services. The use of key leadership roles and change-management approaches had brought about early indications of positive transition in lifelong learning cultures.
NASA Astrophysics Data System (ADS)
Cao, Jiashun; Oleyiblo, Oloche James; Xue, Zhaoxia; Otache, Y. Martins; Feng, Qian
2015-07-01
Two mathematical models were used to optimize the performance of a full-scale biological nutrient removal (BNR) activated sludge treatment plant, a plug-flow bioreactor operated in a 3-stage Phoredox process configuration, anaerobic/anoxic/oxic (A2/O). The ASM2d model, implemented on the platform of the WEST2011 software, and the BioWin activated sludge/anaerobic digestion (AS/AD) model were used in this study with the aim of consistently achieving the designed effluent criteria at a low operational cost. Four ASM2d parameters were adjusted: the reduction factor for denitrification, the maximum growth rate of heterotrophs (µH), the rate constant for stored polyphosphates in PAOs (q_PP), and the hydrolysis rate constant (k_h). Three BioWin parameters were adjusted: the aerobic decay rate (b_H), the heterotrophic dissolved oxygen (DO) half-saturation constant (K_OA), and Y_P/acetic. Calibration of the two models was successful; both models had average relative deviations (ARD) below 10% for all output variables. Low effluent concentrations of nitrate nitrogen (N-NO3), total nitrogen (TN), and total phosphorus (TP) were achieved in a full-scale BNR treatment plant with a low influent chemical oxygen demand (COD) to total Kjeldahl nitrogen (TKN) ratio (COD/TKN). The effluent total nitrogen and nitrate nitrogen concentrations were improved by 50% and energy consumption was reduced by approximately 25%, accomplished by converting the two-pass aerobic compartment of the plug-flow bioreactor into anoxic reactors operated in an alternating mode. The findings of this work are helpful in improving wastewater treatment plant operation while eliminating the cost of an external carbon source and reducing energy consumption.
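The average relative deviation used above as the calibration acceptance criterion (ARD below 10% for every output variable) is straightforward to compute; a minimal sketch with illustrative data:

```python
def average_relative_deviation(measured, simulated):
    # ARD = mean over samples of |simulated - measured| / measured, in %.
    pairs = list(zip(measured, simulated))
    return 100.0 * sum(abs(s - m) / m for m, s in pairs) / len(pairs)

# Example: measured vs simulated effluent TN (mg/L), illustrative values:
ard = average_relative_deviation([10.0, 20.0], [11.0, 19.0])  # 7.5%
```

An ARD is computed per output variable (TN, N-NO3, TP, and so on), and calibration is accepted only when all of them fall under the threshold.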
Production of EUV mask blanks with low killer defects
NASA Astrophysics Data System (ADS)
Antohe, Alin O.; Kearney, Patrick; Godwin, Milton; He, Long; John Kadaksham, Arun; Goodwin, Frank; Weaver, Al; Hayes, Alan; Trigg, Steve
2014-04-01
For full commercialization, extreme ultraviolet lithography (EUVL) technology requires the availability of EUV mask blanks that are free of defects. This remains one of the main impediments to the implementation of EUV at the 22 nm node and beyond. Consensus is building that a few small defects can be mitigated during mask patterning, but defects over 100 nm (SiO2 equivalent) in size are considered potential "killer" defects or defects large enough that the mask blank would not be usable. The current defect performance of the ion beam sputter deposition (IBD) tool will be discussed and the progress achieved to date in the reduction of large size defects will be summarized, including a description of the main sources of defects and their composition.
Monolithic integration of GMR sensors for standard CMOS-IC current sensing
NASA Astrophysics Data System (ADS)
De Marcellis, A.; Reig, C.; Cubells-Beltrán, M.-D.; Madrenas, J.; Santos, J. D.; Cardoso, S.; Freitas, P. P.
2017-09-01
In this work we report on the development of Giant Magnetoresistive (GMR) sensors for off-line current measurements in standard integrated circuits. An ASIC has been specifically designed and fabricated in the well-known AMS-0.35 μm CMOS technology, including the electronic circuitry for sensor interfacing. It implements an oscillating circuit performing a voltage-to-frequency conversion. Subsequently, a fully CMOS-compatible low temperature post-process has been applied for depositing the GMR sensing devices in a full-bridge configuration onto the buried current straps. Sensitivity and resolution of these sensors have been investigated achieving experimental results that show a detection sensitivity of about 100 Hz/mA, with a resolution of about 5 μA.
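Assuming a linear sensor response (an assumption; the abstract reports only the aggregate sensitivity and resolution figures), the reported 100 Hz/mA sensitivity and 5 µA resolution imply the following simple relationships:

```python
# Figures quoted in the abstract
sensitivity_hz_per_ma = 100.0  # detection sensitivity of the voltage-to-frequency readout
resolution_a = 5e-6            # 5 µA current resolution

# Frequency shift produced by a hypothetical 2 mA strap current:
freq_shift_hz = sensitivity_hz_per_ma * 2.0
print(freq_shift_hz)  # 200.0 Hz

# Frequency step corresponding to the 5 µA resolution (5e-6 A = 5e-3 mA):
min_freq_step_hz = sensitivity_hz_per_ma * resolution_a * 1e3
print(min_freq_step_hz)  # ~0.5 Hz
```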
[The experience of public guarantees of free-of-charge medical care foreign countries].
Ulumbekova, G E
2010-01-01
The article analyses the volumes of financing of programs of public guarantees of free-of-charge medical care, and the algorithms used to elaborate such programs, in foreign countries. In economically advanced countries, higher public health financing makes it possible to ensure, for virtually the entire population, the full free-of-charge spectrum of up-to-date medical interventions as a "public guarantees pack". It includes pharmaceutical supply in outpatient conditions and, in most cases, long-term care services. In these countries, the general trend is a transfer from fundamental principles ("everything needed") to more transparent approaches to implementing the guarantees, in order to achieve a balance between actual financial resources and the guarantees stated to the population.
A pivotal-based approach for enterprise business process and IS integration
NASA Astrophysics Data System (ADS)
Ulmer, Jean-Stéphane; Belaud, Jean-Pierre; Le Lann, Jean-Marc
2013-02-01
A company must be able to describe and react to any endogenous or exogenous event. Such flexibility can be achieved through business process management (BPM). Nevertheless, a BPM approach highlights complex relations between the business and IT domains. A non-alignment is exposed between heterogeneous models: this is the "business-IT gap" as described in the literature. Through concepts from business engineering and information systems driven by models and IT, we define a generic approach ensuring multi-view consistency. Its role is to maintain and provide all information related to the structure and semantics of models. By allowing the full return of a transformed model, in the sense of reverse engineering, our platform enables synchronisation between the analysis model and the implementation model.
Design and implementation of expert decision system in Yellow River Irrigation
NASA Astrophysics Data System (ADS)
Fuping, Wang; Bingbing, Lei; Jie, Pan
2018-03-01
Making full use of water resources in Yellow River irrigation is a problem that urgently needs to be solved. Given the different irrigation strategies required in the various growth stages of wheat, this paper proposes a novel irrigation expert decision system based on fuzzy control techniques. From control experience, expert knowledge and MATLAB simulation optimization, we obtain an irrigation fuzzy control table stored in computer memory. Irrigation control is accomplished by reading the data from the fuzzy control table. The experimental results show that the expert system can be used in wheat production to achieve timely and appropriate irrigation, ensuring that the wheat growth cycle always takes place in the best growth environment.
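The table-lookup scheme the abstract describes can be sketched as follows. The table values, moisture levels and growth-stage names here are illustrative assumptions, not the paper's actual MATLAB-derived control table:

```python
# Hypothetical precomputed fuzzy control table, indexed by a discretized
# soil-moisture deficit level and the wheat growth stage, returning an
# irrigation amount in mm. In the paper, this table is derived offline
# from expert knowledge and MATLAB simulation, then stored in memory.
FUZZY_TABLE = {
    (0, "tillering"): 0,  (1, "tillering"): 15, (2, "tillering"): 30,
    (0, "jointing"):  0,  (1, "jointing"):  25, (2, "jointing"):  45,
    (0, "filling"):   0,  (1, "filling"):   20, (2, "filling"):   35,
}

def discretize_deficit(deficit_pct):
    """Map a continuous soil-moisture deficit (%) to a coarse fuzzy level."""
    if deficit_pct < 10:
        return 0
    if deficit_pct < 30:
        return 1
    return 2

def irrigation_decision(deficit_pct, stage):
    """Runtime decision is a plain table read, as the abstract describes."""
    return FUZZY_TABLE[(discretize_deficit(deficit_pct), stage)]

print(irrigation_decision(35, "jointing"))  # 45
```

The point of the design is that the expensive fuzzy inference happens once, offline; the deployed controller only performs the cheap lookup shown above.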
Achieving Energy Efficiency Through Real-Time Feedback
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nesse, Ronald J.
2011-09-01
Through the careful implementation of simple behavior change measures, opportunities exist to achieve strategic gains, including greater operational efficiencies, energy cost savings, greater tenant health and ensuing productivity and an improved brand value through sustainability messaging and achievement.
ERIC Educational Resources Information Center
Ross, Steven M.; Nunnery, John A.; Goldfeder, Elizabeth; McDonald, Aaron; Rachor, Robert; Hornbeck, Matthew; Fleischman, Steve
2004-01-01
This research examined the effectiveness in an urban school district of 2 of the most widely used Comprehensive School Reform (CSR) programs-Direct Instruction (DI), implemented in 9 district elementary schools, and Success for All (SFA), implemented in 2 elementary schools. In examining impacts on student achievement and school change outcomes…
Charles R. Blinn; Michael A. Thompson
1996-01-01
Contains a variety of papers presented at the joint meeting of the Council on Forest Engineering and International Union of Forest Research Organizations Subject Group S3.04 and that support the meeting theme "Planning and Implementing Forest Operations to Achieve Sustainable Forests."
NASA Astrophysics Data System (ADS)
Wahyu, W.; Kurnia; Syaadah, R. S.
2018-05-01
The purpose of this study was to investigate the implementation of PBL to improve students' academic achievement and creativity on the topic of electrolyte and non-electrolyte solutions. The study was conducted as a descriptive method with a case study design. The subjects consisted of 30 students in class X. Instruments used in the study included tests and observation sheets. Changes in student achievement were calculated using the N-gain formula, and the processed data were then analyzed descriptively. The results showed that the academic achievement and creativity of students generally increased, as indicated by the N-gain values (0.667; 0.656). The results also showed a moderate correlation between academic achievement and students' creative thinking (r = 0.413), whereas the relationship between academic achievement and creativity (r = 0.340) belongs to the weak category. Implementation of PBL received a good response from students, with a percentage of 80.3%. Based on these findings, PBL is recommended for the learning process in other chemistry topics whose learning materials are suitable for the PBL stages, in order to develop students' academic achievement and creativity.
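The N-gain used above is commonly Hake's normalized gain; since the abstract does not reproduce the formula, the reading below is an assumption of the standard definition:

```python
def n_gain(pre, post, max_score=100):
    """Hake's normalized gain: (post - pre) / (max - pre).

    Values around 0.3-0.7 are conventionally read as a medium gain,
    above 0.7 as high.
    """
    return (post - pre) / (max_score - pre)

# A hypothetical student moving from 40 to 80 on a 100-point test:
print(round(n_gain(40, 80), 3))  # 0.667, matching the magnitude reported above
```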
Falloon, Ian RH; Montero, Isabel; Sungur, Mehmet; Mastroeni, Antonino; Malm, Ulf; Economou, Marina; Grawe, Rolf; Harangozo, Judit; Mizuno, Masafumi; Murakami, Masaaki; Hager, Bert; Held, Tilo; Veltro, Franco; Gedye, Robyn
2004-01-01
According to clinical trials literature, every person with a schizophrenic disorder should be provided with the combination of optimal dose antipsychotics, strategies to educate himself and his carers to cope more efficiently with environmental stresses, cognitive-behavioural strategies to enhance work and social goals and reducing residual symptoms, and assertive home-based management to help prevent and resolve major social needs and crises, including recurrent episodes of symptoms. Despite strong scientific support for the routine implementation of these 'evidence-based' strategies, few services provide more than the pharmacotherapy component, and even this is seldom applied in the manner associated with the best results in the clinical trials. An international collaborative group, the Optimal Treatment Project (OTP), has been developed to promote the routine use of evidence-based strategies for schizophrenic disorders. A field trial was started to evaluate the benefits and costs of applying evidence-based strategies over a 5-year period. Centres have been set up in 18 countries. This paper summarises the outcome after 24 months of 'optimal' treatment in 603 cases who had reached this stage in their treatment by the end of 2002. On all measures the evidence-based OTP approach achieved more than double the benefits associated with current best practices. One half of recent cases had achieved full recovery from clinical and social morbidity. These advantages were even more striking in centres where a random-control design was used. PMID:16633471
NASA Astrophysics Data System (ADS)
Zhu, Z.; Bi, J.; Wang, X.; Zhu, W.
2014-02-01
As an important sub-topic of the construction of a public information platform for natural-process carbon emission data, a WebGIS system for carbon emissions from coalfield spontaneous combustion has become an important study object. In view of the data features of coalfield spontaneous combustion carbon emissions (a wide range of rich and complex data) and their geospatial characteristics, the data are divided into attribute data and spatial data. Based on a full analysis of the data, we completed the detailed design of an Oracle database and stored the data in it. Through Silverlight rich-client technology and the extension of WCF services, we implemented dynamic web query, retrieval, statistics, analysis and other functions for the attribute data. For the spatial data, we take advantage of ArcGIS Server and the Silverlight-based API to invoke map services, GP services, image services and other services published by the GIS server in the background, implementing the display, analysis and thematic mapping of remote sensing image data and web map data for coalfield spontaneous combustion. The study found that Silverlight rich-client technology, combined with an object-oriented framework of WCF services, can be used to construct a WebGIS system efficiently. Combining this with the ArcGIS Silverlight API to achieve interactive queries of the attribute and spatial data of coalfield spontaneous combustion can greatly improve the performance of the WebGIS system. At the same time, it provides a strong guarantee for the construction of public information on China's carbon emission data.
Federal Parity and Access to Behavioral Health Care in Private Health Plans.
Hodgkin, Dominic; Horgan, Constance M; Stewart, Maureen T; Quinn, Amity E; Creedon, Timothy B; Reif, Sharon; Garnick, Deborah W
2018-04-01
The 2008 Mental Health Parity and Addiction Equity Act (MHPAEA) sought to improve access to behavioral health care by regulating health plans' coverage and management of services. Health plans have some discretion in how to achieve compliance with MHPAEA, leaving questions about its likely effects on health plan policies. In this study, the authors' objective was to determine how private health plans' coverage and management of behavioral health treatment changed after the federal parity law's full implementation. A nationally representative survey of commercial health plans was conducted in 60 market areas across the continental United States, achieving response rates of 89% in 2010 (weighted N=8,431) and 80% in 2014 (weighted N=6,974). Senior executives at responding plans were interviewed regarding behavioral health services in each year and (in 2014) regarding changes. Student's t tests were used to examine changes in services covered, cost-sharing, and prior authorization requirements for both behavioral health and general medical care. In 2014, 68% of insurance products reported having expanded behavioral health coverage since 2010. Exclusion of eating disorder coverage was eliminated between 2010 (23%) and 2014 (0%). However, more products reported excluding autism treatment in 2014 (24%) than 2010 (8%). Most plans reported no change to prior-authorization requirements between 2010 and 2014. Implementation of federal parity legislation appears to have been accompanied by continuing improvement in behavioral health coverage. The authors did not find evidence of widespread noncompliance or of unintended effects, such as dropping coverage of behavioral health care altogether.
NASA Astrophysics Data System (ADS)
Radestock, Martin; Rose, Michael; Monner, Hans Peter
2017-04-01
In most aviation applications, a major cost benefit can be achieved by a reduction of the system weight. Often the acoustic properties of the fuselage structure are not in the focus of the primary design process either. A final correction of poor acoustic properties is usually done using insulation mats in the chamber between the primary and secondary shell. It is plausible that a more sophisticated material distribution in that area can result in a substantially reduced weight. Topology optimization is a well-known approach to reducing the material of compliant structures. In this paper an adaptation of this method to acoustic problems is investigated. The gap filled with insulation mats is suitably parameterized to achieve different material distributions. To find advantageous configurations, the objective in the underlying topology optimization is chosen to obtain good acoustic pressure patterns in the aircraft cabin. An important task in the optimization is an adequate Finite Element model of the system. This can usually not be obtained from commercially available programs due to the lack of special sensitivity data with respect to the design parameters. Therefore an appropriate implementation of the algorithm has been done, exploiting the vector and matrix capabilities of the MATLAB environment. Finally some new aspects of the Finite Element implementation will also be presented, since they are interesting in their own right and can be generalized to efficiently solve other partial differential equations as well.
Generic calculation of two-body partial decay widths at the full one-loop level
NASA Astrophysics Data System (ADS)
Goodsell, Mark D.; Liebler, Stefan; Staub, Florian
2017-11-01
We describe a fully generic implementation of two-body partial decay widths at the full one-loop level in the SARAH and SPheno framework, compatible with most supported models. It incorporates fermionic decays to a fermion and a scalar or a gauge boson, as well as scalar decays into two fermions, two gauge bosons, two scalars, or a scalar and a gauge boson. We present the relevant generic expressions for virtual and real corrections. Whereas wave-function corrections are determined from on-shell conditions, the parameters of the underlying model are by default renormalised in a DR-bar (or MS-bar) scheme. However, the user can also define model-specific counter-terms. As an example we discuss the renormalisation of the electric charge in the Thomson limit for top-quark decays in the standard model. One-loop-induced decays are also supported. The framework additionally allows the addition of mass and mixing corrections induced at higher orders for the involved external states. We explain our procedure to cancel infrared divergences for such cases, which is achieved through an infrared counter-term taking into account corrected Goldstone boson vertices. We compare our results for sfermion, gluino and Higgs decays in the minimal supersymmetric standard model (MSSM) against the public codes SFOLD, FVSFOLD and HFOLD and explain observed differences. Radiatively induced gluino and neutralino decays are compared against the original implementation in SPheno in the MSSM. We exactly reproduce the results of the code CNNDecays for decays of neutralinos and charginos in R-parity violating models. The new version SARAH 4.11.0 by default includes the calculation of two-body decay widths at the full one-loop level. Current limitations for certain model classes are described.
NASA Astrophysics Data System (ADS)
Wang, Yonggang; Liu, Chong
2016-10-01
The common solution for a field programmable gate array (FPGA)-based time-to-digital converter (TDC) is to construct a tapped delay line (TDL) for time interpolation, yielding sub-clock time resolution. The granularity and uniformity of the TDL's delay elements determine the TDC time resolution. In this paper, we propose a dual-sampling TDL architecture and a bin decimation method that make the delay elements as small and uniform as possible, so that the implemented TDCs can achieve a high time resolution beyond the intrinsic cell delay. Two identical, fully hardware-based TDCs were implemented in a Xilinx UltraScale FPGA for performance evaluation. For fixed time intervals in the range from 0 to 440 ns, the average time-interval RMS resolution measured between the two TDCs is 4.2 ps; the timestamp resolution of a single TDC is thus derived as 2.97 ps. The maximum hit rate of the TDC is as high as half the system clock rate of the FPGA, namely 250 MHz in our demo prototype. Because conventional online bin-by-bin calibration is not needed, the implementation of the proposed TDC is straightforward and relatively resource-saving.
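The step from the 4.2 ps two-TDC interval RMS to the 2.97 ps single-TDC figure is consistent with dividing by √2, under the assumption (ours, not stated explicitly in the abstract) that the two identical TDCs contribute equal, independent jitter:

```python
import math

# If each TDC adds independent timestamp jitter sigma, an interval measured
# between the two TDCs has RMS sigma * sqrt(2); invert that relation.
interval_rms_ps = 4.2                         # measured between the two identical TDCs
single_tdc_ps = interval_rms_ps / math.sqrt(2)
print(round(single_tdc_ps, 2))  # 2.97
```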
Kant, Nasir Ali; Dar, Mohamad Rafiq; Khanday, Farooq Ahmad
2015-01-01
The output of every neuron in a neural network is specified by the employed activation function (AF), which therefore forms the heart of neural networks. As far as the design of artificial neural networks (ANNs) is concerned, the hardware approach is preferred over the software one because it promises full utilization of the application potential of ANNs. Therefore, besides some arithmetic blocks, designing the AF in hardware is the most important part of designing an ANN. While attempting to design the AF in hardware, the designs should be compatible with modern Very Large Scale Integration (VLSI) design techniques. In this regard, the implemented designs should: be realized only in Metal Oxide Semiconductor (MOS) technology in order to be compatible with digital designs, provide an electronic tunability feature, and be able to operate at ultra-low voltage. Companding is one of the promising circuit design techniques for achieving these goals. In this paper, a 0.5 V design of Liao's AF using the sinh-domain technique is introduced. Furthermore, the function is tested by implementing an inertial neuron model. The performance of the AF and the inertial neuron model has been evaluated through simulation results, using the PSPICE software with the MOS transistor models provided by the 0.18-μm Taiwan Semiconductor Manufacturing Company (TSMC) CMOS process.
Zozaya, Carlos; Triana, Miryam; Madero, Rosario; Abrams, Steven; Martinez, Leopoldo; Amesty, Maria Virginia; Pipaón, Miguel Sáenz de
2017-10-01
Introduction: The objective of the study is to examine the factors associated with time to achieve full enteral feeding after repair of congenital diaphragmatic hernia. Materials and Methods: Demographic, clinical, and therapeutic data were retrospectively assessed, and uni- and multivariate Cox regressions were performed to examine factors predictive of achieving full enteral feeding, defined as the time to achieve 120 mL/kg/d after surgical repair. Results: Of 78 infants, 66 underwent intervention before hospital discharge. All infants who survived had reached full enteral feeding at the time of hospital discharge, by a median of 22 days (range: 2-119 days) after surgery and 10 days (range: 1-91) after initiation of postoperative enteral feedings. Independent risk factors associated with a longer time to reach full enteral feeding included gastroesophageal reflux and days of antibiotics in the postoperative period. Daily stool passage preoperatively predicted earlier enteral tolerance. Conclusion: Infants who survive congenital diaphragmatic hernia are generally able to achieve full enteral feedings after surgical repair. A longer time to full feeding is needed in the most severe cases, but some specific characteristics can help identify patients at higher risk. Although some of these characteristics are unavoidable, others, including rational antibiotic usage and active gastroesophageal reflux prevention and treatment, are feasible and may improve enteral tolerance.
Surgeon Reimbursements in Maxillofacial Trauma Surgery: Effect of the Affordable Care Act in Ohio.
Khansa, Ibrahim; Khansa, Lara; Pearson, Gregory D
2016-02-01
Surgical treatment of maxillofacial injuries has historically been associated with low reimbursements, mainly because of the high proportion of uninsured patients. The Affordable Care Act, implemented in January of 2014, aimed to reduce the number of uninsured. If the Affordable Care Act achieves this goal, surgeons may benefit from improved reimbursement rates. The authors' purpose was to evaluate the effects of the Affordable Care Act on payor distribution and surgeon reimbursements for maxillofacial trauma surgery at their institution. A review of all patients undergoing surgery for maxillofacial trauma between January of 2012 and December of 2014 was conducted. Insurance status, and amounts billed and collected by the surgeon, were recorded. Patients treated before implementation of the Affordable Care Act were compared to those treated after. Five hundred twenty-three patients were analyzed. Three hundred thirty-four underwent surgery before implementation of the Affordable Care Act, and 189 patients underwent surgery after. After implementation of the Affordable Care Act, the proportion of uninsured decreased (27.2 percent to 11.1 percent; p < 0.001) and the proportion of patients on Medicaid increased (7.8 percent to 25.4 percent; p < 0.001). Overall surgeon reimbursement rate increased from 14.3 percent to 19.8 percent (p < 0.001). After implementation of the Affordable Care Act, we observed a significant reduction in the proportion of maxillofacial trauma patients who were uninsured. Surgeons' overall reimbursement rate increased. These trends should be followed over a longer term to determine the full effect of the Affordable Care Act.
Large-scale seismic waveform quality metric calculation using Hadoop
Magana-Zook, Steven; Gaylord, Jessie M.; Knapp, Douglas R.; ...
2016-05-27
Here in this work we investigated the suitability of Hadoop MapReduce and Apache Spark for large-scale computation of seismic waveform quality metrics by comparing their performance with that of a traditional distributed implementation. The Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) provided 43 terabytes of broadband waveform data, of which 5.1 TB were processed with the traditional architecture, and the full 43 TB were processed using MapReduce and Spark. Maximum performance of ~0.56 terabytes per hour was achieved using all 5 nodes of the traditional implementation. We noted that I/O dominated processing, and that I/O performance was deteriorating with the addition of the 5th node. Data collected from this experiment provided the baseline against which the Hadoop results were compared. Next, we processed the full 43 TB dataset using both MapReduce and Apache Spark on our 18-node Hadoop cluster. We conducted these experiments multiple times with various subsets of the data so that we could build models to predict performance as a function of dataset size. We found that both MapReduce and Spark significantly outperformed the traditional reference implementation. At a dataset size of 5.1 terabytes, both Spark and MapReduce were about 15 times faster than the reference implementation. Furthermore, our performance models predict that for a dataset of 350 terabytes, Spark running on a 100-node cluster would be about 265 times faster than the reference implementation. We do not expect that the reference implementation deployed on a 100-node cluster would perform significantly better than on the 5-node cluster, because the I/O performance cannot be made to scale.
Finally, we note that although Big Data technologies clearly provide a way to process seismic waveform datasets in a high-performance and scalable manner, the technology is still rapidly changing, requires a high degree of investment in personnel, and will likely require significant changes in other parts of our infrastructure. Nevertheless, we anticipate that as the technology matures and third-party tool vendors make it easier to manage and operate clusters, Hadoop (or a successor) will play a large role in our seismic data processing.
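The quoted figures can be sanity-checked with simple arithmetic. The sketch below uses only numbers stated in the abstract; the derived wall-clock times are our back-of-the-envelope estimates, not measurements reported by the authors:

```python
# Throughput of the traditional reference implementation (best case, 5 nodes)
reference_throughput_tb_per_hr = 0.56
full_dataset_tb = 43.0

# Time the reference implementation would need for the full 43 TB dataset:
hours_full_dataset = full_dataset_tb / reference_throughput_tb_per_hr
print(round(hours_full_dataset, 1))  # 76.8 hours

# At the measured ~15x speedup for the 5.1 TB subset, Spark/MapReduce would take:
subset_tb = 5.1
spark_hours_subset = (subset_tb / reference_throughput_tb_per_hr) / 15
print(round(spark_hours_subset, 2))  # 0.61 hours
```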
Algorithmic synthesis using Python compiler
NASA Astrophysics Data System (ADS)
Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej
2015-09-01
This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. By using a higher level of abstraction and a High-Level Synthesis compiler, implementation time can be reduced. The compiler has been implemented using the Python language. This article describes the design, implementation and results of the created tool.
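The core translation step of such a compiler can be sketched with Python's standard `ast` module. This toy translator handles only bitwise expressions on named signals and is an illustration of the approach, not the paper's compiler:

```python
import ast

# Map Python bitwise operators to their VHDL logical-operator keywords.
VHDL_OPS = {ast.BitAnd: "and", ast.BitOr: "or", ast.BitXor: "xor"}

def expr_to_vhdl(node):
    """Recursively render a parsed Python expression as VHDL."""
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.BinOp):
        op = VHDL_OPS[type(node.op)]
        return f"({expr_to_vhdl(node.left)} {op} {expr_to_vhdl(node.right)})"
    raise NotImplementedError(f"unsupported node: {type(node).__name__}")

def assign_to_vhdl(src):
    """Translate a single Python assignment into a VHDL signal assignment."""
    stmt = ast.parse(src).body[0]   # expect exactly one assignment statement
    target = stmt.targets[0].id
    return f"{target} <= {expr_to_vhdl(stmt.value)};"

print(assign_to_vhdl("y = a & b | c"))  # y <= ((a and b) or c);
```

A real compiler of this kind must additionally handle types, loops, process generation and scheduling; the sketch only shows how Python's own parser gives the front end essentially for free.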
Azrin, Susan T; Huskamp, Haiden A; Azzone, Vanessa; Goldman, Howard H; Frank, Richard G; Burnam, M Audrey; Normand, Sharon-Lise T; Ridgely, M Susan; Young, Alexander S; Barry, Colleen L; Busch, Alisa B; Moran, Garrett
2007-02-01
The Federal Employees Health Benefits Program implemented full mental health and substance abuse parity in January 2001. Evaluation of this policy revealed that parity increased adult beneficiaries' financial protection by lowering mental health and substance abuse out-of-pocket costs for service users in most plans studied but did not increase rates of service use or spending among adult service users. This study examined the effects of full mental health and substance abuse parity for children. Employing a quasiexperimental design, we compared children in 7 Federal Employees Health Benefits plans from 1999 to 2002 with children in a matched set of plans that did not have a comparable change in mental health and substance abuse coverage. Using a difference-in-differences analysis, we examined the likelihood of child mental health and substance abuse service use, total spending among child service users, and out-of-pocket spending. The apparent increase in the rate of children's mental health and substance abuse service use after implementation of parity was almost entirely due to secular trends of increased service utilization. Estimates for children's mental health and substance abuse spending conditional on this service use showed significant decreases in spending per user attributable to parity for 2 plans; spending estimates for the other plans were not statistically significant. Children using these services in 3 of 7 plans experienced statistically significant reductions in out-of-pocket spending attributable to the parity policy, and the average dollar savings was sizeable for users in those 3 plans. In the remaining 4 plans, out-of-pocket spending also decreased, but these decreases were not statistically significant. 
Full mental health and substance abuse parity for children, within the context of managed care, can achieve equivalence of benefits in health insurance coverage and improve financial protection without adversely affecting health care costs but may not expand access for children who need these services.
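The difference-in-differences logic behind these estimates can be sketched in a few lines. The numbers below are illustrative placeholders, not the study's data: the parity effect is the change in the parity plans minus the change in the comparison plans, which nets out the secular trend in service use.

```python
# Minimal difference-in-differences sketch (illustrative numbers only):
# subtracting the control group's pre/post change removes the shared
# secular trend, isolating the change attributable to the parity policy.

def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical mean annual out-of-pocket spending per child service user ($):
effect = diff_in_diff(treat_pre=320.0, treat_post=250.0,
                      ctrl_pre=310.0, ctrl_post=290.0)
print(effect)  # -50.0: a $50 reduction attributable to the policy
```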
DOT National Transportation Integrated Search
2009-11-01
Highway agencies across the nation are moving towards implementation of the new AASHTO Mechanistic-Empirical Pavement Design Guide (MEPDG) for pavement design. The benefits of implementing the MEPDG for routine use in Ohio include (1) achieving more...
Programme Implementation in Social and Emotional Learning: Basic Issues and Research Findings
ERIC Educational Resources Information Center
Durlak, Joseph A.
2016-01-01
This paper discusses the fundamental importance of achieving quality implementation when assessing the impact of social and emotional learning interventions. Recent findings in implementation science are reviewed that include a definition of implementation, its relation to programme outcomes, current research on the factors that affect…
[Implementation of clinical practice guidelines: how can we close the evidence-practice gap?].
Muche-Borowski, Cathleen; Nothacker, M; Kopp, I
2015-01-01
Guidelines are intended as instruments of knowledge transfer to support decision-making by physicians, other health professionals and patients in clinical practice and thereby contribute to quality improvements in healthcare. To date they are an indispensable tool for healthcare. Their benefit for patients can only be seen in application, i.e. the implementation of guideline recommendations. For successful implementation, implementability and practicability play a crucial role and these characteristics can be influenced and should be promoted by the guideline development group. In addition, a force field analysis to identify barriers against and facilitators for the implementation of specific guideline recommendations from the perspective of physicians and patients is recommended to guide the development of an individual implementation strategy and the selection of appropriate interventions. However, implementation cannot be achieved by the guideline development group alone and a universal implementation strategy does not exist. Therefore, a process using theory, analysis, experience and shared responsibility of stakeholders in healthcare is recommended, with the aim to achieve sustainable behavioral change and improve the quality of care by guideline-oriented behavior.
ISO 50001 and SEP Faster and Cheaper - Exploring the Enterprise-Wide Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Jingjing; Rao, Prakash; Therkelsen, Peter
ISO 50001 and other management systems (e.g., ISO 9001 and ISO 14001) allow for implementation and certification at the enterprise level. The "Central Office" concept, which allows a small group of employees to manage and facilitate the organization's energy management system (EnMS) at the enterprise level, was introduced within the ISO 50003 standard to provide guidance to ISO 50001 certification bodies. Four industrial companies have partnered with the United States Department of Energy to pilot the enterprise-wide ISO 50001/SEP concept under the Better Buildings Superior Energy Performance (SEP) Enterprise-wide Accelerator. Each organization developed a Central Office to host their EnMS while implementing ISO 50001/SEP at multiple physically separated sites. The four corporate partners tailored their Central Office implementation model to meet their own specific circumstances and needs. This paper reviews the commonalities, differences, and benefits of each of these enterprise-wide implementation models, including organizational structures, Central Office staff responsibilities, and key strategies. The cost savings and benefits of using the enterprise-wide approach were assessed, including the cost per site compared with that of a conventional, single-site ISO 50001/SEP implementation approach. This paper also discusses the drivers for the cost reductions realized through these enterprise-wide approaches. The four partner companies worked with 30 total sites. On average, these 30 sites improved energy performance by 5% annually over their SEP achievement periods, saved more than $600,000 annually in energy costs, and reduced implementation cost for ISO 50001 and SEP by $19,000 and 0.8 Full Time Equivalent years (FTE-yr) of staff time per site. The results can inform other organizations seeking to implement enterprise-wide ISO 50001/SEP, as well as energy efficiency organizations seeking to promote wider adoption of ISO 50001 implementation.
Lau, Rosa; Stevenson, Fiona; Ong, Bie Nio; Dziedzic, Krysia; Treweek, Shaun; Eldridge, Sandra; Everitt, Hazel; Kennedy, Anne; Qureshi, Nadeem; Rogers, Anne; Peacock, Richard; Murray, Elizabeth
2015-01-01
Objective To identify, summarise and synthesise available literature on the effectiveness of implementation strategies for optimising implementation of complex interventions in primary care. Design Systematic review of reviews. Data sources MEDLINE, EMBASE, CINAHL, Cochrane Library and PsychINFO were searched, from first publication until December 2013; the bibliographies of relevant articles were screened for additional reports. Eligibility criteria for selecting studies Eligible reviews had to (1) examine effectiveness of single or multifaceted implementation strategies, (2) measure health professional practice or process outcomes and (3) include studies from predominantly primary care in developed countries. Two reviewers independently screened titles/abstracts and full-text articles of potentially eligible reviews for inclusion. Data synthesis Extracted data were synthesised using a narrative approach. Results 91 reviews were included. The most commonly evaluated strategies were those targeted at the level of individual professionals, rather than those targeting organisations or context. These strategies (eg, audit and feedback, educational meetings, educational outreach, reminders) on their own demonstrated a small to modest improvement (2–9%) in professional practice or behaviour with considerable variability in the observed effects. The effects of multifaceted strategies targeted at professionals were mixed and not necessarily more effective than single strategies alone. There was relatively little review evidence on implementation strategies at the levels of organisation and wider context. Evidence on cost-effectiveness was limited and data on costs of different strategies were scarce and/or of low quality. Conclusions There is a substantial literature on implementation strategies aimed at changing professional practices or behaviour. It remains unclear which implementation strategies are more likely to be effective than others and under what conditions. 
Future research should focus on identifying and assessing the effectiveness of strategies targeted at the wider context and organisational levels and examining the costs and cost-effectiveness of implementation strategies. PROSPERO registration number CRD42014009410. PMID:26700290
VMAT optimization with dynamic collimator rotation.
Lyu, Qihui; O'Connor, Daniel; Ruan, Dan; Yu, Victoria; Nguyen, Dan; Sheng, Ke
2018-04-16
Although collimator rotation is an optimization variable that can be exploited for dosimetric advantages, existing Volumetric Modulated Arc Therapy (VMAT) optimization uses a fixed collimator angle in each arc and only rotates the collimator between arcs. In this study, we develop a novel integrated optimization method for VMAT, accounting for dynamic collimator angles during the arc motion. Direct Aperture Optimization (DAO) for Dynamic Collimator in VMAT (DC-VMAT) was achieved by adding to the existing dose fidelity objective an anisotropic total variation term for regulating the fluence smoothness, a binary variable for forming simple apertures, and a group sparsity term for controlling collimator rotation. The optimal collimator angle for each beam angle was selected using Dijkstra's algorithm, where the node costs depend on the estimated fluence map at the current iteration and the edge costs account for the mechanical constraints of the multi-leaf collimator (MLC). An alternating optimization strategy was implemented to solve the DAO and collimator angle selection (CAS). Feasibility of DC-VMAT using one full arc with dynamic collimator rotation was tested on a phantom with two small spherical targets, and on a brain, a lung and a prostate cancer patient. The plan was compared against a static collimator VMAT (SC-VMAT) plan using three full arcs with 60 degrees of collimator angle separation in patient studies. With the same target coverage, DC-VMAT achieved a 20.3% reduction of R50 in the phantom study, and reduced the average max and mean OAR dose by 4.49% and 2.53% of the prescription dose in patient studies, as compared with SC-VMAT. The collimator rotation was coordinated with the gantry rotation in DC-VMAT plans for deliverability. There were 13 beam angles in the single-arc DC-VMAT plan in patient studies that required slower gantry rotation to accommodate multiple collimator angles.
The novel DC-VMAT approach utilizes the dynamic collimator rotation during arc delivery. In doing so, DC-VMAT affords more sophisticated intensity modulation, alleviating the limitation previously imposed by the square beamlet from the MLC leaf thickness and achieves higher effective modulation resolution. Consequently, DC-VMAT with a single arc manages to achieve superior dosimetry than SC-VMAT with three full arcs. © 2018 American Association of Physicists in Medicine.
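The collimator angle selection step can be sketched as a shortest path through a trellis: one layer per beam angle, one node per candidate collimator angle. The node and edge costs below are illustrative placeholders, not the paper's objective terms; the paper uses Dijkstra's algorithm, but on a layered graph of this shape a forward dynamic program recovers the same shortest path.

```python
# Sketch of collimator angle selection as a shortest path through a trellis
# (illustrative costs only): node costs stand in for the fluence-map estimate,
# and edge costs penalize collimator rotation between adjacent beams, with a
# hard cap modelling the mechanical rotation-speed constraint.

def select_collimator_angles(node_cost, angles, max_step, step_penalty=1.0):
    """node_cost[b][k]: cost of angle angles[k] at beam b; returns one angle per beam."""
    n_beams = len(node_cost)
    INF = float("inf")
    best = [node_cost[0][k] for k in range(len(angles))]
    back = [[None] * len(angles) for _ in range(n_beams)]
    for b in range(1, n_beams):
        new = [INF] * len(angles)
        for k, a in enumerate(angles):
            for j, prev_a in enumerate(angles):
                step = abs(a - prev_a)
                if step > max_step:          # mechanically infeasible rotation
                    continue
                c = best[j] + step_penalty * step + node_cost[b][k]
                if c < new[k]:
                    new[k], back[b][k] = c, j
        best = new
    # backtrack the optimal angle sequence
    k = min(range(len(angles)), key=lambda i: best[i])
    path = [k]
    for b in range(n_beams - 1, 0, -1):
        k = back[b][k]
        path.append(k)
    return [angles[i] for i in reversed(path)]

# Each beam prefers a different angle; rotation of 10 degrees per beam is allowed.
path = select_collimator_angles([[0, 5, 5], [5, 0, 5], [5, 5, 0]],
                                [0, 10, 20], max_step=10, step_penalty=0.1)
print(path)  # [0, 10, 20]
```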
An operational model for mainstreaming ecosystem services for implementation
Cowling, Richard M.; Egoh, Benis; Knight, Andrew T.; O'Farrell, Patrick J.; Reyers, Belinda; Rouget, Mathieu; Roux, Dirk J.; Welz, Adam; Wilhelm-Rechman, Angelika
2008-01-01
Research on ecosystem services has grown markedly in recent years. However, few studies are embedded in a social process designed to ensure effective management of ecosystem services. Most research has focused only on biophysical and valuation assessments of putative services. As a mission-oriented discipline, ecosystem service research should be user-inspired and user-useful, which will require that researchers respond to stakeholder needs from the outset and collaborate with them in strategy development and implementation. Here we provide a pragmatic operational model for achieving the safeguarding of ecosystem services. The model comprises three phases: assessment, planning, and management. Outcomes of social, biophysical, and valuation assessments are used to identify opportunities and constraints for implementation. The latter then are transformed into user-friendly products to identify, with stakeholders, strategic objectives for implementation (the planning phase). The management phase undertakes and coordinates actions that achieve the protection of ecosystem services and ensure the flow of these services to beneficiaries. This outcome is achieved via mainstreaming, or incorporating the safeguarding of ecosystem services into the policies and practices of sectors that deal with land- and water-use planning. Management needs to be adaptive and should be institutionalized in a suite of learning organizations that are representative of the sectors that are concerned with decision-making and planning. By following the phases of our operational model, projects for safeguarding ecosystem services are likely to empower stakeholders to implement effective on-the-ground management that will achieve resilience of the corresponding social-ecological systems. PMID:18621695
A Smart Load Interface and Voltage Regulator for Electrostatic Vibration Energy Harvester
NASA Astrophysics Data System (ADS)
Bedier, Mohammed; Basset, Philippe; Galayko, Dimitri
2016-11-01
This paper presents a new implementation in ams 0.35 μm HV technology of a complete energy management system for an electrostatic vibration energy harvester (e-VEH). It is based on the Bennet's doubler architecture and includes a load voltage regulator (LVR) and a smart load interface (LI) that are self-controlled with internal voltages for maximum power point tracking (MPPT). The CMOS implementation makes use of an energy harvester that is capable of producing up to 1.8 μW at harmonic excitation, provided its internal voltage is kept at its optimum. An intermediate LI stage and its controller make use of a high-side switch with a zero-static-power level shifter and a low-power hysteresis comparator. A full circuit-level simulation with a VHDL-AMS model of the presented e-VEH was successfully achieved, indicating that the proposed load interface controller consumes less than 100 nW average power. Moreover, the LVR regulates the buffer and discharges the harvested energy into a generic resistive load, maintaining the voltage at a nominal value of 2 V.
Parallel Implementation of MAFFT on CUDA-Enabled Graphics Hardware.
Zhu, Xiangyuan; Li, Kenli; Salah, Ahmad; Shi, Lin; Li, Keqin
2015-01-01
Multiple sequence alignment (MSA) constitutes an extremely powerful tool for many biological applications including phylogenetic tree estimation, secondary structure prediction, and critical residue identification. However, aligning large biological sequences with popular tools such as MAFFT requires long runtimes on sequential architectures. Due to the ever-increasing sizes of sequence databases, there is increasing demand to accelerate this task. In this paper, we demonstrate how graphics processing units (GPUs), powered by the compute unified device architecture (CUDA), can be used as an efficient computational platform to accelerate the MAFFT algorithm. To fully exploit the GPU's capabilities for accelerating MAFFT, we have optimized the sequence data organization to eliminate the bandwidth bottleneck of memory access, designed a memory allocation and reuse strategy to make full use of the GPU's limited memory, proposed a new modified-run-length encoding (MRLE) scheme to reduce memory consumption, and used high-performance shared memory to speed up I/O operations. Our implementation tested on three NVIDIA GPUs achieves speedups of up to 11.28 on a Tesla K20m GPU compared to the sequential MAFFT 7.015.
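The abstract does not spell out the modified-run-length encoding (MRLE) scheme, so the following is a plain run-length encoder/decoder for illustration only: aligned biological sequences contain long runs of gap characters ('-'), which run-length encoding collapses to (symbol, count) pairs, cutting memory consumption.

```python
# Plain run-length encoding sketch (not the paper's MRLE variant): long runs
# of identical symbols, such as alignment gaps, collapse to (symbol, count).

def rle_encode(seq):
    out = []
    for ch in seq:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)   # extend the current run
        else:
            out.append((ch, 1))              # start a new run
    return out

def rle_decode(pairs):
    return "".join(ch * n for ch, n in pairs)

encoded = rle_encode("AC----GT---A")
print(encoded)  # [('A', 1), ('C', 1), ('-', 4), ('G', 1), ('T', 1), ('-', 3), ('A', 1)]
assert rle_decode(encoded) == "AC----GT---A"  # lossless round trip
```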
Reilly pulls it together with care
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiesche, E.S.
1992-12-09
Reilly Industries (Indianapolis) has changed strategic planning procedures to incorporate Responsible Care into its business plans. Each of the company's business units budgets for Responsible Care and reports quarterly on progress in implementing the codes, says Jacqueline Fernette, corporate communications coordinator and Responsible Care coordinator. The company's goal is to achieve full implementation by the end of 1997. In Reilly's 1993 budget, 20% of capital is directed at Responsible Care, says president Robert McNeeley. "We hold unit managers responsible for planning Responsible Care within their businesses and reporting on them on a quarterly basis," says McNeeley. The firm makes pyridine, coal tar, potash and related chemicals, and specialized esters, and posts annual sales in the $250 million-$300 million range. Reilly has seven plants and 900 employees. Incorporating Responsible Care into the strategic business plan required a fair amount of administrative work to make sure that all business unit managers understood the concepts and were working in comparable terms, says McNeeley. "We needed to bring the managers up to speed in six codes, so there was a training aspect to it."
GPU Lossless Hyperspectral Data Compression System
NASA Technical Reports Server (NTRS)
Aranki, Nazeeh I.; Keymeulen, Didier; Kiely, Aaron B.; Klimesh, Matthew A.
2014-01-01
Hyperspectral imaging systems onboard aircraft or spacecraft can acquire large amounts of data, putting a strain on limited downlink and storage resources. Onboard data compression can mitigate this problem but may require a system capable of a high throughput. In order to achieve a high throughput with a software compressor, a graphics processing unit (GPU) implementation of a compressor was developed targeting the current state-of-the-art GPUs from NVIDIA(R). The implementation is based on the fast lossless (FL) compression algorithm reported in "Fast Lossless Compression of Multispectral-Image Data" (NPO- 42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26, which operates on hyperspectral data and achieves excellent compression performance while having low complexity. The FL compressor uses an adaptive filtering method and achieves state-of-the-art performance in both compression effectiveness and low complexity. The new Consultative Committee for Space Data Systems (CCSDS) Standard for Lossless Multispectral & Hyperspectral image compression (CCSDS 123) is based on the FL compressor. The software makes use of the highly-parallel processing capability of GPUs to achieve a throughput at least six times higher than that of a software implementation running on a single-core CPU. This implementation provides a practical real-time solution for compression of data from airborne hyperspectral instruments.
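The FL compressor's adaptive filtering can be illustrated with a much-simplified predictive coder. The sketch below is not the actual FL/CCSDS-123 algorithm: it predicts each sample from the co-located sample in the previous spectral band using a single weight adapted by a sign-LMS rule (an assumption for illustration), and stores only the residuals. Because the decoder runs the identical predictor, the round trip is lossless.

```python
# Simplified sketch of predictive lossless compression (not CCSDS-123):
# predict each sample from the previous band with an adaptive weight and
# store only the residuals; the mirrored decoder makes the scheme lossless.

def encode(bands, lr=0.01):
    w, residuals = 1.0, []
    for prev, cur in zip(bands, bands[1:]):
        band_res = []
        for p, c in zip(prev, cur):
            pred = round(w * p)
            e = c - pred
            band_res.append(e)
            w += lr * (1 if e > 0 else -1 if e < 0 else 0) * p  # sign-LMS update
        residuals.append(band_res)
    return residuals

def decode(first_band, residuals, lr=0.01):
    bands, w = [list(first_band)], 1.0
    for band_res in residuals:
        prev, cur = bands[-1], []
        for p, e in zip(prev, band_res):
            pred = round(w * p)
            cur.append(pred + e)
            w += lr * (1 if e > 0 else -1 if e < 0 else 0) * p  # same update as encoder
        bands.append(cur)
    return bands

bands = [[10, 12, 11], [11, 13, 12], [12, 14, 13]]  # toy 3-band "cube"
res = encode(bands)
assert decode(bands[0], res) == bands  # lossless round trip
```

As in the FL compressor, compression comes from the residuals being small and cheap to entropy-code; the entropy coding stage is omitted here.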
NASA Technical Reports Server (NTRS)
Ly, U. L.; Ho, J. K.
1986-01-01
A systematic procedure for the synthesis of fault tolerant control laws to actuator failure has been presented. Two design methods were used to synthesize fault tolerant controllers: the conventional LQ design method and a direct feedback controller design method SANDY. The latter method is used primarily to streamline the full-state Q feedback design into a practical implementable output feedback controller structure. To achieve robustness to control actuator failure, the redundant surfaces are properly balanced according to their control effectiveness. A simple gain schedule based on the landing gear up/down logic involving only three gains was developed to handle three design flight conditions: Mach .25 and Mach .60 at 5000 ft and Mach .90 at 20,000 ft. The fault tolerant control law developed in this study provides good stability augmentation and performance for the relaxed static stability aircraft. The augmented aircraft responses are found to be invariant to the presence of a failure. Furthermore, single-loop stability margins of +6 dB in gain and +30 deg in phase were achieved along with -40 dB/decade rolloff at high frequency.
Fan, Wenpei; Shen, Bo; Bu, Wenbo; Zheng, Xiangpeng; He, Qianjun; Cui, Zhaowen; Ni, Dalong; Zhao, Kuaile; Zhang, Shengjian; Shi, Jianlin
2015-11-01
Biophotonic technology that uses light and ionizing radiation for precisely targeted cancer therapy is a holy grail in the field of biomedicine because it can overcome the systemic toxicity and adverse side effects of conventional chemotherapy. However, existing biophotonic techniques fail to achieve satisfactory treatment efficacy, which remains a major challenge for clinical implementation. Herein, we develop a novel theranostic technique of "intranuclear biophotonics" through the smart design of a nuclear-targeting biophotonic system based on photo-/radio-sensitizers covalently co-loaded onto upconversion nanoparticles. These nuclear-targeting biophotonic agents can not only generate a great deal of multiple cytotoxic reactive oxygen species in the nucleus by making full use of NIR/X-ray irradiation, but also produce greatly enhanced intranuclear synergetic radio-/photodynamic therapeutic effects under magnetic/luminescent bimodal imaging guidance, which may achieve optimal efficacy in treating radio-resistant tumors. We anticipate that highly effective intranuclear biophotonics will contribute significantly to the development of biophotonic techniques and open new perspectives for a variety of cancer theranostic applications. Copyright © 2015 Elsevier Ltd. All rights reserved.
Development of a Diabetes Education Program for Youth With Type 2 Diabetes
Grey, Margaret; Schreiner, Barbara; Pyle, Laura
2009-01-01
Purpose: The purpose of this article is to present the Treatment Options for Type 2 Diabetes in Adolescents and Youth (TOD2AY) study and a description of the implementation of the standard diabetes education (SDE) program. Methods: A total of 218 participants (one third of the eventual sample of 750) were initially enrolled in the study. To date, the mean age of participants was 14.3 ± 2.1 years, with 63% being female. Families of study participants were largely low or middle income (more than half report family income <$35,000) and about three-quarters were minority. Results: More than three-quarters (79%) of families achieved full mastery of the entire SDE program. Mastery required on average 5.5 ± 1.3 sessions. In addition, 62% of the families were able to achieve mastery of the session topic in a single visit. Conclusions: In summary, the TOD2AY study SDE program fills the need for effective, engaging materials for youth and their families to use in mastering essential type 2 diabetes skills and knowledge. PMID:19244566
Optical implementation of spin squeezing
NASA Astrophysics Data System (ADS)
Ono, Takafumi; Sabines-Chesterking, Javier; Cable, Hugo; O'Brien, Jeremy L.; Matthews, Jonathan C. F.
2017-05-01
Quantum metrology enables estimation of optical phase shifts with precision beyond the shot-noise limit. One way to exceed this limit is to use squeezed states, where the quantum noise of one observable is reduced at the expense of increased quantum noise for its complementary partner. Because shot-noise limits the phase sensitivity of all classical states, reduced noise in the average value for the observable being measured allows for improved phase sensitivity. However, additional phase sensitivity can be achieved using phase estimation strategies that account for the full distribution of measurement outcomes. Here we experimentally investigate a model of optical spin-squeezing, which uses post-selection and photon subtraction from the state generated using a parametric downconversion photon source, and we investigate the phase sensitivity of this model. The Fisher information for all photon-number outcomes shows it is possible to obtain a quantum advantage of 1.58 compared to the shot-noise value for five-photon events, even though due to experimental imperfection, the average noise for the relevant spin-observable does not achieve sub-shot-noise precision. Our demonstration implies improved performance of spin squeezing for applications to quantum metrology.
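The paper's figure of merit, the classical Fisher information over the full distribution of measurement outcomes, F(phi) = sum_k (dp_k/dphi)^2 / p_k, can be evaluated numerically for any outcome model. The interferometer model below is an idealized single-photon example chosen for illustration (not the paper's five-photon post-selected state), for which F(phi) = 1, the per-photon shot-noise value.

```python
# Numerical classical Fisher information over a full outcome distribution:
# F(phi) = sum_k (dp_k/dphi)^2 / p_k, with the derivative taken by central
# differences. The interferometer model is an illustrative assumption.
import math

def fisher_information(probs, phi, h=1e-6):
    p = probs(phi)
    p_plus, p_minus = probs(phi + h), probs(phi - h)
    total = 0.0
    for k in range(len(p)):
        dp = (p_plus[k] - p_minus[k]) / (2 * h)   # central-difference derivative
        if p[k] > 1e-12:                          # skip zero-probability outcomes
            total += dp * dp / p[k]
    return total

def interferometer(phi):
    # Idealized single-photon interferometer: two outcomes.
    return [math.cos(phi / 2) ** 2, math.sin(phi / 2) ** 2]

F = fisher_information(interferometer, phi=0.7)
print(round(F, 6))  # ~1.0: the per-photon shot-noise value
```

Via the Cramér-Rao bound, the achievable phase variance is bounded by 1/F, which is how a Fisher information above the shot-noise value translates into a quantum advantage.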
Harper, Joann; Hinds, Pamela S; Baker, Justin N; Hicks, Judy; Spunt, Sheri L; Razzouk, Bassem I
2007-01-01
Children living with and dying of advanced-stage cancer suffer physically, emotionally, and spiritually. Relief of their suffering requires comprehensive, compassionate palliative and end-of-life (EoL) care. However, an EoL care program might appear inconsistent with the mission of a pediatric oncology research center committed to seeking cures. Here the authors describe the methods used to achieve full institutional commitment to their EoL care program and those used to build the program's philosophical, research, and educational foundations after they received approval. The authors convened 10 focus groups to solicit staff perceptions of the hospital's current palliative and EoL care. They also completed baseline medical record reviews of 145 patient records to identify key EoL characteristics. The authors then crafted a vision statement and a strategic plan, implemented new research protocols, and established publication and funding trajectories. They conclude that establishing a state-of-the-art palliative and EoL program in a cure-oriented pediatric setting is achievable via consensus building and recruitment of diverse institutional resources.
Development of an Enhanced Payback Function for the Superior Energy Performance Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Therkelsen, Peter; Rao, Prakash; McKane, Aimee
2015-08-03
The U.S. DOE Superior Energy Performance (SEP) program provides recognition to industrial and commercial facilities that achieve certification to the ISO 50001 energy management system standard and third-party verification of energy performance improvements. Over 50 industrial facilities are participating and 28 facilities have been certified in the SEP program. These facilities find value in the robust, data-driven energy performance improvement result that the SEP program delivers. Previous analysis of SEP certified facility data demonstrated the cost effectiveness of SEP and identified internal staff time to be the largest cost component related to SEP implementation and certification. This paper analyzes previously reported and newly collected data on costs and benefits associated with the implementation of ISO 50001 and SEP certification. By disaggregating "sunk energy management system (EnMS) labor costs", this analysis results in a more accurate and detailed understanding of the costs and benefits of SEP participation. SEP is shown to significantly improve and sustain energy performance and energy cost savings, resulting in a highly attractive return on investment. To illustrate these results, a payback function has been developed and is presented. On average, facilities with annual energy spend greater than $2M can expect to implement SEP with a payback of less than 1.5 years. Finally, this paper also observes and details decreasing facility costs associated with implementing ISO 50001 and certifying to the SEP program, as the program has improved from pilot, to demonstration, to full launch.
Experimental implementation of a Pyramid WFS: Towards the first SCAO systems for E-ELT
NASA Astrophysics Data System (ADS)
Bond, C.; El Hadi, K.; Sauvage, J. F.; Correia, C.; Fauvarque, O.; Rabaud, D.; Neichel, B.; Fusco, T.
2015-12-01
Investigations into the Pyramid wavefront sensor (P-WFS) have experimentally demonstrated the ability to achieve a better performance than with a standard Shack-Hartmann sensor (SH-WFS). Implementation on the Large Binocular Telescope (LBT) provided the first operational demonstration on a facility-class instrument of a P-WFS on sky. The desire to implement a Pyramid on an Extremely Large Telescope (ELT) requires further characterisation in order to optimise the performance and match our knowledge and understanding of other wave-front sensors (WFSs). Within the framework of the European Extremely Large Telescope (E-ELT), the Laboratoire d'Astrophysique de Marseille (LAM) is involved in the preparation of the Single Conjugate Adaptive Optics (SCAO) system of HARMONI, E-ELT's 1st light integral field spectrograph (IFU). The current baseline WFS for this adaptive optics system is a Pyramid WFS using a high speed and sensitive OCAM2 camera. At LAM we are currently carrying out laboratory demonstrations of a Pyramid-WFS, with the aim to fully characterise the behaviour of the Pyramid in terms of sensitivity and linear range. This will lead to a full operational procedure for the use of the Pyramid on-sky, assisting with current designs and future implementations. The final goal is to provide an on sky comparison between the Pyramid and Shack-Hartmann at Observatoire de la Côte d'Azur (OCA). Here we present our experimental setup and preliminary results.
Implementation of a cone-beam backprojection algorithm on the cell broadband engine processor
NASA Astrophysics Data System (ADS)
Bockenbach, Olivier; Knaup, Michael; Kachelrieß, Marc
2007-03-01
Tomographic image reconstruction is computationally very demanding. In all cases the backprojection represents the performance bottleneck due to the high operation count and the high demand put on the memory subsystem. In the past, solving this problem has led to the implementation of specific architectures, connecting Application Specific Integrated Circuits (ASICs) or Field Programmable Gate Arrays (FPGAs) to memory through dedicated high-speed buses. More recently, there have also been attempts to use Graphics Processing Units (GPUs) to perform the backprojection step. Originally aimed at the gaming market, the Cell Broadband Engine (CBE) processor, introduced by IBM, Toshiba and Sony, is often considered a multicomputer on a chip. Clocked at 3 GHz, the Cell allows for a theoretical performance of 192 GFlops and a peak data transfer rate over the internal bus of 200 GB/s. This performance indeed makes the Cell a very attractive architecture for implementing tomographic image reconstruction algorithms. In this study, we investigate the relative performance of a perspective backprojection algorithm when implemented on a standard PC and on the Cell processor. We compare these results to the performance achievable with FPGA-based boards and high-end GPUs. The cone-beam backprojection performance was assessed by backprojecting a full circle scan of 512 projections of 1024x1024 pixels into a volume of 512x512x512 voxels. It took 3.2 minutes on the PC (single CPU) and as little as 13.6 seconds on the Cell.
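The shape of the backprojection workload can be seen in a pixel-driven 2D parallel-beam sketch. The paper's perspective cone-beam geometry is more involved, but the inner loop is analogous: for every pixel (or voxel) and every projection angle, compute where the pixel maps onto the detector and accumulate an interpolated detector value. The triple loop and the scattered detector reads are why arithmetic throughput and memory bandwidth dominate.

```python
# Pixel-driven 2D parallel-beam backprojection sketch (a simplified stand-in
# for the paper's perspective cone-beam case): accumulate, for each pixel and
# each angle, the linearly interpolated detector sample the pixel projects to.
import math

def backproject(sinogram, angles, n):
    """sinogram[a][d]: detector sample d at angle a; returns an n x n image."""
    image = [[0.0] * n for _ in range(n)]
    c = (n - 1) / 2.0                      # image centre
    n_det = len(sinogram[0])
    cd = (n_det - 1) / 2.0                 # detector centre
    for a, theta in enumerate(angles):
        ct, st = math.cos(theta), math.sin(theta)
        for y in range(n):
            for x in range(n):
                t = (x - c) * ct + (y - c) * st + cd   # detector coordinate
                i = math.floor(t)
                if 0 <= i < n_det - 1:
                    frac = t - i
                    # linear interpolation between adjacent detector samples
                    image[y][x] += (1 - frac) * sinogram[a][i] + frac * sinogram[a][i + 1]
    return image
```

For the volume in the study (512 angles into 512^3 voxels), this loop nest runs on the order of 10^11 interpolate-and-accumulate steps, which is what the Cell, FPGA and GPU implementations parallelize.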
Implementing the Full-Day Kindergarten.
ERIC Educational Resources Information Center
Fromberg, Doris Pronin
1992-01-01
Considerations ranging from lunch counter heights to bus schedules, parent workshops, and adjustment periods must concern principals implementing full-day kindergartens. Many schools will also face doubled art supply budgets and increased staffing costs for specialized library, physical education, music, and art education services. (four…
MRP (materiel requirements planning) II: successful implementation the hard way.
Grubbs, S C
1994-05-01
Many manufacturing companies embark on MRP II implementation projects as a method for improvement. In spite of an increasing body of knowledge regarding successful implementations, companies continue to attempt new approaches. This article reviews an actual implementation, featuring some of the mistakes made and the efforts required to still achieve "Class A" performance levels.
TAIR- TRANSONIC AIRFOIL ANALYSIS COMPUTER CODE
NASA Technical Reports Server (NTRS)
Dougherty, F. C.
1994-01-01
The Transonic Airfoil analysis computer code, TAIR, was developed to employ a fast, fully implicit algorithm to solve the conservative full-potential equation for the steady transonic flow field about an arbitrary airfoil immersed in a subsonic free stream. The full-potential formulation is considered exact under the assumptions of irrotational, isentropic, and inviscid flow. These assumptions are valid for a wide range of practical transonic flows typical of modern aircraft cruise conditions. The primary features of TAIR include: a new fully implicit iteration scheme which is typically many times faster than classical successive line overrelaxation algorithms; a new, reliable artificial density spatial differencing scheme treating the conservative form of the full-potential equation; and a numerical mapping procedure capable of generating curvilinear, body-fitted finite-difference grids about arbitrary airfoil geometries. Three aspects emphasized during the development of the TAIR code were reliability, simplicity, and speed. The reliability of TAIR comes from two sources: the new algorithm employed and the implementation of effective convergence monitoring logic. TAIR achieves ease of use by employing a "default mode" that greatly simplifies code operation, especially for inexperienced users, and many useful options including: several airfoil-geometry input options, flexible user controls over program output, and a multiple solution capability. The speed of the TAIR code is attributed to the new algorithm and the manner in which it has been implemented. Input to the TAIR program consists of airfoil coordinates, aerodynamic and flow-field convergence parameters, and geometric and grid convergence parameters. The airfoil coordinates for many airfoil shapes can be generated in TAIR from just a few input parameters.
Most of the other input parameters have default values which allow the user to run an analysis in the default mode by specifying only a few input parameters. Output from TAIR may include aerodynamic coefficients, the airfoil surface solution, convergence histories, and printer plots of Mach number and density contour maps. The TAIR program is written in FORTRAN IV for batch execution and has been implemented on a CDC 7600 computer with a central memory requirement of approximately 155K (octal) of 60 bit words. The TAIR program was developed in 1981.
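As a reminder of the model equation (written here in its standard nondimensional form, not transcribed from the code itself), the conservative full-potential equation solved by TAIR couples the velocity potential to an isentropic density:

```latex
\nabla \cdot \left( \rho \, \nabla \phi \right) = 0,
\qquad
\rho = \left[ 1 - \frac{\gamma - 1}{2}\, M_\infty^{2}\left( \lvert \nabla \phi \rvert^{2} - 1 \right) \right]^{1/(\gamma - 1)},
```

where \(\phi\) is the velocity potential, \(\rho\) the density, \(M_\infty\) the free-stream Mach number, and velocities are normalized by the free-stream speed. The "artificial density" scheme mentioned above stabilizes this equation in supersonic zones by upwinding \(\rho\).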
Pérez-Gómez, Augusto; Mejía-Trujillo, Juliana; Brown, Eric C.; Eisenberg, Nicole
2016-01-01
During the last 2 years, the Colombian government and the Nuevos Rumbos Corporation have been implementing an adapted version of the Communities That Care (CTC) prevention system, called Comunidades Que se Cuidan (CQC) in Spanish, for use in Colombia. This brief report presents the process of implementing CQC and identifies some of the main challenges and achievements of implementing the system in eight communities in Colombia. Preliminary results of a pilot study of CQC implementation in Colombia show that prevention system development, including a focus on measuring community risk and protection, can be established successfully in Latin American communities despite a lack of rigorously tested prevention programs and strategies. Moreover, mobilizing community coalitions toward science-based prevention, with a focus on examining local risk and protective factor data, can spur development and evaluation of prevention efforts in Latin America. PMID:28154437
Machine-checked proofs of the design and implementation of a fault-tolerant circuit
NASA Technical Reports Server (NTRS)
Bevier, William R.; Young, William D.
1990-01-01
A formally verified implementation of the 'oral messages' algorithm of Pease, Shostak, and Lamport is described. An abstract implementation of the algorithm is verified to achieve interactive consistency in the presence of faults. This abstract characterization is then mapped down to a hardware level implementation which inherits the fault-tolerant characteristics of the abstract version. All steps in the proof were checked with the Boyer-Moore theorem prover. A significant result is the demonstration of a fault-tolerant device that is formally specified and whose implementation is proved correct with respect to this specification. A significant simplifying assumption is that the redundant processors behave synchronously. A mechanically checked proof that the oral messages algorithm is 'optimal' in the sense that no algorithm which achieves agreement via similar message passing can tolerate a larger proportion of faulty processors is also described.
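The oral messages algorithm OM(m) admits a compact recursive statement. The sketch below is an illustrative Python model, not the verified Boyer-Moore development: the participant names, the bogus values assigned to faulty senders, and the majority rule are assumptions of this sketch.

```python
from collections import Counter

def om(commander, lieutenants, value, m, faulty):
    """Sketch of OM(m) (Lamport, Shostak, Pease).

    Returns {lieutenant: decided value}. A faulty sender is modeled
    (arbitrarily) as sending alternating bogus values '0'/'1'."""
    # Step 1: the commander sends a value to every lieutenant.
    received = {}
    for i, lt in enumerate(lieutenants):
        received[lt] = value if commander not in faulty else str(i % 2)
    if m == 0:
        return received
    # Step 2: each lieutenant relays its value to the others via OM(m-1).
    # Step 3: each lieutenant takes a majority over all values it holds.
    decided = {}
    for lt in lieutenants:
        votes = [received[lt]]
        for relay in lieutenants:
            if relay == lt:
                continue
            sub = om(relay, [g for g in lieutenants if g != relay],
                     received[relay], m - 1, faulty)
            votes.append(sub[lt])
        decided[lt] = Counter(votes).most_common(1)[0][0]
    return decided
```

With m = 1 and four participants, a single traitor, whether lieutenant or commander, cannot prevent the loyal participants from agreeing, matching the n > 3m bound the abstract's optimality proof refers to.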
Safety margins in the implementation of planetary quarantine requirements
NASA Technical Reports Server (NTRS)
Schalkowsky, S.; Jacoby, I.
1972-01-01
The formulation of planetary quarantine requirements, and their implementation as determined by a risk allocation model, is discussed. The model defines control safety margins, with particular emphasis on their utility in achieving the desired minimization of excessive margins and on their effect on implementation procedures.
Power Quality Improvement Using an Enhanced Network-Side-Shunt-Connected Dynamic Voltage Restorer
NASA Astrophysics Data System (ADS)
Fereidouni, Alireza; Masoum, Mohammad A. S.; Moghbel, Moayed
2015-10-01
Among the four basic dynamic voltage restorer (DVR) topologies, the network-side shunt-connected DVR (NSSC-DVR) has a relatively poor performance and is investigated in this paper. A new configuration is proposed and implemented for NSSC-DVR to enhance its performance in compensating (un)symmetrical deep and long voltage sags and mitigating voltage harmonics. The enhanced NSSC-DVR model includes a three-phase half-bridge semi-controlled network-side-shunt-connected rectifier and a three-phase full-bridge series-connected inverter implemented with a back-to-back configuration through a bidirectional buck-boost converter. The network-side-shunt-connected rectifier is employed to inject/draw the required energy by NSSC-DVR to restore the load voltage to its pre-fault value under sag/swell conditions. The buck-boost converter is responsible for maintaining the DC-link voltage of the series-connected inverter at its designated value in order to improve the NSSC-DVR capability in compensating deep and long voltage sags/swells. The full-bridge series-connected inverter makes it possible to compensate unbalanced voltage sags containing a zero-sequence component. The harmonic compensation of the load voltage is achieved by extracting harmonics from the distorted network voltage using an artificial neural network (ANN) method called the adaptive linear neuron (Adaline) strategy. Detailed simulations are performed in SIMULINK/MATLAB for six case studies to verify the robustness of the proposed NSSC-DVR model under various conditions.
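The Adaline strategy named above is, at its core, an LMS-trained linear combiner whose regressors are cosine/sine pairs at the tracked harmonic frequencies; subtracting the estimated harmonics from the distorted voltage yields the compensation reference. The sketch below is a minimal illustration with assumed values for the fundamental frequency, tracked orders, and step size, not the paper's tuning:

```python
import math

def adaline_harmonics(samples, dt, f0=50.0, orders=(1, 5, 7), mu=0.05):
    """LMS-trained Adaline: one (cos, sin) weight pair per harmonic order.
    Returns the estimated amplitude of each tracked harmonic."""
    w = {k: [0.0, 0.0] for k in orders}
    for n, y in enumerate(samples):
        t = n * dt
        x = {k: (math.cos(2 * math.pi * k * f0 * t),
                 math.sin(2 * math.pi * k * f0 * t)) for k in orders}
        y_hat = sum(w[k][0] * x[k][0] + w[k][1] * x[k][1] for k in orders)
        err = y - y_hat                  # instantaneous estimation error
        for k in orders:                 # LMS weight update
            w[k][0] += mu * err * x[k][0]
            w[k][1] += mu * err * x[k][1]
    return {k: math.hypot(w[k][0], w[k][1]) for k in orders}
```

On a 50 Hz waveform containing a 20% fifth harmonic, the weights converge within a few cycles to the correct per-harmonic amplitudes, which is what makes the scheme usable for online harmonic extraction.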
Implementing a Structured Reporting Initiative Using a Collaborative Multistep Approach.
Goldberg-Stein, Shlomit; Walter, William R; Amis, E Stephen; Scheinfeld, Meir H
To describe the successful implementation of a structured reporting initiative in a large urban academic radiology department. We describe our process, compromises, and top 10 lessons learned in overhauling traditional reporting practices and comprehensively implementing structured reporting at our institution. To achieve our goals, we took deliberate steps toward consensus building, undertook multistep template refinement, and achieved close collaboration with the technical staff, department coders, and hospital information technologists. Following institutional review board exemption, we audited radiologist compliance by evaluating 100 consecutive cases of 12 common examination types. Fisher exact test was applied to determine significance of association between trainee initial report drafting and template compliance. We produced and implemented structured reporting templates for 95% of all departmental computed tomography, magnetic resonance, and ultrasound examinations. Structured templates include specialized reports adhering to the American College of Radiology's Reporting and Data Systems (ACR's RADS) recommendations (eg, Lung-RADS and Li-RADS). We attained 94% radiologist compliance within 2 years, without any financial incentives. We provide a blueprint of how to successfully achieve structured reporting using a collaborative multistep approach. Copyright © 2017 Elsevier Inc. All rights reserved.
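The Fisher exact test used in the compliance audit can be computed directly from the hypergeometric distribution. A minimal sketch follows; the 2×2 table used in the check is an illustrative example, not the study's data:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the probabilities of every table with the same margins that is
    no more likely than the observed one."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2

    def p(x):  # hypergeometric probability of cell (1,1) equal to x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs + 1e-12)
```

For the classic 4-vs-4 table [[3, 1], [1, 3]] this yields 34/70 ≈ 0.486, matching the textbook value.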
ERIC Educational Resources Information Center
Hornstra, Lisette; van der Veen, Ineke; Peetsma, Thea
2017-01-01
This study focused on effects of high-ability programs on students' achievement emotions, i.e. emotions that students experience that are associated with achievement activities. Participants were students in grade 4-6 of primary education: 218 students attended full-time high-ability programs, 245 attended part-time high-ability programs (i.e.…
Space systems engineering and risk management - joined at the hip
NASA Technical Reports Server (NTRS)
Rose, James R.
2004-01-01
This paper explores the separate skills and capabilities practiced until now, and the powerful coupling to be achieved, practically and effectively, in implementing a space mission, from inception (pre-phase A) to the end of Operations (phase E). The use of risk assessment techniques in balancing cost risk against performance risk, and the application of the systems engineering team in these trades, is the key to achieving this new implementation paradigm.
Dynamic full-field infrared imaging with multiple synchrotron beams
Stavitski, Eli; Smith, Randy J.; Bourassa, Megan W.; Acerbo, Alvin S.; Carr, G. L.; Miller, Lisa M.
2013-01-01
Microspectroscopic imaging in the infrared (IR) spectral region allows for the examination of spatially resolved chemical composition on the microscale. More than a decade ago, it was demonstrated that diffraction limited spatial resolution can be achieved when an apertured, single pixel IR microscope is coupled to the high brightness of a synchrotron light source. Nowadays, many IR microscopes are equipped with multi-pixel Focal Plane Array (FPA) detectors, which dramatically improve data acquisition times for imaging large areas. Recently, progress has been made toward efficiently coupling synchrotron IR beamlines to multi-pixel detectors, but existing implementations utilize expensive and highly customized optical schemes. Here we demonstrate the development and application of a simple optical configuration that can be implemented on most existing synchrotron IR beamlines in order to achieve full-field IR imaging with diffraction-limited spatial resolution. Specifically, the synchrotron radiation fan is extracted from the bending magnet and split into four beams that are combined on the sample, allowing it to fill a large section of the FPA. With this optical configuration, we are able to oversample an image by more than a factor of two, even at the shortest wavelengths, making image restoration through deconvolution algorithms possible. High chemical sensitivity, rapid acquisition times, and superior signal-to-noise characteristics of the instrument are demonstrated. The unique characteristics of this setup enabled the real time study of heterogeneous chemical dynamics with diffraction-limited spatial resolution for the first time. PMID:23458231
Virtual temporal bone dissection system: OSU virtual temporal bone system: development and testing.
Wiet, Gregory J; Stredney, Don; Kerwin, Thomas; Hittle, Bradley; Fernandez, Soledad A; Abdel-Rasoul, Mahmoud; Welling, D Bradley
2012-03-01
The objective of this project was to develop a virtual temporal bone dissection system that would provide an enhanced educational experience for the training of otologic surgeons. A randomized, controlled, multi-institutional, single-blinded validation study. The project encompassed four areas of emphasis: structural data acquisition, integration of the system, dissemination of the system, and validation. Structural acquisition was performed on multiple imaging platforms. Integration achieved a cost-effective system. Dissemination was achieved on different levels including casual interest, downloading of software, and full involvement in development and validation studies. A validation study was performed at eight different training institutions across the country using a two-arm randomized trial where study subjects were randomized to a 2-week practice session using either the virtual temporal bone or standard cadaveric temporal bones. Eighty subjects were enrolled and randomized to one of the two treatment arms; 65 completed the study. There was no difference between the two groups using a blinded rating tool to assess performance after training. A virtual temporal bone dissection system has been developed and compared to cadaveric temporal bones for practice using a multicenter trial. There was no statistical difference between practice on the current simulator compared to practice on human cadaveric temporal bones. Further refinements in structural acquisition and interface design have been identified, which can be implemented prior to full incorporation into training programs and used for objective skills assessment. Copyright © 2012 The American Laryngological, Rhinological, and Otological Society, Inc.
Overview of decade-long development of plasma-facing components at ASIPP
NASA Astrophysics Data System (ADS)
Luo, G.-N.; Liu, G. H.; Li, Q.; Qin, S. G.; Wang, W. J.; Shi, Y. L.; Xie, C. Y.; Chen, Z. M.; Missirlian, M.; Guilhem, D.; Richou, M.; Hirai, T.; Escourbiac, F.; Yao, D. M.; Chen, J. L.; Wang, T. J.; Bucalossi, J.; Merola, M.; Li, J. G.; EAST Team
2017-06-01
The first EAST (Experimental Advanced Superconducting Tokamak) plasma ignited in 2006 with non-actively cooled steel plates as the plasma-facing materials and components (PFMCs) which were then upgraded into full graphite tiles bolted onto water-cooled copper heat sinks in 2008. The first wall was changed further into molybdenum alloy in 2012, while keeping the graphite for both the upper and lower divertors. With the rapid increase in heating and current driving power in EAST, the W/Cu divertor project was launched around the end of 2012, aiming at achieving actively cooled full W/Cu-PFCs for the upper divertor, with heat removal capability up to 10 MW m-2. The W/Cu upper divertor was finished in the spring of 2014, consisting of 80 cassette bodies toroidally assembled. Commissioning of the EAST upper W/Cu divertor in 2014 was unsatisfactory and then several practical measures were implemented to improve the design, welding quality and reliability, which helped us achieve successful commissioning in the 2015 Spring Campaign. In collaboration with the IO and CEA teams, we have demonstrated our technological capability to remove heat loads of 5000 cycles at 10 MW m-2 and 1000 cycles at 20 MW m-2 for the small scale monoblock mockups, and surprisingly over 300 cycles at 20 MW m-2 for the flat-tile ones. The experience and lessons we learned from batch production and commissioning are undoubtedly valuable for ITER (International Thermonuclear Experimental Reactor) engineering validation and tungsten-related plasma physics.
Role of public-private partnership in micronutrient food fortification.
Mannar, M G Venkatesh; van Ameringen, Marc
2003-12-01
Iron, iodine, and vitamin A deficiencies prevent 30% of the world's population from reaching full physical and mental potential. Fortification of commonly eaten foods with micronutrients offers a cost-effective solution that can reach large populations. Effective and sustainable fortification will be possible only if the public sector (which has the mandate and responsibility to improve the health of the population), the private sector (which has experience and expertise in food production and marketing), and the social sector (which has grass-roots contact with the consumer) collaborate to develop, produce, and promote micronutrient-fortified foods. Food fortification efforts must be integrated within the context of a country's public health and nutrition situation as part of an overall micronutrient strategy that utilizes other interventions as well. Identifying a set of priority actions and initiating a continuous dialogue between the various sectors to catalyze the implementation of schemes that will permanently eliminate micronutrient malnutrition are urgently needed. The partners of such a national alliance must collaborate closely on specific issues relating to the production, promotion, distribution, and consumption of fortified foods. Such collaboration could benefit all sectors: National governments could reap national health, economic, and political benefits; food companies could gain a competitive advantage in an expanding consumer marketplace; the scientific, development, and donor communities could make an impact by achieving global goals for eliminating micronutrient malnutrition; and by demanding fortified foods, consumers empower themselves to achieve their full social and economic potential.
ERIC Educational Resources Information Center
Cheek, Annesa LeShawn
2011-01-01
Achieving the Dream is a national initiative focused on helping more community college students succeed, particularly students of color and low-income students. Achieving the Dream's student-centered model of institutional improvement focuses on eliminating gaps and raising student achievement by helping institutions build a culture of evidence…
ERIC Educational Resources Information Center
Maryland State Dept. of Education, Baltimore.
To raise the achievement of every student in the state, Maryland implemented "Achievement Matters Most," a new plan for public elementary and secondary schools that sets goals in the areas of achievement, teaching, testing, safety, and family involvement in schools. This Arabic-language guide for parents outlines the goals and…
ERIC Educational Resources Information Center
Maryland State Dept. of Education, Baltimore.
To raise the achievement of every student in the state, Maryland implemented "Achievement Matters Most," a new plan for public elementary and secondary schools that sets goals in the areas of achievement, teaching, testing, safety, and family involvement in schools. This Gujarati-language guide for parents outlines the goals and…
ERIC Educational Resources Information Center
Maryland State Dept. of Education, Baltimore.
To raise the achievement of every student in the state, Maryland implemented "Achievement Matters Most," a new plan for public elementary and secondary schools that sets goals in the areas of achievement, teaching, testing, safety, and family involvement in schools. This Chinese-language guide for parents outlines the goals and…
ERIC Educational Resources Information Center
Maryland State Dept. of Education, Baltimore.
To raise the achievement of every student in the state, Maryland implemented "Achievement Matters Most," a new plan for public elementary and secondary schools that sets goals in the areas of achievement, teaching, testing, safety, and family involvement in schools. This Urdu-language guide for parents outlines the goals and…
ERIC Educational Resources Information Center
Maryland State Dept. of Education, Baltimore.
To raise the achievement of every student in the state, Maryland implemented "Achievement Matters Most," a new plan for public elementary and secondary schools that sets goals in the areas of achievement, teaching, testing, safety, and family involvement in schools. This Korean-language guide for parents outlines the goals and…
A Parent's Guide to Achievement Matters Most: Maryland's Plan for PreK-12 Education, 2002-2003.
ERIC Educational Resources Information Center
Maryland State Dept. of Education, Baltimore.
To raise the achievement of every student in the state, Maryland implemented "Achievement Matters Most," a new plan for public elementary and secondary schools that sets goals in the areas of achievement, teaching, testing, safety, and family involvement in schools. This guide for parents outlines the goals and characteristics of the…
ERIC Educational Resources Information Center
Maryland State Dept. of Education, Baltimore.
To raise the achievement of every student in the state, Maryland implemented "Achievement Matters Most," a new plan for public elementary and secondary schools that sets goals in the areas of achievement, teaching, testing, safety, and family involvement in schools. This Russian-language guide for parents outlines the goals and…
ERIC Educational Resources Information Center
Scoggins, Donna K.
2009-01-01
Single-sex education is an instructional innovation implemented to improve student academic achievement by teaching to the learning styles and interests of boys and/or girls. This ex post facto quantitative study examined the differences in academic achievement between single-sex education and coeducation classes on students' achievement in…
Technology Implementation in Education--Identifying Barriers to Fidelity
ERIC Educational Resources Information Center
Dennis, William J.; Johnson, Daniel L.; Monroe, Arla K.
2012-01-01
This report describes a problem-based learning project focused on determining the barriers to the implementation of technological innovations. Research findings offered evidence that properly executed technology implementation is an instructional variable related to student achievement; yet, school district leaders are faced with the problem of…
Digital signal processing techniques for coherent optical communication
NASA Astrophysics Data System (ADS)
Goldfarb, Gilad
Coherent detection with subsequent digital signal processing (DSP) is developed, analyzed theoretically and numerically and experimentally demonstrated in various fiber-optic transmission scenarios. The use of DSP in conjunction with coherent detection unleashes the benefits of coherent detection which rely on the preservation of full information of the incoming field. These benefits include high receiver sensitivity, the ability to achieve high spectral-efficiency and the use of advanced modulation formats. With the immense advancements in DSP speeds, many of the problems hindering the use of coherent detection in optical transmission systems have been eliminated. Most notably, DSP alleviates the need for hardware phase-locking and polarization tracking, which can now be achieved in the digital domain. The complexity previously associated with coherent detection is hence significantly diminished and coherent detection is once again considered a feasible detection alternative. In this thesis, several aspects of coherent detection (with or without subsequent DSP) are addressed. Coherent detection is presented as a means to extend the dispersion limit of a duobinary signal using an analog decision-directed phase-lock loop. Analytical bit-error ratio estimation for quadrature phase-shift keying signals is derived. To validate the promise for high spectral efficiency, the orthogonal-wavelength-division multiplexing scheme is suggested. In this scheme the WDM channels are spaced at the symbol rate, thus achieving the spectral efficiency limit. Theory, simulation and experimental results demonstrate the feasibility of this approach. Infinite impulse response filtering is shown to be an efficient alternative to finite impulse response filtering for chromatic dispersion compensation. Theory, design considerations, simulation and experimental results relating to this topic are presented.
Interaction between fiber dispersion and nonlinearity remains the last major challenge deterministic effects pose for long-haul optical data transmission. Experimental results which demonstrate the possibility to digitally mitigate both dispersion and nonlinearity are presented. Impairment compensation is achieved using backward propagation by implementing the split-step method. Efficient realizations of the dispersion compensation operator used in this implementation are considered. Infinite-impulse response and wavelet-based filtering are both investigated as a means to reduce the required computational load associated with signal backward-propagation. Possible future research directions conclude this dissertation.
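The split-step method mentioned above alternates a linear (dispersion) step applied in the frequency domain with a nonlinear (self-phase-modulation) step applied in the time domain. The sketch below is a minimal first-order illustration using a naive O(N²) DFT for self-containment (a real receiver would use an FFT); the sign conventions and parameter values are illustrative, not taken from the experiments:

```python
import cmath
import math

def dft(x, sign):
    """Naive DFT (O(N^2)); a practical implementation would use an FFT."""
    N = len(x)
    return [sum(x[n] * cmath.exp(sign * 2j * math.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def split_step(field, beta2, gamma, dz, steps, dt):
    """First-order split-step propagation of the scalar NLSE:
    dispersion applied as a spectral phase, then self-phase modulation
    applied pointwise in time."""
    N = len(field)
    # Angular frequencies on the DFT grid (negative half wrapped).
    w = [2 * math.pi * (k if k < N // 2 else k - N) / (N * dt)
         for k in range(N)]
    a = list(field)
    for _ in range(steps):
        spec = dft(a, -1)
        spec = [spec[k] * cmath.exp(0.5j * beta2 * w[k] ** 2 * dz)
                for k in range(N)]
        a = [v / N for v in dft(spec, +1)]          # inverse DFT
        a = [v * cmath.exp(1j * gamma * abs(v) ** 2 * dz) for v in a]
    return a
```

Backward propagation for impairment compensation amounts to re-running the same scheme on the received field with the signs of beta2 and gamma flipped; the purely linear round trip is exact, while the nonlinear round trip incurs only the first-order splitting error.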
43 CFR 418.33 - Purpose of the implementation strategy.
Code of Federal Regulations, 2010 CFR
2010-10-01
... implement through their discretionary actions, operating strategies which achieve the principles of this... project operations, economics, and environmental effects. (c) The efficiency target will be used as a...
NASA Astrophysics Data System (ADS)
Xie, Chang; Wen, Jing; Liu, Wenying; Wang, Jiaming
With the development of intelligent dispatching, the intelligence level of the network control center's full range of services urgently needs to be raised. As an important part of the network control center's daily work, intelligent arrangement of maintenance scheduling is essential to achieving high-quality, safe operation of the power grid. After analyzing the shortcomings of traditional maintenance scheduling software, this paper designs a power grid maintenance scheduling intelligent arrangement supporting system based on power flow forecasting, which applies advanced technologies to maintenance scheduling, such as artificial intelligence, online security checking, and intelligent visualization techniques. It implements online security checking of maintenance schedules based on power flow forecasting, and power flow adjustment based on visualization, in order to make maintenance scheduling arrangement more intelligent and visual.
Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian
2018-01-01
Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.
Stochastic first passage time accelerated with CUDA
NASA Astrophysics Data System (ADS)
Pierro, Vincenzo; Troiano, Luigi; Mejuto, Elena; Filatrella, Giovanni
2018-05-01
The first passage time, i.e. the time for a stochastic trajectory to cross a threshold, is an interesting physical quantity, for instance in Josephson junctions and atomic force microscopy, where the full trajectory is not accessible, and it can be estimated by numerical integration of stochastic trajectories. We propose an algorithm suitable for efficient implementation on graphical processing units in the CUDA environment. The proposed approach, for well balanced loads, achieves almost perfect scaling with the number of available threads and processors, and allows an acceleration of about 400× with a GTX980 GPU with respect to a standard multicore CPU. This method allows off-the-shelf GPUs to tackle problems that are otherwise prohibitive, such as thermal activation in slowly tilted potentials. In particular, we demonstrate that it is possible to simulate the switching current distributions of Josephson junctions on the timescale of actual experiments.
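A serial CPU reference for the estimator that the GPU accelerates: each trajectory integrates the Langevin dynamics with the Euler-Maruyama scheme until the threshold is crossed. Trajectories are independent, which is exactly what makes the one-thread-per-trajectory CUDA mapping effective. The drift, noise strength, and threshold below are illustrative values, not the paper's:

```python
import math
import random

def first_passage_times(n_traj, threshold=1.0, drift=1.0, sigma=0.1,
                        dt=1e-3, max_steps=100_000, seed=1):
    """Euler-Maruyama estimate of first-passage times for
    dx = drift*dt + sigma*dW, x(0) = 0, absorbing barrier at `threshold`.
    Returns one crossing time per trajectory."""
    rng = random.Random(seed)
    times = []
    for _ in range(n_traj):
        x, step = 0.0, 0
        while x < threshold and step < max_steps:
            # One Euler-Maruyama step: deterministic drift + Wiener increment.
            x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            step += 1
        times.append(step * dt)
    return times
```

For drift-dominated escape the mean crossing time approaches threshold/drift (Wald's identity), which gives a quick sanity check on the integrator before porting the inner loop to a GPU kernel.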
NASA Technical Reports Server (NTRS)
Asenov, Asen; Slavcheva, G.; Brown, A. R.; Davies, J. H.; Saini, Subhash
1999-01-01
A detailed study of the influence of quantum effects in the inversion layer on the random dopant induced threshold voltage fluctuations and lowering in sub 0.1 micron MOSFETs has been performed. This has been achieved using a full 3D implementation of the density gradient (DG) formalism incorporated in our previously published 3D 'atomistic' simulation approach. This results in a consistent, fully 3D, quantum mechanical picture which implies not only the vertical inversion layer quantisation but also the lateral confinement effects manifested by current filamentation in the 'valleys' of the random potential fluctuations. We have shown that the net result of including quantum mechanical effects, while considering statistical fluctuations, is an increase in both threshold voltage fluctuations and lowering.
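For reference, the density-gradient correction (written here in Ancona's standard form; the paper's notation and sign conventions may differ) augments the classical potential with a quantum term built from the curvature of the square-root carrier density:

```latex
\varphi_{\mathrm{eff}} = \varphi + \frac{2\, b_n\, \nabla^{2}\sqrt{n}}{\sqrt{n}},
\qquad
b_n = \frac{\hbar^{2}}{12\, q\, m_n^{*}},
```

where \(n\) is the electron density, \(q\) the elementary charge, and \(m_n^{*}\) the effective mass. It is this smoothing term that captures both the vertical inversion-layer quantization and the lateral confinement in the random-potential valleys described above.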
Neutron flux and power in RTP core-15
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabir, Mohamad Hairie, E-mail: m-hairie@nuclearmalaysia.gov.my; Zin, Muhammad Rawi Md; Usang, Mark Dennis
PUSPATI TRIGA Reactor achieved initial criticality on June 28, 1982. The reactor is designed to effectively support the various fields of basic nuclear research, manpower training, and production of radioisotopes. This paper describes the reactor parameters calculation for the PUSPATI TRIGA REACTOR (RTP), focusing on the application of the developed 3D reactor model for criticality calculation and analysis of the power and neutron flux distribution of the TRIGA core. The 3D continuous energy Monte Carlo code MCNP was used to develop a versatile and accurate full model of the TRIGA reactor. The model represents in detail all important components of the core with literally no physical approximation. The consistency and accuracy of the developed RTP MCNP model was established by comparing calculations to the available experimental results and TRIGLAV code calculations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taslakian, Bedros, E-mail: btaslakian@gmail.com; Sebaaly, Mikhael Georges, E-mail: ms246@aub.edu.lb; Al-Kutoubi, Aghiad, E-mail: mk00@aub.edu.lb
2016-03-15
Performing an interventional procedure imposes a commitment on interventional radiologists to conduct the initial patient assessment, determine the best course of therapy, and provide long-term care after the procedure is completed. After patient referral, contact with the referring physician and a multidisciplinary team approach is vital. In addition, clinical history, physical examination, as well as full understanding of the pre-procedural laboratory results and imaging findings can guide the interventional radiologist to implement the most appropriate management plan, avoid unnecessary procedures, and prevent complications to achieve a successful outcome. We provide a comprehensive, methodical review of pre-procedural care and management in patients undergoing vascular and interventional radiology procedures.
[Regulation of food advertising on television for the prevention of childhood obesity].
Hidalgo, Catalina González; Samur, Eduardo Atalah
2011-09-01
Obesity is a serious global epidemic and the prevention strategies implemented have been insufficient. Numerous environmental factors have been associated with risk of obesity and their full consideration in prevention policies is important. The connection between food advertising on television and childhood obesity has been demonstrated. The large number of advertisements for unhealthy foods targeted at children through television and its possible impact on health has led some countries to legislate on this matter. However, a conceptual framework of reference enabling legislation must be internationally defined in order to achieve a real impact in preventing childhood obesity. This paper reviews scientific evidence on the relationship between food advertising and childhood obesity as a basis for developing public policies to regulate food marketing on television.
High Efficiency Low Cost CO2 Compression Using Supersonic Shock Wave Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, J; Aarnio, M; Grosvenor, A
2010-12-31
Development and testing results from a supersonic compressor are presented. The compressor achieved record pressure ratio for a fully-supersonic stage and successfully demonstrated the technology potential. Several tasks were performed in compliance with the DOE award objectives. A high-pressure ratio compressor was retrofitted to improve rotordynamics behavior and successfully tested. An outside review panel confirmed test results and design approach. A computational fluid dynamics code used to analyze the Ramgen supersonic flowpath was extensively and successfully modified to improve use on high-performance computing platforms. A comprehensive R&D implementation plan was developed and used to lay the groundwork for a future full-scale compressor demonstration. Conceptual design for a CO2 demonstration compressor was developed and reviewed.
Plasma Physics Calculations on a Parallel Macintosh Cluster
NASA Astrophysics Data System (ADS)
Decyk, Viktor; Dauger, Dean; Kokelaar, Pieter
2000-03-01
We have constructed a parallel cluster consisting of 16 Apple Macintosh G3 computers running the MacOS, and achieved very good performance on numerically intensive, parallel plasma particle-in-cell simulations. A subset of the MPI message-passing library was implemented in Fortran77 and C. This library enabled us to port code, without modification, from other parallel processors to the Macintosh cluster. For large problems where message packets are large and relatively few in number, performance of 50-150 MFlops/node is possible, depending on the problem. This is fast enough that 3D calculations can be routinely done. Unlike Unix-based clusters, no special expertise in operating systems is required to build and run the cluster. Full details are available on our web site: http://exodus.physics.ucla.edu/appleseed/.
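The authors' MPI subset was written in Fortran77 and C. As a language-independent illustration of the point-to-point message-passing pattern such a subset provides, the sketch below emulates per-rank mailboxes with Python threads and queues and uses them for a ring-based global sum; it is a toy analogy, not the authors' library:

```python
import queue
import threading

class ToyComm:
    """Minimal MPI-like point-to-point layer: one FIFO mailbox per rank."""
    def __init__(self, size):
        self.size = size
        self._boxes = [queue.Queue() for _ in range(size)]

    def send(self, dest, data):   # buffered, non-blocking send
        self._boxes[dest].put(data)

    def recv(self, rank):         # blocking receive into `rank`'s mailbox
        return self._boxes[rank].get()

def ring_allsum(values):
    """Each rank forwards what it last received to its ring successor;
    after size-1 hops every rank holds the global sum."""
    size = len(values)
    comm = ToyComm(size)
    results = [None] * size

    def work(rank):
        total, v = values[rank], values[rank]
        for _ in range(size - 1):
            comm.send((rank + 1) % size, v)
            v = comm.recv(rank)
            total += v
        results[rank] = total

    threads = [threading.Thread(target=work, args=(r,)) for r in range(size)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

The buffered send is what prevents deadlock in the ring, mirroring the role of buffered point-to-point messages in the particle-in-cell code's large, infrequent exchanges.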
What Is the Impact of Full Access to Technology on the Achievement of the Hispanic Student?
ERIC Educational Resources Information Center
Carr, John E., Jr.
2013-01-01
The problem studied in this research was whether the impact of full access to technology both at home and in school would affect the achievement of Hispanic students. The purpose of this study was to examine the relationship between the access to technology and the achievement of the Hispanic students at a suburban middle school. What are the…
Herschell, Amy D; Kolko, David J; Scudder, Ashley T; Taber-Thomas, Sarah; Schaffner, Kristen F; Hiegel, Shelley A; Iyengar, Satish; Chaffin, Mark; Mrozowski, Stanley
2015-09-28
Evidence-based treatments (EBTs) are available for treating childhood behavioral health challenges. Despite EBTs' potential to help children and families, they have primarily remained in university settings. Little empirical evidence exists regarding how effective specific, commonly used training and quality control models are in changing practice, achieving full implementation, and supporting positive client outcomes. This study (NIMH RO1 MH095750; ClinicalTrials.gov Identifier: NCT02543359), which is currently in progress, will evaluate the effectiveness of three training models (Learning Collaborative (LC), Cascading Model (CM), and Distance Education (DE)) for implementing a well-established EBT, Parent-Child Interaction Therapy, in real-world community settings. The three models differ in their costs, skill training, quality control methods, and capacity to address broader implementation challenges. The project is guided by three specific aims: (1) to build knowledge about training outcomes, (2) to build knowledge about implementation outcomes, and (3) to test the differential impact of training clinicians using the LC, CM, and DE models on key client outcomes. Fifty (50) licensed psychiatric clinics across Pennsylvania were randomized to one of the three training conditions: (1) LC, (2) CM, or (3) DE. The impact of training on practice skills (clinician level) and implementation/sustainment outcomes (clinic level) is being evaluated at four timepoints coinciding with the training schedule: baseline, 6 months (mid-training), 12 months (post-training), and 24 months (1-year follow-up). Immediately after training begins, parent-child dyads (client level) are recruited from the caseloads of participating clinicians. Client outcomes are assessed at four timepoints (pre-treatment and 1, 6, and 12 months after pre-treatment). This proposal builds on an ongoing initiative to implement an EBT statewide.
A team of diverse stakeholders, including state policy makers, payers, consumers, service providers, and academics from different but complementary areas (e.g., public health, social work, psychiatry), has been assembled to guide the research plan by incorporating input from a multidimensional perspective. ClinicalTrials.gov: NCT02543359.
Massively Parallel Solution of Poisson Equation on Coarse Grain MIMD Architectures
NASA Technical Reports Server (NTRS)
Fijany, A.; Weinberger, D.; Roosta, R.; Gulati, S.
1998-01-01
In this paper a new algorithm, designated the Fast Invariant Imbedding algorithm, for the solution of the Poisson equation on vector and massively parallel MIMD architectures is presented. This algorithm achieves the same optimal computational efficiency as other fast Poisson solvers while offering a much better structure for vector and parallel implementation. Our implementation on the Intel Delta and Paragon shows that a speedup of over two orders of magnitude can be achieved even for moderate-size problems.
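The abstract does not spell out the Fast Invariant Imbedding algorithm itself, but the problem class it targets is the discretized Poisson equation. As a hedged illustration of that problem class only, the following sketch solves the 1D Dirichlet Poisson equation with a standard serial tridiagonal direct solve; it stands in for, and is not, the authors' parallel algorithm:

```python
# Serial direct solve of the 1D Poisson equation -u'' = f on (0, 1)
# with u(0) = u(1) = 0, via the Thomas algorithm applied to the
# second-order finite-difference system
#     -u[i-1] + 2*u[i] - u[i+1] = h**2 * f[i].
# Illustrative stand-in; not the paper's invariant-imbedding method.

def solve_poisson_1d(f, h):
    """Return u at the interior grid points, given f at those points."""
    n = len(f)
    rhs = [h * h * fi for fi in f]
    cp = [0.0] * n                  # modified superdiagonal coefficients
    dp = [0.0] * n                  # modified right-hand side
    cp[0] = -1.0 / 2.0
    dp[0] = rhs[0] / 2.0
    for i in range(1, n):           # forward elimination
        denom = 2.0 + cp[i - 1]     # pivot: 2 - (-1) * cp[i-1]
        cp[i] = -1.0 / denom
        dp[i] = (rhs[i] + dp[i - 1]) / denom
    u = [0.0] * n
    u[-1] = dp[-1]
    for i in range(n - 2, -1, -1):  # back substitution
        u[i] = dp[i] - cp[i] * u[i + 1]
    return u
```

Because the second-order stencil is exact for quadratics, f = 2 reproduces the exact solution u(x) = x(1 - x) at the nodes, which makes a convenient correctness check.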
Evaluating the Implementation of Professional Learning Communities over Time
ERIC Educational Resources Information Center
Monceaux, Matthew C.
2017-01-01
Professional Learning Communities (PLCs) have become a popular reform initiative for schools looking to increase student achievement. School district officials can find it difficult to implement and sustain Professional Learning Communities as some teachers are not accustomed to the levels of collaboration with peers involved. If implemented and…
Knocking Down Barriers: How California Superintendents Are Implementing Blended Learning
ERIC Educational Resources Information Center
Horn, Michael B.; Gu, Anna; Evans, Meg
2014-01-01
School districts across the United States are implementing blended learning to boost student achievement. The authors convened several California school district superintendents to answer the questions: "What are the barriers, real or perceived, to implementing blended learning in your district?" and "Have you found solutions to or…
Managing Change in Small Scottish Primary Schools: Is There a Small School Management Style?
ERIC Educational Resources Information Center
Wilson, Valerie; McPake, Joanna
2000-01-01
Identifies management activities and strategies used by 863 heads of small Scottish schools to implement 4 major national initiatives during the past decade. Headteachers valued teamwork and employed a "plan-implement-review" strategy involving a quick audit, realistic planning for achievable targets, inclusive implementation, and…
NASA Astrophysics Data System (ADS)
Tian, Yi; Chen, Mahao; Kong, Jun
2009-02-01
With the online z-axis tube current modulation (OZTCM) technique proposed in this work, fully automatic exposure control (AEC) for CT systems can be realized with online feedback not only for angular tube current modulation (TCM) but also for z-axis TCM, so that a localizer radiograph is no longer required for TCM. OZTCM can be implemented in two schemes, one attenuation-based and one image-noise-level-based: respectively, the maximum attenuation of the projection readings and the standard deviation of the reconstructed images are used to modulate the tube current level along the z-axis adaptively for each half (180 degree) or full (360 degree) rotation. Simulation results showed that OZTCM achieved a better noise level than a constant-tube-current scan at the same total dose in mAs. OZTCM can provide an optimized base tube current level for angular TCM, realizing effective automatic exposure control when a localizer radiograph is not available or needs to be skipped for a simplified scan protocol, e.g., in emergency procedures or pediatric scans.
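The abstract leaves the modulation rule itself unspecified. A common first-order rule, used here purely as an illustration and not as the paper's OZTCM formula, scales the tube current so that the photon count behind the most attenuating ray stays at a reference level:

```python
import math

def modulated_current(i_ref, atten_ref, atten_max):
    """Illustrative attenuation-based tube-current rule (hypothetical).

    Detected quanta behind a ray with line-integral attenuation A scale
    as I * exp(-A); holding the count behind the worst ray at the
    reference level gives  I = I_ref * exp(A_max - A_ref).
    """
    return i_ref * math.exp(atten_max - atten_ref)
```

With this rule, a rotation whose worst ray attenuates by one extra factor of two (ln 2 more attenuation) asks for twice the reference current.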
Cano, R; Nielfa, A; Fdz-Polanco, M
2014-09-01
An economic assessment of thermal hydrolysis as a pretreatment to anaerobic digestion was carried out to evaluate its implementation in full-scale plants. Six different solid wastes were studied, among them municipal solid waste (MSW). Thermal hydrolysis was tested in batch lab-scale experiments, from which an energy and economic assessment of three scenarios was performed: with and without energy integration (recovering heat to produce steam in a cogeneration plant), and finally including digestate management costs. Thermal hydrolysis led to increased methane production (by up to 50%) and faster kinetics (rate parameters in some cases doubled). The study determined that a proper energy integration design could yield important economic savings (5 €/t), and thermal hydrolysis can increase the income of the digestion plant by up to 40%, even doubling it when digestate management costs are considered. In a full-scale MSW treatment plant (30,000 t/year), thermal hydrolysis would provide almost 0.5 M€/year in net benefits. Copyright © 2014 Elsevier Ltd. All rights reserved.
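A first-order screening calculation of the kind behind such headline figures can be sketched as follows; all per-tonne values in the example are hypothetical placeholders, not numbers taken from the study:

```python
def annual_net_benefit(tonnage, savings_per_tonne,
                       extra_income_per_tonne,
                       pretreatment_cost_per_tonne):
    """Hypothetical screening estimate of annual net benefit (EUR/year)
    for a pretreatment retrofit: throughput times the per-tonne margin
    (energy savings plus extra methane income minus pretreatment cost).
    """
    return tonnage * (savings_per_tonne + extra_income_per_tonne
                      - pretreatment_cost_per_tonne)
```

For a 30,000 t/year plant, a net margin on the order of 16 €/t would land in the half-million-euro range the abstract reports, though the actual study derives its figure from measured methane yields and scenario-specific costs.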
NASA Technical Reports Server (NTRS)
Watkins, A. Neal; Leighty, Bradley D.; Lipford, William E.; Oglesby, Donald M.; Goodman, Kyle Z.; Goad, William K.; Goad, Linda R.; Massey, Edward A.
2009-01-01
The Pressure Sensitive Paint (PSP) method was used to measure global surface pressures on a model at full-scale flight Reynolds numbers. In order to achieve these conditions, the test was carried out at the National Transonic Facility (NTF) operating under cryogenic conditions in a nitrogen environment. The upper surface of a wing on a full-span 0.027-scale commercial transport was painted with a porous PSP formulation and tested at 120 K. Data were acquired at Mach 0.8 with a total pressure of 200 kPa, resulting in a Reynolds number of 65 × 10⁶/m. Oxygen, which is required for PSP operation, was injected using dry air so that the oxygen concentration in the flow was approximately 1535 ppm. Results show qualitative agreement with expectations. This preliminary test is the first time that PSP has been successfully deployed to measure global surface pressures at cryogenic conditions in the NTF. This paper describes the system as installed, the results obtained from the test, and proposed upgrades and future tests.
A new ultra-high-accuracy angle generator: current status and future direction
NASA Astrophysics Data System (ADS)
Guertin, Christian F.; Geckeler, Ralf D.
2017-09-01
The lack of an extremely high-accuracy angular positioning device in the United States has left a gap in industrial and scientific efforts conducted there, requiring certain user groups to undertake time-consuming work with overseas laboratories. Specifically, in x-ray mirror metrology the global research community is advancing the state of the art to unprecedented levels. We aim to fill this U.S. gap by developing a versatile high-accuracy angle generator as part of the national metrology tool set for x-ray mirror metrology and other important industries. Using an established calibration technique to measure the errors of the encoder scale graduations for full-rotation rotary encoders, we implemented an optimized arrangement of sensors positioned to minimize the propagation of calibration errors. Our initial feasibility research shows that, upon scaling to a full prototype and including additional calibration techniques, we can expect to achieve uncertainties at the level of 0.01 arcsec (50 nrad) or better and to offer the immense advantage of a highly automatable and customizable product to the commercial market.
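The multi-sensor arrangement rests on a classical property: if several reading heads are spaced uniformly around the encoder, averaging their outputs cancels every graduation-error harmonic whose order is not a multiple of the number of heads. A small numerical sketch of that property (illustrative only; the paper's actual calibration technique is more elaborate):

```python
import math

def averaged_reading(theta, n_sensors, error):
    """Average the readings of n_sensors heads spaced 2*pi/n_sensors
    apart around the scale.  Each head sees the true angle plus the
    graduation error at its own scale position; the average suppresses
    every error harmonic whose order is not a multiple of n_sensors.
    (Hypothetical model: `error` maps scale position to angular error.)
    """
    total = 0.0
    for k in range(n_sensors):
        offset = 2.0 * math.pi * k / n_sensors
        total += theta + error(theta + offset)
    return total / n_sensors
```

With four heads, a third-harmonic graduation error averages exactly to zero, while a fourth-harmonic error survives untouched, which is why such schemes are combined with additional calibration steps for the remaining harmonics.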
Gupta, Rahul; Reddy, R. Purushotham; Balasubramanian, K.; Reddy, P. S.
2018-01-01
Increasing child vaccination coverage to 85% or more in rural India from the current level of 50% holds great promise for reducing infant and child mortality and improving health of children. We have tested a novel strategy called Rural Effective Affordable Comprehensive Health Care (REACH) in a rural population of more than 300 000 in Rajasthan and succeeded in achieving full immunization coverage of 88.7% among children aged 12 to 23 months in a short span of less than 2 years. The REACH strategy was first developed and successfully implemented in a demonstration project by SHARE INDIA in Medchal region of Andhra Pradesh, and was then replicated in Rajgarh block of Rajasthan in cooperation with Bhoruka Charitable Trust (private partners of Integrated Child Development Services and National Rural Health Mission health workers in Rajgarh). The success of the REACH strategy in both Andhra Pradesh and Rajasthan suggests that it could be successfully adopted as a model to enhance vaccination coverage dramatically in other areas of rural India. PMID:29359630
Return on Investment in Electronic Health Records in Primary Care Practices: A Mixed-Methods Study
Sanche, Steven
2014-01-01
Background The use of electronic health records (EHR) in clinical settings is considered pivotal to a patient-centered health care delivery system. However, uncertainty in cost recovery from EHR investments remains a significant concern in primary care practices. Objective Guided by the question of “When implemented in primary care practices, what will be the return on investment (ROI) from an EHR implementation?”, the objectives of this study are two-fold: (1) to assess ROI from EHR in primary care practices and (2) to identify principal factors affecting the realization of positive ROI from EHR. We used a break-even point, that is, the time required to achieve cost recovery from an EHR investment, as an ROI indicator of an EHR investment. Methods Given the complexity exhibited by most EHR implementation projects, this study adopted a retrospective mixed-method research approach, particularly a multiphase study design approach. For this study, data were collected from community-based primary care clinics using EHR systems. Results We collected data from 17 primary care clinics using EHR systems. Our data show that the sampled primary care clinics recovered their EHR investments within an average period of 10 months (95% CI 6.2-17.4 months), seeing more patients with an average increase of 27% in the active-patients-to-clinician-FTE (full time equivalent) ratio and an average increase of 10% in the active-patients-to-clinical-support-staff-FTE ratio after an EHR implementation. Our analysis suggests, with a 95% confidence level, that the increase in the number of active patients (P=.006), the increase in the active-patients-to-clinician-FTE ratio (P<.001), and the increase in the clinic net revenue (P<.001) are positively associated with the EHR implementation, likely contributing substantially to an average break-even point of 10 months. Conclusions We found that primary care clinics can realize a positive ROI with EHR. 
Our analysis of the variances in the time required to achieve cost recovery from EHR investments suggests that a positive ROI does not appear automatically upon implementing an EHR and that a clinic’s ability to leverage EHR for process changes seems to play a role. Policies that provide support to help primary care practices successfully make EHR-enabled changes, such as support of clinic workflow optimization with an EHR system, could facilitate the realization of positive ROI from EHR in primary care practices. PMID:25600508
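The break-even indicator used in the study can be sketched as a simple undiscounted payback computation; the dollar figures in the example are hypothetical, not values from the 17 clinics:

```python
def break_even_month(upfront_cost, monthly_net_gain):
    """Months until cumulative net gain covers the upfront investment
    (simple, undiscounted break-even point).  Inputs are hypothetical
    illustration values, not study data.
    """
    months = 0
    cumulative = 0.0
    while cumulative < upfront_cost:
        months += 1
        cumulative += monthly_net_gain
        if months > 1200:
            raise ValueError("no break-even within 100 years")
    return months
```

A clinic recovering a 50,000 investment at 5,000/month net gain breaks even at month 10, matching the order of the study's average break-even point.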
Conrad, Douglas; Fishman, Paul; Grembowski, David; Ralston, James; Reid, Robert; Martin, Diane; Larson, Eric; Anderson, Melissa
2008-10-01
To estimate the joint effect of a multifaceted access intervention on primary care physician (PCP) productivity in a large, integrated prepaid group practice. Administrative records of physician characteristics, compensation and full-time equivalent (FTE) data, linked to enrollee utilization and cost information. Dependent measures per quarter per FTE were office visits, work relative value units (WRVUs), WRVUs per visit, panel size, and total cost per member per quarter (PMPQ), for PCPs employed >0.25 FTE. Generalized estimating equation regression models included provider and enrollee characteristics. Panel size and WRVUs per visit rose, while visits per FTE and PMPQ cost declined significantly between baseline and full implementation. Panel size rose and visits per FTE declined from baseline through rollout and full implementation. WRVUs per visit and WRVUs per FTE first declined and then increased, for a significant net increase in WRVUs per visit and an insignificant rise in WRVUs per FTE between baseline and full implementation. PMPQ cost rose between baseline and rollout and then declined, for a significant overall decline between baseline and full implementation. This organization-wide access intervention was associated with improvements in several dimensions of PCP productivity and gains in clinical efficiency.
Turcotte-Tremblay, Anne-Marie; Spagnolo, Jessica; De Allegri, Manuela; Ridde, Valéry
2016-12-01
Governments of low- and middle-income countries (LMICs) are widely implementing performance-based financing (PBF) to improve healthcare services. However, it is unclear whether PBF provides good value for money compared to the status quo or to other interventions aimed at strengthening the healthcare system in LMICs. The objective of this systematic review is to identify and synthesize the existing literature examining whether PBF represents an efficient way of investing resources. We considered PBF efficient when improved care quality or quantity was achieved at equal or lower cost, or when the same quality of care was achieved using fewer financial resources. A manual search of the reference lists of two recent systematic reviews on economic evaluations of PBF was conducted to identify articles that met our inclusion and exclusion criteria. Subsequently, a search strategy was developed with the help of a librarian. The following databases and search engines were used: PubMed, EconLit, Google Scholar and Google. Experts on economic evaluations were consulted to validate the selected studies. A total of seven articles from five LMICs were selected for this review. We found the overall strength of the evidence to be weak. None of the articles were full economic evaluations; they did not make clear connections between the costs and effects of PBF. Only one study reported using a randomized controlled trial, but issues with the randomization procedure were reported. Important alternative interventions to strengthen the capacities of the healthcare system have not been considered. Few studies examined the costs and consequences of PBF in the long term. Important costs and consequences were omitted from the evaluations. Few LMICs are represented in the literature, despite wide implementation.
Lastly, most articles had at least one author employed by an organization involved in the implementation of PBF, thereby resulting in potential conflicts of interest. Stronger empirical evidence on whether PBF represents good value for money in LMICs is needed.
The next phase in professional services research: From implementation to sustainability.
Crespo-Gonzalez, Carmen; Garcia-Cardenas, Victoria; Benrimoj, Shalom I
The provision of professional pharmacy services has been heralded as the professional and economic future of pharmacy. Service creation involves several phases, including service design, impact evaluation, implementation and sustainability. The first two phases have been the subject of extensive research. In recent years the principles of implementation science have been applied in pharmacy to study the initial uptake and integration of evidence-based services into routine practice. However, little attention has been paid to the sustainability of those services, the phase during which the previously implemented service remains in use to achieve and sustain long-term outcomes. The objective of this commentary is to describe the differences and common characteristics between the implementation and sustainability phases and to propose a definition for pharmacy. A literature search was performed. Four critical elements were identified: 1. The aim of the implementation phase is to incorporate new services into practice, whereas the aim of the sustainability phase is to make those services routine so as to achieve and sustain long-term benefits. 2. In the implementation phase, planned activities are used to integrate the new service; in the sustainability phase there is continuous improvement of the service. 3. The implementation phase occurs between the adoption of a service and its integration. Some authors suggest the sustainability phase is concomitant with the implementation phase; others suggest it is independent. 4. There is a lack of consensus regarding the duration of each phase.
The following definition of sustainability for pharmacy services is proposed: "Sustainability is a phase in the process of a professional pharmacy service, in which the service previously integrated into practice during the implementation phase is routinized and institutionalized over time to achieve and sustain the expected service outcomes". An agreement on a definition will facilitate an understanding of when the profession has reached this ultimate goal. Copyright © 2017 Elsevier Inc. All rights reserved.
Measuring Impacts of Pollution Prevention
Applicants that are awarded EPA Pollution Prevention Grants must report the environmental and economic benefits achieved across all media from the implementation of the grant, which helps achieve EPA’s P2 targets.
Complexity analysis of the cost effectiveness of PI-led NASA science missions
NASA Astrophysics Data System (ADS)
Yoshida, J.; Cowdin, M.; Mize, T.; Kellogg, R.; Bearden, D.
For the last 20 years, NASA has allowed Principal Investigators (PIs) to manage the development of many unmanned space projects. Advocates of PI-led projects believe that a PI-led implementation can result in a project being developed at lower cost and on a shorter schedule than other implementation modes. This paper seeks to test this hypothesis by comparing the actual costs of NASA and other comparable projects developed under different implementation modes. The Aerospace Corporation's Complexity-Based Risk Assessment (CoBRA) analysis tool is used to normalize the projects so that cost can be compared at equivalent project complexity. The data are examined both by complexity and by launch year. Cost growth is also examined for any correlation with implementation mode. As defined in many NASA Announcements of Opportunity (AOs), a PI-led project is characterized by a central, single person with full responsibility for assembling a team, for the project's scientific integrity, and for the implementation and integrity of all other aspects of the mission, while operating under a cost cap. PIs have larger degrees of freedom to achieve the stated goals within NASA guidelines and oversight. This study leverages the definitions and results of previous National Research Council studies of PI-led projects. Aerospace has defined a complexity index, derived from mission performance, mass, power, and technology choices, to arrive at a broad representation of missions for purposes of comparison. Over a decade of research has established a correlation between mission complexity and spacecraft development cost and schedule. This complexity analysis, CoBRA, is applied to compare a set of PI-led New Frontiers, Discovery, Explorers, and Earth System Science Pathfinder missions to the overall NASA mission dataset. This reveals the complexity trends against development costs, cost growth, and development era.
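At its simplest, the correlation between complexity index and development cost is a trend fit across missions. As a generic illustration only (the CoBRA model itself is Aerospace's and is not reproduced here), an ordinary least-squares line fit in pure Python:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x.

    Returns (a, b).  In the CoBRA-style use case, xs would be
    complexity indices and ys development costs (hypothetical framing;
    the actual CoBRA model is not public in this abstract).
    """
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b
```

Normalizing by such a trend is what allows missions of different complexity, and their cost growth, to be compared on a common footing.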
Fifteen years of sector-wide approach (SWAp) in Bangladesh health sector: an assessment of progress
Ahsan, Karar Zunaid; Streatfield, Peter Kim; Ijdi, Rashida -E-; Escudero, Gabriela Maria; Khan, Abdul Waheed; Reza, M M
2016-01-01
The Ministry of Health and Family Welfare (MOHFW) of the Government of Bangladesh embarked on a sector-wide approach (SWAp) modality for the health, nutrition and population (HNP) sector in 1998. This programmatic shift initiated a different set of planning disciplines and practices along with institutional changes in the MOHFW. Over the years, the SWAp modality has evolved in Bangladesh as the MOHFW has learnt from its implementation and refined the program design. This article explores the progress made, both in terms of achievement of health outcomes and systems strengthening results, since the implementation of the SWAp for Bangladesh’s health sector. Secondary analyses of survey data from 1993 to 2011 as well as a literature review of published and grey literature on health SWAp in Bangladesh was conducted for this assessment. Results of the assessment indicate that the MOHFW made substantial progress in health outcomes and health systems strengthening. SWAps facilitated the alignment of funding and technical support around national priorities, and improved the government’s role in program design as well as in implementation and development partner coordination. Notable systemic improvements have taken place in the country systems with regards to monitoring and evaluation, procurement and service provision, which have improved functionality of health facilities to provide essential care. Implementation of the SWAp has, therefore, contributed to an accelerated improvement in key health outcomes in Bangladesh over the last 15 years. The health SWAp in Bangladesh offers an example of a successful adaptation of such an approach in a complex administrative structure. Based on the lessons learned from SWAp implementation in Bangladesh, the MOHFW needs to play a stronger stewardship and regulatory role to reap the full benefits of a SWAp in its subsequent programming. PMID:26582744
Marketing Internships: A Planning and Implementation Guide.
ERIC Educational Resources Information Center
Faught, Suzanne G.
This planning and implementation guide is designed to assist marketing educators and others supportive of marketing education. It begins with definitions of vocabulary of related terminology and descriptions of the four models of internships presented in the guide: full-year, rotation-type format; 1-semester, rotation-type format; full-year format…
Large area full-field optical coherence tomography using white light source
NASA Astrophysics Data System (ADS)
Chang, Shoude; Mao, Youxin; Sherif, Sherif; Flueraru, Costel
2007-06-01
Optical coherence tomography (OCT) is an emerging technology for high-resolution cross-sectional imaging of 3D structures. OCT can extract not only the internal features of an object but also its 3D profile, and hence has huge potential for industrial applications. Because it requires no scanning along the X-Y axes, full-field OCT can be the simplest and most economical imaging system, especially for applications where speed is critical. For an OCT system, performance and cost depend largely on the light source used. The broader the source bandwidth, the finer the depth resolution that can be reached; the higher the source power, the better the signal-to-noise ratio and the deeper the penetration the system achieves. A typical SLD (superluminescent diode) light source has a bandwidth of 15 nm and 10 mW of optical power at a price around 6,000, whereas a halogen bulb with 50 W of power and 200 nm of bandwidth costs less than 10. The design and implementation of a large-area, full-field OCT system using a halogen white-light source is described in this paper. Experimental results obtained from 3D shaping and multiple-layer tomography are also presented.
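The bandwidth-resolution trade described above is usually quantified by the coherence length of a Gaussian-spectrum source, lc = (2 ln 2 / π) λ₀² / Δλ, the standard estimate of OCT axial (depth) resolution. A quick comparison follows; the centre wavelengths used are assumed values for illustration, since the abstract gives only the bandwidths:

```python
import math

def axial_resolution_um(center_wavelength_um, bandwidth_um):
    """Coherence length of a Gaussian-spectrum source in micrometres,
    the usual estimate of OCT axial resolution:
        lc = (2 ln 2 / pi) * lambda0**2 / dlambda
    Centre wavelengths passed in are assumptions, not from the paper.
    """
    return (2.0 * math.log(2.0) / math.pi) \
        * center_wavelength_um ** 2 / bandwidth_um

# Assumed sources: SLD at 840 nm / 15 nm vs. halogen at 700 nm / 200 nm.
sld = axial_resolution_um(0.84, 0.015)    # ~20.8 um
halogen = axial_resolution_um(0.70, 0.20)  # ~1.1 um
```

Under these assumptions the halogen source's 200 nm bandwidth buys roughly an order of magnitude finer depth resolution than the SLD, which is the paper's motivation for tolerating its lower brightness.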
Image management within a PACS
NASA Astrophysics Data System (ADS)
Glicksman, Robert A.; Prior, Fred W.; Wilson, Dennis L.
1993-09-01
The full benefits of a PACS system cannot be achieved by a departmental system, as films must still be made to service referring physicians and clinics. Therefore, a full hospital PACS must provide workstations throughout the hospital which are connected to the central file server and database, but which present "clinical" views of radiological data. In contrast to the radiologist, the clinician needs to select examinations from a "patient list" which presents the results of his/her radiology referrals. The most important data for the clinician is the radiology report, which must be immediately available upon selection of the examination. The images themselves, perhaps with annotations provided by the reading radiologist, must also be available in a few seconds from selection. Furthermore, the ability to display radiologist-selected relevant historical images along with the new examination is necessary in those instances where the radiologist felt that certain historical images were important in the interpretation and diagnosis of the patient. Therefore, views of the new and historical data along clinical lines, conference preparation features, and modality- and body-part-specific selections are also required to successfully implement a full hospital PACS. This paper describes the concepts for image selection and presentation at PACS workstations, both "diagnostic" workstations within the radiology department and "clinical" workstations which support the rest of the hospital and outpatient clinics.
Local spatiotemporal time-frequency peak filtering method for seismic random noise reduction
NASA Astrophysics Data System (ADS)
Liu, Yanping; Dang, Bo; Li, Yue; Lin, Hongbo
2014-12-01
To achieve a higher level of seismic random noise suppression, the Radon transform was adopted to implement spatiotemporal time-frequency peak filtering (TFPF) in our previous studies. Those studies performed TFPF in the full-aperture Radon domain, using both the linear and the parabolic Radon transforms. Although the superiority of this method over conventional TFPF has been demonstrated on synthetic seismic models and field seismic data, the method still has limitations. Both full-aperture linear and parabolic Radon transforms are applicable and effective in relatively simple situations (e.g., curved reflection events with regular geometry) but inapplicable in complicated situations such as reflection events with irregular shapes, or interlaced events with quite different slope or curvature parameters. Therefore, a localized approach to the Radon transform must be applied; the filtering method is better served by adapting the transform to the local character of the data variations. In this article, we propose adopting a local Radon transform, referred to as piecewise full-aperture Radon, to realize spatiotemporal TFPF, called local spatiotemporal TFPF. Experiments on synthetic seismic models and field seismic data demonstrate the advantage of our method in seismic random noise reduction and reflection event recovery in relatively complicated seismic data.
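Independent of the Radon-transform and TFPF details, the "local" strategy amounts to filtering overlapping windows and blending the overlapped outputs. A generic sketch of that strategy only (the TFPF and Radon steps are not reproduced; `filt` is a hypothetical placeholder for whatever local operator is applied):

```python
def apply_locally(signal, window, step, filt):
    """Apply a filter to overlapping windows of a 1-D signal and
    average the overlapped outputs.  `filt` takes a list segment and
    returns a filtered list of the same length.  Requires step <= window
    so every sample is covered by at least one window.
    """
    n = len(signal)
    out = [0.0] * n
    cnt = [0] * n
    start = 0
    while start < n:
        seg = filt(signal[start:start + window])
        for i, v in enumerate(seg):
            out[start + i] += v
            cnt[start + i] += 1
        start += step
    return [o / c for o, c in zip(out, cnt)]
```

With the identity operator this scheme returns the input unchanged, a useful check that the windowing and overlap-averaging bookkeeping introduce no distortion of their own.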
NASA Astrophysics Data System (ADS)
Berdalovic, I.; Bates, R.; Buttar, C.; Cardella, R.; Egidos Plaja, N.; Hemperek, T.; Hiti, B.; van Hoorne, J. W.; Kugathasan, T.; Mandic, I.; Maneuski, D.; Marin Tobon, C. A.; Moustakas, K.; Musa, L.; Pernegger, H.; Riedler, P.; Riegel, C.; Schaefer, D.; Schioppa, E. J.; Sharma, A.; Snoeys, W.; Solans Sanchez, C.; Wang, T.; Wermes, N.
2018-01-01
The upgrade of the ATLAS tracking detector (ITk) for the High-Luminosity Large Hadron Collider at CERN requires the development of novel radiation hard silicon sensor technologies. Latest developments in CMOS sensor processing offer the possibility of combining high-resistivity substrates with on-chip high-voltage biasing to achieve a large depleted active sensor volume. We have characterised depleted monolithic active pixel sensors (DMAPS), which were produced in a novel modified imaging process implemented in the TowerJazz 180 nm CMOS process in the framework of the monolithic sensor development for the ALICE experiment. Sensors fabricated in this modified process feature full depletion of the sensitive layer, a sensor capacitance of only a few fF and radiation tolerance up to 10¹⁵ neq/cm². This paper summarises the measurements of charge collection properties in beam tests and in the laboratory using radioactive sources and edge TCT. The results of these measurements show significantly improved radiation hardness obtained for sensors manufactured using the modified process. This has opened the way to the design of two large scale demonstrators for the ATLAS ITk. To achieve a design compatible with the requirements of the outer pixel layers of the tracker, a charge sensitive front-end taking 500 nA from a 1.8 V supply is combined with a fast digital readout architecture. The low-power front-end with a 25 ns time resolution exploits the low sensor capacitance to reduce noise and analogue power, while the implemented readout architectures minimise power by reducing the digital activity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kosierb, Rick
The Canadian Nuclear Safety Commission (CNSC) regulates the use of nuclear energy and materials to protect the health, safety and security of Canadians and the environment, and to implement Canada's international obligations on the peaceful use of nuclear energy. To perform this regulatory activity, the CNSC issues licences and has its staff perform inspections to verify conformity with those licences. Within the CNSC, the Accelerators and Class II Facilities Division (ACFD) is responsible for the regulatory oversight of Class II Prescribed Equipment used in the medical, academic, and industrial sectors in Canada. In performing inspections, ACFD has encountered licensees with practices that fall below, meet, or exceed regulatory expectations in specific areas. Unfortunately, none of these practices are ever communicated to the broader Class II community to help other licensees avoid the same problems or achieve high standards. In this poster, ACFD will highlight safety practices that go beyond expectations. These practices are taken from observations during site inspections between 2007 and 2013 and will be presented in six areas: Procedures, Participation, Awareness, Equipment, Servicing and Software. Each area briefly discusses a number of practices that the CNSC feels went beyond the expectations dictated by the licence. Where possible, the names of contact people at the centres who can be reached for full details of their implementations are provided. It is hoped that this communication will assist other licensees to achieve these same high levels of compliance and possibly go beyond.
Constitutional dynamic chemistry: bridge from supramolecular chemistry to adaptive chemistry.
Lehn, Jean-Marie
2012-01-01
Supramolecular chemistry aims at implementing highly complex chemical systems from molecular components held together by non-covalent intermolecular forces and effecting molecular recognition, catalysis and transport processes. A further step consists in the investigation of chemical systems undergoing self-organization, i.e. systems capable of spontaneously generating well-defined functional supramolecular architectures by self-assembly from their components, thus behaving as programmed chemical systems. Supramolecular chemistry is intrinsically a dynamic chemistry in view of the lability of the interactions connecting the molecular components of a supramolecular entity and the resulting ability of supramolecular species to exchange their constituents. The same holds for molecular chemistry when the molecular entity contains covalent bonds that may form and break reversibly, so as to allow a continuous change in constitution by reorganization and exchange of building blocks. These features define a Constitutional Dynamic Chemistry (CDC) on both the molecular and supramolecular levels. CDC introduces a paradigm shift with respect to constitutionally static chemistry. The latter relies on design for the generation of a target entity, whereas CDC takes advantage of dynamic diversity to allow variation and selection. The implementation of selection in chemistry introduces a fundamental change in outlook. Whereas self-organization by design strives to achieve full control over the output molecular or supramolecular entity by explicit programming, self-organization with selection operates on dynamic constitutional diversity in response to either internal or external factors to achieve adaptation. The merging of these features (information and programmability, dynamics and reversibility, constitution and structural diversity) points to the emergence of adaptive and evolutive chemistry, towards a chemistry of complex matter.
Practical Approaches for Achieving Integrated Behavioral Health Care in Primary Care Settings
Ratzliff, Anna; Phillips, Kathryn E.; Sugarman, Jonathan R.; Unützer, Jürgen; Wagner, Edward H.
2016-01-01
Behavioral health problems are common, yet most patients do not receive effective treatment in primary care settings. Despite availability of effective models for integrating behavioral health care in primary care settings, uptake has been slow. The Behavioral Health Integration Implementation Guide provides practical guidance for adapting and implementing effective integrated behavioral health care into patient-centered medical homes. The authors gathered input from stakeholders involved in behavioral health integration efforts: safety net providers, subject matter experts in primary care and behavioral health, a behavioral health patient and peer specialist, and state and national policy makers. Stakeholder input informed development of the Behavioral Health Integration Implementation Guide and the GROW Pathway Planning Worksheet. The Behavioral Health Integration Implementation Guide is model neutral and allows organizations to take meaningful steps toward providing integrated care that achieves access and accountability. PMID:26698163
Zhang, Baofeng; Kilburg, Denise; Eastman, Peter; Pande, Vijay S; Gallicchio, Emilio
2017-04-15
We present an algorithm to efficiently compute accurate volumes and surface areas of macromolecules on graphical processing unit (GPU) devices using an analytic model which represents atomic volumes by continuous Gaussian densities. The volume of the molecule is expressed by means of the inclusion-exclusion formula, which is based on the summation of overlap integrals among multiple atomic densities. The surface area of the molecule is obtained by differentiation of the molecular volume with respect to atomic radii. The many-body nature of the model makes a port to GPU devices challenging. To our knowledge, this is the first reported full implementation of this model on GPU hardware. To accomplish this, we have used recursive strategies to construct the tree of overlaps and to accumulate volumes and their gradients on the tree data structures so as to minimize memory contention. The algorithm is used in the formulation of a surface area-based non-polar implicit solvent model implemented as an open source plug-in (named GaussVol) for the popular OpenMM library for molecular mechanics modeling. GaussVol is 50 to 100 times faster than our best optimized implementation for the CPUs, achieving speeds in excess of 100 ns/day with 1 fs time-step for protein-sized systems on commodity GPUs. © 2017 Wiley Periodicals, Inc.
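The inclusion-exclusion formula builds the molecular volume from overlap integrals, V ≈ Σᵢ Vᵢ − Σᵢ<ⱼ Vᵢⱼ + Σᵢ<ⱼ<ₖ Vᵢⱼₖ − …, and for Gaussian densities each pairwise overlap has a closed form via the Gaussian product rule. A minimal sketch of that two-body term (names and defaults are illustrative, not GaussVol's API):

```python
import math

def gaussian_overlap(c1, R1, c2, R2, p1=1.0, p2=1.0):
    """Closed-form overlap integral of two 3D Gaussian densities
    rho_i(r) = p_i * exp(-c_i * |r - R_i|^2).

    By the Gaussian product rule the integral equals
    p1 * p2 * (pi / (c1 + c2))**1.5 * exp(-c1*c2/(c1+c2) * |R1 - R2|^2).
    """
    d2 = sum((a - b) ** 2 for a, b in zip(R1, R2))  # squared center distance
    c12 = c1 + c2
    return p1 * p2 * (math.pi / c12) ** 1.5 * math.exp(-c1 * c2 / c12 * d2)
```

Higher-order terms in the expansion are overlaps of these product Gaussians with further atoms, which is what the tree of overlaps in the paper accumulates.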
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurt Beran; John Christenson; Dragos Nica
2002-12-15
The goal of the project is to enable plant operators to detect, with high sensitivity and reliability, the onset of decalibration drifts in all of the instrumentation used as input to the reactor heat balance calculations. To achieve this objective, the collaborators developed and implemented at DBNPS an extension of the Multivariate State Estimation Technique (MSET) pattern recognition methodology pioneered by ANL. The extension was implemented during the second phase of the project and fully achieved the project goal.
Time-optimal control with finite bandwidth
NASA Astrophysics Data System (ADS)
Hirose, M.; Cappellaro, P.
2018-04-01
Time-optimal control theory provides recipes to achieve quantum operations with high fidelity and speed, as required in quantum technologies such as quantum sensing and computation. While technical advances have achieved the ultrastrong driving regime in many physical systems, these capabilities have yet to be fully exploited for the precise control of quantum systems, as other limitations, such as the generation of higher harmonics or the finite response time of the control apparatus, prevent the implementation of theoretical time-optimal control. Here we present a method to achieve time-optimal control of qubit systems that can take advantage of fast driving beyond the rotating wave approximation. We exploit results from time-optimal control theory to design driving protocols that can be implemented with realistic, finite-bandwidth control fields, and we find a relationship between bandwidth limitations and achievable control fidelity.
Deng, Lei; Jiao, Peng; Pei, Jing; Wu, Zhenzhi; Li, Guoqi
2018-04-01
Although deep neural networks (DNNs) are a revolutionary force opening up the AI era, their notoriously large hardware overhead has challenged their applications. Recently, several binary and ternary networks, in which the costly multiply-accumulate operations can be replaced by accumulations or even binary logic operations, have made the on-chip training of DNNs quite promising. There is therefore a pressing need for an architecture that subsumes these networks under a unified framework achieving both higher performance and less overhead. To this end, two fundamental issues must be addressed. The first is how to implement back propagation when neuronal activations are discrete. The second is how to remove the full-precision hidden weights in the training phase to break the bottlenecks of memory/computation consumption. To address the first issue, we present a multi-step neuronal activation discretization method and a derivative approximation technique that enable implementing the back propagation algorithm on discrete DNNs. For the second issue, we propose a discrete state transition (DST) methodology to constrain the weights in a discrete space without saving the hidden weights. In this way, we build a unified framework that subsumes binary and ternary networks as special cases, under which a heuristic algorithm is provided at https://github.com/AcrossV/Gated-XNOR. In particular, we find that when both the weights and activations become ternary, the DNNs can be reduced to sparse binary networks, termed gated XNOR networks (GXNOR-Nets), since only the event of a non-zero weight coinciding with a non-zero activation enables the control gate to start the XNOR logic operations of the original binary networks. This promises event-driven hardware design for efficient mobile intelligence. We achieve advanced performance compared with state-of-the-art algorithms. Furthermore, the computational sparsity and the number of states in the discrete space can be flexibly modified to suit various hardware platforms. Copyright © 2018 Elsevier Ltd. All rights reserved.
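The gating idea can be illustrated in a few lines: once weights and activations are ternarized to {-1, 0, +1}, a dot product only needs to fire where both operands are non-zero, and the surviving products reduce to XNOR-style sign agreements. A small numpy sketch (the fixed ternarization threshold is an assumption for illustration, not the paper's DST rule):

```python
import numpy as np

def ternarize(x, delta=0.5):
    """Map real values to {-1, 0, +1}; |x| <= delta is squashed to 0
    (delta is an illustrative threshold)."""
    return np.where(x > delta, 1, np.where(x < -delta, -1, 0)).astype(np.int8)

def gated_xnor_dot(w, a):
    """Dot product of two ternary vectors: the 'gate' fires only where both
    weight and activation are non-zero; each surviving product is +/-1,
    i.e. an XNOR of sign bits followed by a popcount-style sum."""
    gate = (w != 0) & (a != 0)          # event-driven: skip zero operands
    return int(np.sum(w[gate] * a[gate]))
```

Because the gate skips every zero operand, sparser ternary codes directly translate into fewer logic operations, which is the event-driven hardware argument made in the abstract.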
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, B; Southern Medical University, Guangzhou, Guangdong; Tian, Z
Purpose: While compressed sensing-based cone-beam CT (CBCT) iterative reconstruction techniques have demonstrated tremendous capability for reconstructing high-quality images from undersampled noisy data, their long computation time still hinders wide application in routine clinical use. The purpose of this study is to develop a reconstruction framework that employs modern consensus optimization techniques to achieve CBCT reconstruction on a multi-GPU platform for improved computational efficiency. Methods: Total projection data were evenly distributed to multiple GPUs. Each GPU performed reconstruction using its own projection data with a conventional total variation regularization approach to ensure image quality. In addition, the solutions from the GPUs were subject to a consistency constraint that they should be identical. We solved the optimization problem, with all constraints considered rigorously, using an alternating direction method of multipliers (ADMM) algorithm. The reconstruction framework was implemented using OpenCL on a platform with two Nvidia GTX 590 GPU cards, each with two GPUs. We studied the performance of our method and demonstrated its advantages through a simulation case with an NCAT phantom and an experimental case with a Catphan phantom. Results: Compared with CBCT images reconstructed using the conventional FDK method with full projection datasets, our proposed method achieved comparable image quality with about one third of the projections. The computation time on the multi-GPU platform was ∼55 s and ∼35 s in the two cases, respectively, achieving a speedup factor of ∼3.0 compared with single-GPU reconstruction. Conclusion: We have developed a consensus ADMM-based CBCT reconstruction method that enables reconstruction on a multi-GPU platform. The achieved efficiency makes this method clinically attractive.
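The consensus structure can be sketched on a toy problem: each "GPU" holds a block of data and minimizes its own objective, while ADMM's averaging and dual steps force all local solutions to agree. The sketch below uses a plain least-squares objective in numpy rather than the TV-regularized CBCT objective, so it is illustrative only:

```python
import numpy as np

def consensus_admm(A_blocks, b_blocks, rho=1.0, iters=100):
    """Consensus ADMM for min_x sum_i ||A_i x - b_i||^2.
    Each (A_i, b_i) block stands in for the projection data held by one GPU."""
    n = A_blocks[0].shape[1]
    N = len(A_blocks)
    z = np.zeros(n)                       # consensus variable
    xs = [np.zeros(n) for _ in range(N)]  # local solutions
    us = [np.zeros(n) for _ in range(N)]  # scaled dual variables
    for _ in range(iters):
        for i, (A, b) in enumerate(zip(A_blocks, b_blocks)):
            # Local solve: (2 A^T A + rho I) x = 2 A^T b + rho (z - u_i)
            lhs = 2 * A.T @ A + rho * np.eye(n)
            rhs = 2 * A.T @ b + rho * (z - us[i])
            xs[i] = np.linalg.solve(lhs, rhs)
        z = np.mean([x + u for x, u in zip(xs, us)], axis=0)  # consensus step
        us = [u + x - z for x, u in zip(xs, us)]              # dual update
    return z
```

In the paper's setting the local solve is a TV-regularized reconstruction on one GPU; the consensus and dual updates are the cheap coordination steps between GPUs.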
Translating Policy to Practice: Initiating RTI in Urban Schools
ERIC Educational Resources Information Center
Dougherty Stahl, Katherine A.; Keane, Annette E.; Simic, Ognjen
2013-01-01
This mixed methods study explores the pilot implementation of a Response to Intervention framework in the first grade classrooms in three urban schools. Two schools in a fully implemented condition (FI) with a facilitator and a partially implemented condition (PI) without a facilitator were investigated using student achievement data, field notes,…
It's Not the Law--It's the Implementation That Matters.
ERIC Educational Resources Information Center
Vergon, Chuck; Broderick, Lauren
This short report--part of a collection of 54 papers from the 48th annual conference of the Education Law Association held in November 2002--discusses implementation of inclusion policy. Specifically, it reports on a study of administrative strategies and organizational processes that promote policy implementation and achievement of the objectives…
Leadership: The Key to Successful Implementation of Total Quality Management
1990-05-01
the implementation of the initiative called Total Quality Management as the philosophy and guiding principles to improve organizational efficiency...where and how to start. This paper presents the critical elements, their interrelationships, and how they can be used to achieve the cultural change necessary for successful implementation of Total Quality Management .
Service Learning for At-Risk Student Populations: The Contextual Dynamism of Implementation
ERIC Educational Resources Information Center
Akin, Jacob T.; Vesely, Randall S.
2016-01-01
The central purpose of this article is to explore research, issues, and perspectives on the implementation of service learning programs to improve student achievement in at-risk student populations. The implementation of service learning programs takes place within multiple contexts and across several terrains. The complexities of implementing…
ERIC Educational Resources Information Center
Tushnet, Naida C., Flaherty, John, Jr., Smith, And
2004-01-01
The Longitudinal Assessment of Comprehensive School Reform Implementation and Outcomes (LACIO) responds to the No Child Left Behind Act's requirement for an evaluation of the federal Comprehensive School Reform (CSR) program. The legislation stipulates two broad goals for the evaluation: (1) to evaluate the implementation and outcomes achieved by…
Implementing Nunavut Education Act: Compulsory School Attendance Policy
ERIC Educational Resources Information Center
Kwarteng, E. Fredua
2006-01-01
This paper discusses the implementation of Nunavut compulsory school attendance policy as part of the Nunavut Education Act (2002). Using a bottom-up approach to policy implementation in the literature and the author's six years teaching experience in Nunavut, the paper argues that the compulsory school attendance policy may not achieve its…
Teachers' Reflections on Cooperative Learning: Issues of Implementation
ERIC Educational Resources Information Center
Gillies, Robyn M.; Boyle, Michael
2010-01-01
Cooperative learning (CL) is a well documented pedagogical practice that promotes academic achievement and socialization, yet many teachers struggle with implementing it in their classes. This study reports on the perceptions of 10, middle-year teachers who implemented cooperative learning in a unit of work across two school terms. Data from the…
ERIC Educational Resources Information Center
Mincic, Melissa; Smith, Barbara J.; Strain, Phil
2009-01-01
Implementing the Pyramid Model with fidelity and achieving positive outcomes for children and their families requires that administrators understand their roles in the implementation process. Every administrative decision impacts program quality and sustainability. This Policy Brief underscores the importance of facilitative administrative…
ERIC Educational Resources Information Center
Wang, Margaret C., Ed.
The studies in this monograph are designed to examine the implementation processes of an innovative instructional program and the relationship between the implementation process and the achievement of certain program goals in school settings. The monograph is a contribution to the technical aspects of designing and implementing innovative…
A Framework for Identifying Implementation Issues Affecting Extension Human Sciences Programming
ERIC Educational Resources Information Center
Abell, Ellen; Cummings, Rebekah; Duke, Adrienne M.; Marshall, Jennifer Wells
2015-01-01
Extension programs based on identified needs, relevant theory, and solid research too often fail to realize their objectives. Program implementation is acknowledged to contribute to program effectiveness, yet systematic attention has not been paid to the array of implementation issues that can complicate achieving program goals. We developed the…
Implementing PlanCheyenne: Strategies and Opportunities for Smarter Growth in Cheyenne
This report is from a technical assistance project with Cheyenne, WY, to identify policy options that would implement PlanCheyenne and illustrate development that would help to achieve the community's goals.
Impacts and benefits of implementing BIM on bridge infrastructure projects.
DOT National Transportation Integrated Search
2014-10-01
To date, BIM (Building Information Modeling) is not widely utilized in infrastructure asset management. : Benefits achieved through implementation in vertical construction, however, suggest that BIM represents : significant opportunity for gains in p...
A hybrid method for evaluating enterprise architecture implementation.
Nikpay, Fatemeh; Ahmad, Rodina; Yin Kia, Chiam
2017-02-01
Enterprise Architecture (EA) implementation evaluation provides a set of methods and practices for evaluating the EA implementation artefacts within an EA implementation project. Existing EA evaluation models fall short in considering all EA functions and processes, using structured methods in developing EA implementation, employing mature practices, and using appropriate metrics to achieve proper evaluation. The aim of this research is to develop a hybrid evaluation method that supports achieving the objectives of EA implementation. To attain this aim, the first step is to identify EA implementation evaluation practices. To this end, a Systematic Literature Review (SLR) was conducted. Second, the proposed hybrid method was developed based on the foundation and information extracted from the SLR, semi-structured interviews with EA practitioners, program theory evaluation and Information Systems (IS) evaluation. Finally, the proposed method was validated by means of a case study and expert reviews. This research provides a suitable foundation for researchers who wish to extend and continue this research topic with further analysis and exploration, and for practitioners who would like to employ an effective and lightweight evaluation method for EA projects. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Vilcot, J.-P.; Ayachi, B.; Aviles, T.; Miska, P.
2017-11-01
In the first part of this paper, we show that a sputtering-based fabrication process with a low environmental footprint has been developed for the fabrication of copper indium gallium selenide (CIGS) absorbing material. Its originality lies in room-temperature sputtering of a single quaternary target in a pulsed direct-current mode, followed by a post-anneal. At no stage of the process is a selenium or sulfur atmosphere used; only inert gases are employed, argon for the deposition step and a forming gas for the annealing step. CIGS cells have been fabricated using such an absorbing layer and exhibit an efficiency close to 12%. A tandem cell approach, using a thin-film technology in conjunction with the well-established Si technology, is a promising route to cells with efficiencies of 30% and higher. Such cells, together with broader adoption of low-environmental-footprint technologies, are part of a vision for 2030. In the first section, the sputtering technique is shown to support an environmentally friendly process that is moreover compatible with co-integration with, for example, Si technology. In the second section, we present a prospective discussion of the materials that can provide a sustainable approach to such a tandem cell configuration.
Hoffman, Steven J; Røttingen, John-Arne
2012-01-01
The Member States of the World Health Organization (WHO) are currently debating the substance and form of an international agreement to improve the financing and coordination of research and development (R&D) for health products that meet the needs of developing countries. In addition to considering the content of any possible legal or political agreement, Member States may find it helpful to reflect on the full range of implementation mechanisms available to bring any agreement into effect. These include mechanisms for states to make commitments, administer activities, manage financial contributions, make subsequent decisions, monitor each other’s performance and promote compliance. States can make binding or non-binding commitments through conventions, contracts, declarations or institutional reforms. States can administer activities to implement their agreements through international organizations, sub-agencies, joint ventures or self-organizing processes. Finances can be managed through specialized multilateral funds, financial institutions, membership organizations or coordinated self-management. Decisions can be made through unanimity, consensus, equal voting, modified voting or delegation. Oversight can be provided by peer review, expert review, self-reports or civil society. Together, states should select their preferred options across categories of implementation mechanisms, each of which has advantages and disadvantages. The challenge lies in choosing the most effective combinations of mechanisms for supporting an international agreement (or set of agreements) that achieves collective aspirations in a way and at a cost that are both sustainable and acceptable to those involved. In making these decisions, WHO’s Member States can benefit from years of experience with these different mechanisms in health and its related sectors. PMID:23226898
Alimohammadi, Mahmood; Jafari-Mansoorian, Hossein; Hashemi, Seyed Yaser; Momenabadi, Victoria; Ghasemi, Seyed Mehdi; Karimyan, Kamaladdin
2017-07-01
Smoking is the largest preventable cause of death in the world, killing nearly 6 million people annually. This article investigates the laws implemented in Iran to control and reduce tobacco use, based on the monitor, protect, offer, warn, enforce and raise (MPOWER) policy. All laws approved by the Parliament, along with the instructions on tobacco control prepared by the Ministry of Health and Medical Education and the Ministry of Industry, Mine and Trade, were collected and studied. Moreover, the practical steps taken by the Ministry of Health and other organizations were examined in this regard. After adopting the Framework Convention on Tobacco Control (FCTC), the Iranian Parliament acted to create a comprehensive and systematic program of tobacco control legislation as a first step towards comprehensive national tobacco control. In this law and its implementing guidelines, and based on the MPOWER strategy, specific measures are implemented to monitor tobacco use and prevention policies, protect people from tobacco smoke, offer help to quit tobacco use, warn about the dangers of tobacco, enforce bans on tobacco advertising, promotion and sponsorship, and raise taxes on tobacco. However, the full objectives of the legislation have not yet been achieved. Given Iran's membership in the FCTC and its enactment of tobacco control laws and regulations, the necessary infrastructure is in place for a serious fight against tobacco use. In Iran, in comparison with developed countries, there remains a huge gap between ratifying laws and enforcing them.
Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.
Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert
2017-08-01
Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry (fresh produce) have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.
Routine Microsecond Molecular Dynamics Simulations with AMBER on GPUs. 1. Generalized Born
2012-01-01
We present an implementation of generalized Born implicit solvent all-atom classical molecular dynamics (MD) within the AMBER program package that runs entirely on CUDA enabled NVIDIA graphics processing units (GPUs). We discuss the algorithms that are used to exploit the processing power of the GPUs and show the performance that can be achieved in comparison to simulations on conventional CPU clusters. The implementation supports three different precision models in which the contributions to the forces are calculated in single precision floating point arithmetic but accumulated in double precision (SPDP), or everything is computed in single precision (SPSP) or double precision (DPDP). In addition to performance, we have focused on understanding the implications of the different precision models on the outcome of implicit solvent MD simulations. We show results for a range of tests including the accuracy of single point force evaluations and energy conservation as well as structural properties pertaining to protein dynamics. The numerical noise due to rounding errors within the SPSP precision model is sufficiently large to lead to an accumulation of errors which can result in unphysical trajectories for long time scale simulations. We recommend the use of the mixed-precision SPDP model since the numerical results obtained are comparable with those of the full double precision DPDP model and the reference double precision CPU implementation but at significantly reduced computational cost. Our implementation provides performance for GB simulations on a single desktop that is on par with, and in some cases exceeds, that of traditional supercomputers. PMID:22582031
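The trade-off between the precision models can be demonstrated in miniature: per-term contributions computed in single precision are accumulated either in a float32 accumulator (as in SPSP) or a float64 accumulator (as in SPDP), and the double accumulator tracks the full double-precision reference far more closely. A numpy sketch, not AMBER code:

```python
import numpy as np

def accumulate(contribs_sp, accum_dtype):
    """Sum single-precision per-term contributions into an accumulator of the
    given dtype, mimicking SPSP (float32 accumulator) vs SPDP (float64)."""
    total = accum_dtype(0.0)
    for c in contribs_sp:
        total = accum_dtype(total + accum_dtype(c))  # round at accumulator precision
    return float(total)

# 100,000 small force-like contributions, stored in single precision
rng = np.random.default_rng(0)
contribs = (rng.standard_normal(100_000) * 1e-3).astype(np.float32)

reference = float(np.sum(contribs.astype(np.float64)))   # DPDP-style reference
err_spsp = abs(accumulate(contribs, np.float32) - reference)
err_spdp = abs(accumulate(contribs, np.float64) - reference)
```

The float64 accumulator keeps the per-addition rounding error at double-precision level even though every input term was produced in single precision, which is the core of the SPDP recommendation above.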
Ngwenya, Solwayo
2017-07-06
Stillbirths are distressing to parents and healthcare workers. Globally, large numbers of babies are stillborn, and a number of strategies have been implemented to try to reduce stillbirths worldwide. The objective of this study was to assess the impact of leadership and accountability changes on reducing full-term intrapartum stillbirths. Leadership and accountability changes were implemented in January 2016. This retrospective cohort study was carried out to assess the impact of the changes on fresh full-term intrapartum stillbirths, covering the period from 6 months before the implementation date to 12 months after it. Fresh full-term stillbirths (>37 weeks gestation) occurring during the intrapartum stage of labour were analysed to see whether the numbers fell after the measures were put in place. There was a reduction in the number of fresh full-term intrapartum stillbirths after the introduction of the measures, with a statistically significant difference before and after implementation of the changes (50% vs 0%, P = 0.025). There was also a reduction in the time taken to perform an emergency caesarean section, from a mean of 30 min to 15 min by the end of the study, a 50% reduction. Clear and consistent clinical leadership and accountability can help in global attempts to reduce stillbirth figures, and simple measures can contribute to improving perinatal outcomes.
Transparent Seismic Mitigation for Community Resilience
NASA Astrophysics Data System (ADS)
Poland, C. D.; Pekelnicky, R.
2008-12-01
Healthy communities continuously grow by leveraging their intellectual capital to drive economic development while protecting their cultural heritage. Success, in part, depends on the support of a healthy built environment that is rooted in contemporary urban planning, sustainability and disaster resilience. Planners and policy makers are deeply concerned with all aspects of their communities, including their seismic safety. Their reluctance to implement the latest plans for achieving seismic safety is rooted in a misunderstanding of the hazard they face and the risk it poses to their built environment. Probabilistic lingo and public debate about how big the "big one" will be drive them to resort to their own experience and intuition. There is a fundamental lack of transparency related to what is expected to happen, and it is partially blocking the policy changes that are needed. The solution: craft the message in broad-based, usable terms that name the hazard, define performance, and establish a set of performance goals that represent the resiliency needed to drive a community's natural ability to rebound from a major seismic event. By using transparent goals and measures with an intuitive vocabulary for both performance and hazard, earthquake professionals, working with the San Francisco Urban Planning and Research Association (SPUR), have defined a level of resiliency that needs to be achieved by the City of San Francisco to ensure that its response to an event will be manageable and full recovery achievable within three years. Five performance measures for buildings and three for lifeline systems have been defined. Each declares whether people will be safe inside, whether the building can be repaired, and whether it will be usable during repairs. Lifeline systems are further defined in terms of the time intervals to restore 90%, 95%, and full service.
These transparent categories are used in conjunction with the expected earthquake level to describe the standards needed for new buildings and lifelines and the rehabilitation programs needed for existing buildings and systems. Earthquake professionals -- Emergency Response Planners, Earth Scientists, and Earthquake Engineers -- need to embrace this level of transparency and work with their communities to craft the policies needed to instill change and achieve disaster resilience.
Ghebrehewet, Sam; Thorrington, Dominic; Farmer, Siobhan; Kearney, James; Blissett, Deidre; McLeod, Hugh; Keenan, Alex
2016-04-04
Measles is a highly contagious vaccine-preventable infection that caused large outbreaks in England in 2012 and 2013 in areas which failed to achieve herd protection levels (95%) consistently. We sought to quantify the economic costs associated with the 2012-13 Merseyside measles outbreak, relative to the cost of extending preventative vaccination to secure herd protection. A costing model based on a critical literature review was developed. A workshop and interviews were held with key stakeholders in the Merseyside outbreak to understand the pathway of a measles case and then quantify healthcare activity and costs for the main NHS providers and public health team incurred during the initial four month period to May 2012. These data were used to model the total costs of the full outbreak to August 2013, comprising those to healthcare providers for patient treatment, public health and societal productivity losses. The modelled total cost of the full outbreak was compared to the cost of extending the preventative vaccination programme to achieve herd protection. The Merseyside outbreak included 2458 reported cases. The estimated cost of the outbreak was £4.4m (sensitivity analysis £3.9m to £5.2m) comprising 15% (£0.7m) NHS patient treatment costs, 40% (£1.8m) public health costs and 44% (£2.0m) for societal productivity losses. In comparison, over the previous five years in Cheshire and Merseyside a further 11,793 MMR vaccinations would have been needed to achieve herd protection at an estimated cost of £182,909 (4% of the total cost of the measles outbreak). Failure to consistently reach MMR uptake levels of 95% across all localities and sectors (achieve herd protection) risks comparatively higher economic costs associated with the containment (including healthcare costs) and implementation of effective public health management of outbreaks. Commissioned by the Cheshire and Merseyside Public Health England Centre. Crown Copyright © 2016.
Published by Elsevier Ltd. All rights reserved.
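The abstract's headline comparison can be checked with simple arithmetic. The figures below are taken from the text; the per-dose cost is a derived quantity, not one the authors report.

```python
# Cost figures from the Merseyside measles outbreak abstract (GBP).
outbreak_cost = 4_400_000     # £4.4m modelled total cost of the outbreak
nhs_treatment = 700_000       # £0.7m NHS patient treatment costs (15%)
public_health = 1_800_000     # £1.8m public health costs (40%)
productivity = 2_000_000      # £2.0m societal productivity losses (44%)
extra_doses = 11_793          # additional MMR vaccinations needed
vaccination_cost = 182_909    # estimated cost of those doses

share = vaccination_cost / outbreak_cost        # ~0.04, i.e. the reported 4%
cost_per_dose = vaccination_cost / extra_doses  # derived: roughly £15.5/dose
components_total = nhs_treatment + public_health + productivity
# components_total is £4.5m vs the £4.4m headline -- the small gap is
# consistent with each component being independently rounded.
```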
Hardware implementation of hierarchical volume subdivision-based elastic registration.
Dandekar, Omkar; Walimbe, Vivek; Shekhar, Raj
2006-01-01
Real-time, elastic and fully automated 3D image registration is critical to the efficiency and effectiveness of many image-guided diagnostic and treatment procedures relying on multimodality image fusion or serial image comparison. True real-time performance will make many 3D image registration-based techniques clinically viable. Hierarchical volume subdivision-based image registration techniques are inherently faster than most elastic registration techniques, e.g. free-form deformation (FFD)-based techniques, and are more amenable to achieving real-time performance through hardware acceleration. Our group has previously reported an FPGA-based architecture for accelerating FFD-based image registration. In this article we show how our existing architecture can be adapted to support hierarchical volume subdivision-based image registration. A proof-of-concept implementation of the architecture achieved a 100-fold speedup for elastic registration against an optimized software implementation on a 3.2 GHz Pentium III Xeon workstation. Due to the inherently parallel nature of hierarchical volume subdivision-based image registration techniques, further speedup can be achieved by using several computing modules in parallel.
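As rough intuition for subdivision-based registration, here is a hypothetical 1D analogue (an illustrative sketch, not the authors' 3D algorithm): the signal is split into blocks and a local rigid alignment (an integer shift) is estimated independently for each block by exhaustive SSD search. Real implementations work on 3D volumes, subdivide recursively, and interpolate sub-voxel displacements; the per-block independence is what makes the approach so parallelizable.

```python
import numpy as np

def local_shift(fixed, moving, max_shift=5):
    """Integer shift s minimizing SSD between fixed[i] and moving[i+s]."""
    best_s, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            d = fixed[:len(fixed) - s] - moving[s:]
        else:
            d = fixed[-s:] - moving[:len(moving) + s]
        err = np.mean(d * d)
        if err < best_err:
            best_s, best_err = s, err
    return best_s

def blockwise_shifts(fixed, moving, n_blocks=2, max_shift=5):
    # One subdivision level: split into blocks, register each locally.
    edges = np.linspace(0, len(fixed), n_blocks + 1, dtype=int)
    return [local_shift(fixed[a:b], moving[a:b], max_shift)
            for a, b in zip(edges[:-1], edges[1:])]

# Synthetic data: a random walk deformed by block-dependent shifts.
rng = np.random.default_rng(1)
fixed = np.cumsum(rng.standard_normal(400))
pad = 5
src = np.pad(fixed, pad, mode="edge")
moving = np.empty_like(fixed)
true_shifts = [2, -3]
moving[:200] = src[np.arange(200) + pad - true_shifts[0]]
moving[200:] = src[np.arange(200, 400) + pad - true_shifts[1]]
# blockwise_shifts(fixed, moving, n_blocks=2) recovers [2, -3].
```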
Cost-Effectiveness of Comprehensive School Reform in Low Achieving Schools
ERIC Educational Resources Information Center
Ross, John A.; Scott, Garth; Sibbald, Tim M.
2012-01-01
We evaluated the cost-effectiveness of Struggling Schools, a user-generated approach to Comprehensive School Reform implemented in 100 low achieving schools serving disadvantaged students in a Canadian province. The results show that while Struggling Schools had a statistically significant positive effect on Grade 3 Reading achievement, d = 0.48…
ERIC Educational Resources Information Center
Clark, Rebecca
2013-01-01
This study approaches the problem of African American mathematics achievement from a strength-based perspective, identifying practices implemented by middle school principals successful in increasing and sustaining the mathematics achievement of African American students. The study was designed to answer questions regarding both school-wide…
Closing Achievement Gaps and Beyond: Teachers' Reactions to the Remedial Education Policy in Taiwan
ERIC Educational Resources Information Center
Chen, Hsiao-Lan Sharon; Yu, Patricia
2016-01-01
Educators have increasingly implemented remedial education in elementary and secondary schools throughout Taiwan as a systemic approach toward closing achievement gaps. However, students from lower socioeconomic backgrounds and those in remote areas have shown little improvement in academic achievement. This issue raises the question of how…
McNair, H A; Hafeez, S; Taylor, H; Lalondrelle, S; McDonald, F; Hansen, V N; Huddart, R
2015-04-01
The implementation of plan-of-the-day selection for patients receiving radiotherapy (RT) for bladder cancer requires efficient and confident decision-making. This article describes the development of a training programme and the maintenance of competency. Cone beam CT (CBCT) images acquired on patients receiving RT for bladder cancer were assessed to establish baseline competency and training needs. A training programme was implemented, and observers were asked to select planning target volumes (PTVs) on two groups of 20 patients' images. After clinical implementation, the PTVs chosen were reviewed offline, and an audit was performed after 3 years. A mean of 73% (range, 53-93%) concordance rate was achieved prior to training. Subsequent to training, the mean score decreased to 66% (Round 1), then increased to 76% (Round 2). Six radiographers and two clinicians successfully completed the training programme. An independent observer reviewed the images offline after clinical implementation, and a 91% (126/139) concordance rate was achieved. During the audit, 125 CBCT images from 13 patients were reviewed by a single observer, and concordance was 92%. Radiographer-led selection of the plan of the day was implemented successfully with the use of a training programme and continual assessment. Quality has been maintained over a period of 3 years. The training programme was successful in achieving and maintaining competency for a plan-of-the-day technique.
GPU-based Branchless Distance-Driven Projection and Backprojection
Liu, Rui; Fu, Lin; De Man, Bruno; Yu, Hengyong
2017-01-01
Projection and backprojection operations are essential in a variety of image reconstruction and physical correction algorithms in CT. The distance-driven (DD) projection and backprojection are widely used for their highly sequential memory access pattern and low arithmetic cost. However, a typical DD implementation has an inner loop that adjusts the calculation depending on the relative position between voxel and detector cell boundaries. The irregularity of the branch behavior makes it inefficient to be implemented on massively parallel computing devices such as graphics processing units (GPUs). Such irregular branch behaviors can be eliminated by factorizing the DD operation as three branchless steps: integration, linear interpolation, and differentiation, all of which are highly amenable to massive vectorization. In this paper, we implement and evaluate a highly parallel branchless DD algorithm for 3D cone beam CT. The algorithm utilizes the texture memory and hardware interpolation on GPUs to achieve fast computational speed. The developed branchless DD algorithm achieved 137-fold speedup for forward projection and 188-fold speedup for backprojection relative to a single-thread CPU implementation. Compared with a state-of-the-art 32-thread CPU implementation, the proposed branchless DD achieved 8-fold acceleration for forward projection and 10-fold acceleration for backprojection. The GPU-based branchless DD method was evaluated with iterative reconstruction algorithms on both simulated and real datasets. It produced images visually identical to those of the CPU reference algorithm. PMID:29333480
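The integrate / interpolate / differentiate factorization at the heart of branchless DD can be sketched in 1D as a rebinning problem (an illustrative analogue, not the paper's 3D cone-beam code): the running integral of the source bins is a piecewise-linear function, so sampling it at destination-bin edges and differencing yields exact overlap-weighted totals with no per-boundary branching.

```python
import numpy as np

def branchless_rebin(values, src_edges, dst_edges):
    # Step 1: integrate -- running integral of the signal at source edges.
    integral = np.concatenate(([0.0], np.cumsum(values)))
    # Step 2: linearly interpolate the integral at destination edges
    # (on GPUs this step maps onto hardware texture interpolation).
    at_dst = np.interp(dst_edges, src_edges, integral)
    # Step 3: differentiate to recover per-destination-bin totals.
    return np.diff(at_dst)

# Example: rebin 4 unit-width bins onto 3 bins of width 4/3.
src_edges = np.arange(5.0)               # [0, 1, 2, 3, 4]
dst_edges = np.linspace(0.0, 4.0, 4)     # [0, 4/3, 8/3, 4]
values = np.array([1.0, 2.0, 3.0, 4.0])
out = branchless_rebin(values, src_edges, dst_edges)
# Total "mass" is conserved: out.sum() equals values.sum().
```

Note that the branchy version of the same computation would loop over boundary pairs and test which edge comes next; here every step is a dense vector operation.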
GPU-based Branchless Distance-Driven Projection and Backprojection.
Liu, Rui; Fu, Lin; De Man, Bruno; Yu, Hengyong
2017-12-01
Projection and backprojection operations are essential in a variety of image reconstruction and physical correction algorithms in CT. The distance-driven (DD) projection and backprojection are widely used for their highly sequential memory access pattern and low arithmetic cost. However, a typical DD implementation has an inner loop that adjusts the calculation depending on the relative position between voxel and detector cell boundaries. The irregularity of the branch behavior makes it inefficient to be implemented on massively parallel computing devices such as graphics processing units (GPUs). Such irregular branch behaviors can be eliminated by factorizing the DD operation as three branchless steps: integration, linear interpolation, and differentiation, all of which are highly amenable to massive vectorization. In this paper, we implement and evaluate a highly parallel branchless DD algorithm for 3D cone beam CT. The algorithm utilizes the texture memory and hardware interpolation on GPUs to achieve fast computational speed. The developed branchless DD algorithm achieved 137-fold speedup for forward projection and 188-fold speedup for backprojection relative to a single-thread CPU implementation. Compared with a state-of-the-art 32-thread CPU implementation, the proposed branchless DD achieved 8-fold acceleration for forward projection and 10-fold acceleration for backprojection. The GPU-based branchless DD method was evaluated with iterative reconstruction algorithms on both simulated and real datasets. It produced images visually identical to those of the CPU reference algorithm.
Janssen, Ellen M; Jerome, Gerald J; Dalcin, Arlene T; Gennusa, Joseph V; Goldsholl, Stacy; Frick, Kevin D; Wang, Nae-Yuh; Appel, Lawrence J; Daumit, Gail L
2017-06-01
In the ACHIEVE randomized controlled trial, an 18-month behavioral intervention accomplished weight loss in persons with serious mental illness who attended community psychiatric rehabilitation programs. This analysis estimates costs for delivering the intervention during the study. It also estimates expected costs to implement the intervention more widely in a range of community mental health programs. Using empirical data, costs were calculated from the perspective of a community psychiatric rehabilitation program delivering the intervention. Personnel and travel costs were calculated using time sheet data. Rent and supply costs were calculated using rent per square foot and intervention records. A univariate sensitivity analysis and an expert-informed sensitivity analysis were conducted. With 144 participants receiving the intervention and a mean weight loss of 3.4 kg, costs of $95 per participant per month and $501 per kilogram lost in the trial were calculated. In univariate sensitivity analysis, costs ranged from $402 to $725 per kilogram lost. Through expert-informed sensitivity analysis, it was estimated that rehabilitation programs could implement the intervention for $68 to $85 per client per month. Costs of implementing the ACHIEVE intervention were in the range of other intensive behavioral weight loss interventions. Wider implementation of efficacious lifestyle interventions in community mental health settings will require adequate funding mechanisms. © 2017 The Obesity Society.
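The reported cost figures can be cross-checked against one another with simple arithmetic; since both published rates are rounded, the implied cost per kilogram is an approximation, not an exact reproduction of the authors' calculation.

```python
# Figures reported in the ACHIEVE cost abstract.
participants = 144
months = 18
cost_per_participant_month = 95   # dollars per participant per month
mean_kg_lost = 3.4                # mean weight loss per participant
reported_cost_per_kg = 501        # dollars per kilogram lost

total_cost = cost_per_participant_month * participants * months  # $246,240
total_kg = participants * mean_kg_lost                           # 489.6 kg
implied_cost_per_kg = total_cost / total_kg                      # ~ $503/kg
# The implied figure agrees with the reported $501/kg to within the
# rounding of the published per-participant-month cost.
```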
Total Quality Management (TQM). Implementers Workshop
1990-05-15
DEPARTMENT OF DEFENSE, May 15, 1990. TOTAL QUALITY MANAGEMENT (TQM) Implementers Workshop. © Copyright 1990 Booz Allen ... must be continually performed in order to achieve successful TQM implementation. ... TOTAL QUALITY MANAGEMENT Implementers Workshop Course Content ... For more information, please refer to the student manual, Total Quality Management (TQM) Awareness Seminar, that was provided for the Awareness Course. You may
The Implementation of C-ID, R2D2 Model on Learning Reading Comprehension
ERIC Educational Resources Information Center
Rayanto, Yudi Hari; Rusmawan, Putu Ngurah
2016-01-01
The purposes of this research are to find out: (1) whether the C-ID, R2D2 model is effective to be implemented on learning Reading comprehension, (2) college students' activity during the implementation of the C-ID, R2D2 model on learning Reading comprehension, and (3) college students' learning achievement during the implementation of C-ID, R2D2 model on…
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Yan, Jerry
1999-01-01
We present an HPF (High Performance Fortran) implementation of the ARC3D code along with profiling and performance data on the SGI Origin 2000. Advantages and limitations of HPF as a parallel programming language for CFD applications are discussed. To achieve good performance, we used data distributions optimized for the implementation of the implicit and explicit operators of the solver and of the boundary conditions. We compare the results with MPI and directive-based implementations.
A tobacco-free world: a call to action to phase out the sale of tobacco products by 2040.
Beaglehole, Robert; Bonita, Ruth; Yach, Derek; Mackay, Judith; Reddy, K Srinath
2015-03-14
The time has come for the world to acknowledge the unacceptability of the damage being done by the tobacco industry and work towards a world essentially free from the sale (legal and illegal) of tobacco products. A tobacco-free world by 2040, where less than 5% of the world's adult population use tobacco, is socially desirable, technically feasible, and could become politically practical. Three possible ways forward exist: so-called business-as-usual, with most countries steadily implementing the WHO Framework Convention on Tobacco Control (FCTC) provisions; accelerated implementation of the FCTC by all countries; and a so-called turbo-charged approach that complements FCTC actions with strengthened UN leadership, full engagement of all sectors, and increased investment in tobacco control. Only the turbo-charged approach will achieve a tobacco-free world by 2040 where tobacco is out of sight, out of mind, and out of fashion--yet not prohibited. The first and most urgent priority is the inclusion of an ambitious tobacco target in the post-2015 sustainable development health goal. The second priority is accelerated implementation of the FCTC policies in all countries, with full engagement from all sectors including the private sector--from workplaces to pharmacies--and with increased national and global investment. The third priority is an amendment of the FCTC to include an ambitious global tobacco reduction goal. The fourth priority is a UN high-level meeting on tobacco use to galvanise global action towards the 2040 tobacco-free world goal on the basis of new strategies, new resources, and new players. Decisive and strategic action on this bold vision will prevent hundreds of millions of unnecessary deaths during the remainder of this century and safeguard future generations from the ravages of tobacco use. Copyright © 2015 Elsevier Ltd. All rights reserved.
Circuit Design Approaches for Implementation of a Subtrellis IC for a Reed-Muller Subcode
NASA Technical Reports Server (NTRS)
Lin, Shu; Uehara, Gregory T.; Nakamura, Eric B.; Chu, Cecilia W. P.
1996-01-01
In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth efficient coded modulation system to achieve reliable bandwidth efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 Mbits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections.
First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating and present some of the key architectural approaches used to implement the system at high speed. Second, we will describe details of the 8-section trellis diagram we found to best meet the trade-offs between chip and overall system complexity. The chosen approach implements the trellis for the (64, 40, 8) RM subcode with 32 independent sub-trellises. And third, we will describe results of our feasibility study on the implementation of such an IC chip in CMOS technology to implement one of these subtrellises.
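Trellis-based Viterbi decoding of the kind this report targets can be sketched on a much smaller code. The following is a minimal hard-decision Viterbi decoder for a textbook 4-state, rate-1/2 convolutional code (generators 7 and 5 octal); it illustrates only the add-compare-select and survivor-path machinery, not the (64, 40, 8) RM subcode itself, whose 8-section trellis is vastly larger and is decoded with parallel sub-trellis hardware rather than a software loop.

```python
import itertools

def encode(bits):
    """Rate-1/2 convolutional encoder, generators 111 and 101 (7, 5 octal)."""
    s1 = s2 = 0                        # s1 = previous bit, s2 = bit before that
    out = []
    for u in bits + [0, 0]:            # two flush bits return encoder to state 0
        out += [u ^ s1 ^ s2, u ^ s2]
        s1, s2 = u, s1
    return out

def viterbi(received, n_bits):
    """Hard-decision Viterbi decoding over the 4-state trellis."""
    n_states = 4                       # state packs (s1, s2) as 2*s1 + s2
    INF = float("inf")
    metric = [0.0] + [INF] * (n_states - 1)   # start in the all-zero state
    paths = [[] for _ in range(n_states)]
    for t in range(n_bits + 2):        # +2 trellis sections for the flush bits
        r = received[2 * t: 2 * t + 2]
        new_metric = [INF] * n_states
        new_paths = [None] * n_states
        for s, u in itertools.product(range(n_states), (0, 1)):
            if metric[s] == INF:       # state not yet reachable
                continue
            s1, s2 = s >> 1, s & 1
            expected = [u ^ s1 ^ s2, u ^ s2]
            branch = (expected[0] != r[0]) + (expected[1] != r[1])  # Hamming
            nxt = (u << 1) | s1
            m = metric[s] + branch
            if m < new_metric[nxt]:    # add-compare-select
                new_metric[nxt] = m
                new_paths[nxt] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    return paths[0][:n_bits]           # survivor ending in the all-zero state

msg = [1, 0, 1, 1, 0, 0, 1, 0]
code = encode(msg)
code[5] ^= 1                           # inject a single channel bit error
decoded = viterbi(code, len(msg))      # recovers msg (free distance 5)
```

Because this code's free distance is 5, any single channel bit error is always corrected, which the example above exercises.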
Harnessing Implementation Science to Increase the Impact of Health Equity Research.
Chinman, Matthew; Woodward, Eva N; Curran, Geoffrey M; Hausmann, Leslie R M
2017-09-01
Health disparities are differences in health or health care between groups based on social, economic, and/or environmental disadvantage. Disparity research often follows 3 steps: detecting (phase 1), understanding (phase 2), and reducing (phase 3) disparities. Although disparities have narrowed over time, many remain. We argue that implementation science could enhance disparities research by broadening the scope of phase 2 studies and offering rigorous methods to test disparity-reducing implementation strategies in phase 3 studies. We briefly review the focus of phase 2 and phase 3 disparities research. We then provide a decision tree and case examples to illustrate how implementation science frameworks and research designs could further enhance disparity research. Most health disparities research emphasizes patient and provider factors as predominant mechanisms underlying disparities. Applying implementation science frameworks like the Consolidated Framework for Implementation Research could help disparities research widen its scope in phase 2 studies and, in turn, develop broader disparities-reducing implementation strategies in phase 3 studies. Many phase 3 studies of disparity-reducing implementation strategies are similar to case studies, whose designs are not able to fully test causality. Implementation science research designs offer rigorous methods that could accelerate the pace at which equity is achieved in real-world practice. Disparities can be considered a "special case" of implementation challenges--when evidence-based clinical interventions are delivered to, and received by, vulnerable populations at lower rates. Bringing together health disparities research and implementation science could advance equity more than either could achieve on its own.
Conrad, Douglas; Fishman, Paul; Grembowski, David; Ralston, James; Reid, Robert; Martin, Diane; Larson, Eric; Anderson, Melissa
2008-01-01
Objective To estimate the joint effect of a multifaceted access intervention on primary care physician (PCP) productivity in a large, integrated prepaid group practice. Data Sources Administrative records of physician characteristics, compensation and full-time equivalent (FTE) data, linked to enrollee utilization and cost information. Study Design Dependent measures per quarter per FTE were office visits, work relative value units (WRVUs), WRVUs per visit, panel size, and total cost per member per quarter (PMPQ), for PCPs employed >0.25 FTE. General estimating equation regression models included provider and enrollee characteristics. Principal Findings Panel size and RVUs per visit rose, while visits per FTE and PMPQ cost declined significantly between baseline and full implementation. Panel size rose and visits per FTE declined from baseline through rollout and full implementation. RVUs per visit and RVUs per FTE first declined, and then increased, for a significant net increase in RVUs per visit and an insignificant rise in RVUs per FTE between baseline and full implementation. PMPQ cost rose between baseline and rollout and then declined, for a significant overall decline between baseline and full implementation. Conclusions This organization-wide access intervention was associated with improvements in several dimensions of PCP productivity and gains in clinical efficiency. PMID:18662171
Achieving Technological Literacy in Minnesota.
ERIC Educational Resources Information Center
Lindstrom, Mike
2002-01-01
Describes how Minnesota implemented the Standards for Technological Literacy: Content for the Study of Technology. Includes the timeline, rationale, potential activities and estimated costs associated with all phases, and steps for implementing the plan: investigate, replicate, integrate, and mandate. (JOW)
Promotion of women physicians in academic medicine. Glass ceiling or sticky floor?
Tesch, B J; Wood, H M; Helwig, A L; Nattinger, A B
1995-04-05
To assess possible explanations for the finding that the percentage of women medical school faculty members holding associate or full professor rank remains well below the percentage of men. Cross-sectional survey of physician faculty of US medical schools using the Association of American Medical Colleges (AAMC) database. Surveyed were 153 women and 263 men first appointed between 1979 and 1981, matched for institutions of original faculty appointment. Academic rank achieved, career preparation, academic resources at first appointment, familial responsibilities, and academic productivity. After a mean of 11 years on a medical school faculty, 59% of women compared with 83% of men had achieved associate or full professor rank, and 5% of women compared with 23% of men had achieved full professor rank. Women and men reported similar preparation for an academic career, but women began their careers with fewer academic resources. The number of children was not associated with rank achieved. Women worked about 10% fewer hours per week and had authored fewer publications. After adjustment for productivity factors, women remained less likely to be associate or full professors (adjusted odds ratio [OR] = 0.37; 95% confidence interval [CI], 0.21 to 0.66) or to achieve full professor rank (adjusted OR = 0.27; 95% CI, 0.12 to 0.63). Based on the AAMC database, 50% of both women and men originally appointed as faculty members between 1979 and 1981 had left academic medicine by 1991. Women physician medical school faculty are promoted more slowly than men. Gender differences in rank achieved are not explained by productivity or by differential attrition from academic medicine.
NASA Astrophysics Data System (ADS)
Hyer, E. J.; Schmidt, C. C.; Hoffman, J.; Giglio, L.; Peterson, D. A.
2013-12-01
Polar and geostationary satellites are used operationally for fire detection and smoke source estimation by many near-real-time users, including operational forecast centers around the globe. The input satellite radiance data are processed by data providers to produce Level-2 and Level-3 fire detection products, but processing these data into spatially and temporally consistent estimates of fire activity requires a substantial amount of additional processing. The most significant processing steps are correction for variable coverage of the satellite observations, and correction for conditions that affect the detection efficiency of the satellite sensors. We describe a system developed by the Naval Research Laboratory (NRL) that uses the full raster information from the entire constellation to diagnose detection opportunities, calculate corrections for factors such as angular dependence of detection efficiency, and generate global estimates of fire activity at spatial and temporal scales suitable for atmospheric modeling. By incorporating these improved fire observations, smoke emissions products, such as NRL's FLAMBE, are able to produce improved estimates of global emissions. This talk provides an overview of the system, demonstrates the achievable improvement over older methods, and describes challenges for near-real-time implementation.
Design and Implementation of an Intrinsically Safe Liquid-Level Sensor Using Coaxial Cable
Jin, Baoquan; Liu, Xin; Bai, Qing; Wang, Dong; Wang, Yu
2015-01-01
Real-time detection of liquid level in complex environments has always been a knotty issue. In this paper, an intrinsically safe liquid-level sensor system for flammable and explosive environments is designed and implemented. The poly vinyl chloride (PVC) coaxial cable is chosen as the sensing element and the measuring mechanism is analyzed. Then, the capacitance-to-voltage conversion circuit is designed and the expected output signal is achieved by adopting parameter optimization. Furthermore, the experimental platform of the liquid-level sensor system is constructed, which involves the entire process of measuring, converting, filtering, processing, visualizing and communicating. Additionally, the system is designed with characteristics of intrinsic safety by limiting the energy of the circuit to avoid or restrain the thermal effects and sparks. Finally, the approach of the piecewise linearization is adopted in order to improve the measuring accuracy by matching the appropriate calibration points. The test results demonstrate that over the measurement range of 1.0 m, the maximum nonlinearity error is 0.8% full-scale span (FSS), the maximum repeatability error is 0.5% FSS, and the maximum hysteresis error is reduced from 0.7% FSS to 0.5% FSS by applying software compensation algorithms. PMID:26029949
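The piecewise-linearization step can be sketched as follows. The quadratic sensor model below is hypothetical (the paper's PVC coaxial-cable capacitance curve is not given in the abstract); the point is that inverting the characteristic through a table of calibration points with per-segment linear interpolation beats a single global straight-line fit whenever the curve is even mildly nonlinear.

```python
import numpy as np

def sensor_voltage(level_m):
    """Hypothetical mildly nonlinear sensor characteristic (volts)."""
    return 2.0 * level_m + 0.15 * level_m**2

# Calibration table: voltage measured at known liquid levels (0..1 m).
cal_levels = np.linspace(0.0, 1.0, 11)
cal_volts = sensor_voltage(cal_levels)

# Naive alternative: one global linear fit of level against voltage.
slope, intercept = np.polyfit(cal_volts, cal_levels, 1)

# Evaluate both inversions on a dense sweep of true levels.
true_levels = np.linspace(0.0, 1.0, 501)
volts = sensor_voltage(true_levels)
est_linear = slope * volts + intercept
est_piecewise = np.interp(volts, cal_volts, cal_levels)  # per-segment linear

err_linear = np.max(np.abs(est_linear - true_levels))
err_piecewise = np.max(np.abs(est_piecewise - true_levels))
# Piecewise linearization between calibration points cuts the worst-case
# error dramatically relative to the single global line.
```

This mirrors the abstract's approach of "matching the appropriate calibration points": accuracy improves simply by adding calibration points where the characteristic curves most.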
Positioning performance of a maglev fine positioning system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wronosky, J.B.; Smith, T.G.; Jordan, J.D.
1996-12-01
A wafer positioning system was recently developed by Sandia National Laboratories for an Extreme Ultraviolet Lithography (EUVL) research tool. The system, which utilizes a magnetically levitated fine stage to provide ultra-precise positioning in all six degrees of freedom, incorporates technological improvements resulting from four years of prototype development experience. System enhancements, implemented on a second generation design for an ARPA National Center for Advanced Information Component Manufacturing (NCAICM) project, introduced active structural control for the levitated structure of the system. Magnetic levitation (maglev) is emerging as an important technology for wafer positioning systems in advanced lithography applications. The advantages of maglev stem from the absence of physical contact. The resulting lack of friction enables accurate, fast positioning. Maglev systems are mechanically simple, accomplishing full six degree-of-freedom suspension and control with a minimum of moving parts. Power-efficient designs, which reduce the possibility of thermal distortion of the platen, are achievable. Manufacturing throughput will be improved in future systems with the addition of active structural control of the positioning stages. This paper describes the design, implementation, and functional capability of the maglev fine positioning system. Specifics regarding performance design goals and test results are presented.
The design and implementation of multi-source application middleware based on service bus
NASA Astrophysics Data System (ADS)
Li, Yichun; Jiang, Ningkang
2017-06-01
With the rapid development of the Internet of Things (IoT), real-time monitoring data are growing in both variety and volume. To take full advantage of these data, we designed and implemented an application middleware that not only supports the three-layer architecture of IoT information systems but also enables flexible configuration of access to multiple resources and other add-on modules. The middleware platform is lightweight, secure, aspect-oriented (AoP), distributed, and real-time, allowing application developers to construct information processing systems for related domains in a short period. Its functions include, but are not limited to, pre-processing of data formats, definition of data entities, invocation and handling of distributed services, and massive data processing. Experimental results show that the middleware outperforms some message-queue-based architectures, and its throughput scales as the number of distributed nodes increases while the code remains simple. The middleware is currently deployed in the system of the Shanghai Pudong environmental protection agency, where it has achieved considerable success.
Pani, Danilo; Barabino, Gianluca; Citi, Luca; Meloni, Paolo; Raspopovic, Stanisa; Micera, Silvestro; Raffo, Luigi
2016-09-01
The control of upper limb neuroprostheses through the peripheral nervous system (PNS) can allow restoring motor functions in amputees. At present, the important aspect of the real-time implementation of neural decoding algorithms on embedded systems has often been overlooked, notwithstanding the impact that limited hardware resources have on the efficiency/effectiveness of any given algorithm. The present study addresses the optimization of a template-matching-based algorithm for PNS signal decoding, a milestone toward its full real-time implementation on a floating-point digital signal processor (DSP). The proposed optimized real-time algorithm achieves up to 96% correct classification on real PNS signals acquired through LIFE electrodes in animals, and can correctly sort spikes of a synthetic cortical dataset with sufficiently uncorrelated spike morphologies (93% average correct classification), comparable to the results obtained with a top spike sorter (94% on average on the same dataset). The power consumption enables more than 24 h of processing at maximum load, and a latency model has been derived to enable a fair performance assessment. The final embodiment demonstrates real-time performance on a low-power off-the-shelf DSP, opening the way to experiments exploiting efferent signals to control a motor neuroprosthesis.
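The core of template matching can be illustrated with a minimal sketch (not the authors' optimized DSP implementation): each detected spike is assigned to the stored template at minimum squared distance. The spike shapes below are invented toy data.

```python
import numpy as np

def classify_spike(spike, templates):
    """Assign a spike waveform to the nearest template (least squared error)."""
    distances = [float(np.sum((spike - t) ** 2)) for t in templates]
    return int(np.argmin(distances))

# two invented spike templates and a noisy observation of the first one
t0 = np.array([0.0, 1.0, 0.2, -0.5, 0.0])
t1 = np.array([0.0, -1.0, -0.2, 0.5, 0.0])
noisy = t0 + 0.05 * np.array([1.0, -1.0, 1.0, -1.0, 1.0])
print(classify_spike(noisy, [t0, t1]))  # 0
```

On a fixed-point or floating-point DSP the same comparison is typically implemented with accumulate-and-compare loops, which is where the optimization effort described in the abstract lies.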
Urbain, V; Wright, P; Thomas, M
2001-01-01
Stringent effluent quality guidelines are progressively being implemented in coastal and sensitive areas in Australia. Biological Nutrient Removal (BNR) plants are becoming a standard, often including a tertiary treatment for disinfection. The BNR plant in Noosa, Queensland, is designed to produce a treated effluent with less than 5 mg/l of BOD5, 5 mg/l of total nitrogen, 1 mg/l of total phosphorus, 5 mg/l of suspended solids and total coliforms of less than 10/100 ml. A flexible multi-stage biological process with a prefermentation stage, followed by sand filtration and UV disinfection, was implemented to achieve this level of treatment. Acetic acid is added for phosphorus removal because: i) the volatile fatty acid (VFA) concentration in raw wastewater varies widely, and ii) the prefermenter had to be turned off due to odor problems in the primary sedimentation tanks. An endogenous anoxic zone was added to the process to further reduce the nitrate concentration. This resulted in some secondary P-release events, a situation that arises when low nitrate and low phosphorus objectives are targeted. Long-term performance data and specific results on nitrogen removal and disinfection are presented in this paper.
Discrete event performance prediction of speculatively parallel temperature-accelerated dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zamora, Richard James; Voter, Arthur F.; Perez, Danny
2016-12-01
Due to its unrivaled ability to predict the dynamical evolution of interacting atoms, molecular dynamics (MD) is a widely used computational method in theoretical chemistry, physics, biology, and engineering. Despite its success, MD is only capable of modeling time scales within several orders of magnitude of thermal vibrations, leaving out many important phenomena that occur at slower rates. The Temperature Accelerated Dynamics (TAD) method overcomes this limitation by thermally accelerating the state-to-state evolution captured by MD. Due to the algorithmically complex nature of the serial TAD procedure, implementations have yet to improve performance by parallelizing the concurrent exploration of multiple states. Here we utilize a discrete event-based application simulator to introduce and explore a new Speculatively Parallel TAD (SpecTAD) method. We investigate the SpecTAD algorithm, without a full-scale implementation, by constructing an application simulator proxy (SpecTADSim). Finally, following this method, we discover that a nontrivial relationship exists between the optimal SpecTAD parameter set and the number of CPU cores available at run-time. Furthermore, we find that a majority of the available SpecTAD boost can be achieved within an existing TAD application using relatively simple algorithm modifications.
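The discrete event simulation approach behind a proxy like SpecTADSim can be illustrated with a generic event-queue skeleton; the `EventSimulator` class and the toy "exploration" events below are hypothetical and unrelated to the actual TAD workload.

```python
import heapq

class EventSimulator:
    """Minimal discrete event loop: pop the earliest event, run its action."""

    def __init__(self):
        self.now = 0.0
        self._queue = []  # entries: (time, sequence number, action)
        self._seq = 0     # tie-breaker so equal-time events stay ordered

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action(self)

# toy model: a worker finishes one "state exploration", then starts the next
log = []

def explore(sim, state=[0]):
    log.append((sim.now, state[0]))
    state[0] += 1
    if state[0] < 3:
        sim.schedule(2.0, explore)

sim = EventSimulator()
sim.schedule(1.0, explore)
sim.run()
print(log)  # [(1.0, 0), (3.0, 1), (5.0, 2)]
```

A performance-prediction study replaces the toy actions with timed models of pattern exploration and speculation, then sweeps parameters such as core count over the same loop.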
Automated identification of diagnosis and co-morbidity in clinical records.
Cano, C; Blanco, A; Peshkin, L
2009-01-01
Automated understanding of clinical records is a challenging task involving various legal and technical difficulties. Clinical free text is inherently redundant, unstructured, and full of acronyms, abbreviations and domain-specific language, which make it challenging to mine automatically. Much effort in the field is focused on creating specialized ontologies, lexicons and heuristics based on expert knowledge of the domain. However, ad-hoc solutions generalize poorly across diseases or diagnoses. This paper presents a successful approach for rapid prototyping of a diagnosis classifier based on a popular computational linguistics platform. The corpus consists of several hundred full-length discharge summaries provided by Partners Healthcare. The goal is to identify a diagnosis and assign co-morbidity. Our approach is based on the rapid implementation of a logistic regression classifier using an existing toolkit: LingPipe (http://alias-i.com/lingpipe). We implement and compare three different classifiers. The baseline approach uses character 5-grams as features. The second approach uses a bag-of-words representation enriched with a small additional set of features. The third approach reduces the feature set to the most informative features according to information content. The proposed systems achieve high performance (average F-micro 0.92) for the task. We discuss the relative merits of the three classifiers. Supplementary material with detailed results is available at: http://decsai.ugr.es/~ccano/LR/supplementary_material/. We show that our methodology for rapid prototyping of a domain-unaware system is effective for building an accurate classifier for clinical records.
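The two simpler feature representations mentioned above, character 5-grams and bag-of-words, can be sketched as follows. This is an illustrative stand-in, not LingPipe's implementation, and the sample summary text is invented.

```python
from collections import Counter

def char_ngrams(text, n=5):
    """Overlapping character n-grams: the baseline feature set."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def bag_of_words(text):
    """Whitespace-tokenized, lowercased bag-of-words features."""
    return Counter(text.lower().split())

summary = "discharge diagnosis: asthma with obesity"
print(bag_of_words(summary)["asthma"])  # 1
print(len(char_ngrams("abcdef")))       # 2
```

Either counter can be fed to a logistic regression classifier as a sparse feature vector; the third system in the paper then prunes these features by information content.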
Poddar, Raju; Cortés, Dennis E.; Werner, John S.; Mannis, Mark J.
2013-01-01
A high-speed (100 kHz A-scan rate) complex conjugate resolved 1 μm swept source optical coherence tomography (SS-OCT) system using coherence revival of the light source is suitable for dense three-dimensional (3-D) imaging of the anterior segment. The short acquisition time helps to minimize the influence of motion artifacts. The extended depth range of the SS-OCT system allows topographic analysis of clinically relevant images of the entire depth of the anterior segment of the eye. Patients with the type 1 Boston Keratoprosthesis (KPro) require evaluation of the full anterior segment depth. Current commercially available OCT systems are not suitable for this application due to limited acquisition speed, resolution, and axial imaging range. Moreover, most commonly used research grade and some clinical OCT systems implement a commercially available SS (Axsun) that offers only a 3.7 mm imaging range (in air) in its standard configuration. We describe implementation of a common swept laser with built-in k-clock to allow phase-stable imaging in both low range and high range, 3.7 and 11.5 mm in air, respectively, without the need to build an external MZI k-clock. As a result, 3-D morphology of the KPro position with respect to the surrounding tissue could be investigated in vivo both at high resolution and with large depth range to achieve noninvasive and precise evaluation of the success of the surgical procedure. PMID:23912759
NASA Astrophysics Data System (ADS)
Wu, L.; San Segundo Bello, D.; Coppejans, P.; Craninckx, J.; Wambacq, P.; Borremans, J.
2017-02-01
This paper presents a 20 Mfps 32 × 84 pixel CMOS burst-mode imager featuring high frame depth with a passive in-pixel amplifier. Compared to CCD alternatives, CMOS burst-mode imagers are attractive for their low power consumption and integration of circuitry such as ADCs. Due to storage capacitor size and its noise limitations, CMOS burst-mode imagers usually suffer from a lower frame depth than CCD implementations. In order to capture fast transitions over a longer time span, an in-pixel CDS technique has been adopted to reduce the required memory cells for each frame by half. Moreover, integrated with in-pixel CDS, an in-pixel NMOS-only passive amplifier alleviates the kTC noise requirements of the memory bank, allowing the usage of smaller capacitors. Specifically, a dense 108-cell MOS memory bank (10 fF/cell) has been implemented inside a 30 μm pitch pixel, with an area of 25 × 30 μm² occupied by the memory bank. There is an improvement of about 4× in terms of frame depth per pixel area by applying in-pixel CDS and amplification. With the amplifier's gain of 3.3, an FD input-referred RMS noise of 1 mV is achieved at 20 Mfps operation. While the amplification is done without burning DC current, including the pixel source follower biasing, the full pixel consumes 10 μA at 3.3 V supply voltage at full speed. The chip has been fabricated in imec's 130 nm CMOS CIS technology.
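The memory saving from correlated double sampling (CDS) can be shown with a toy calculation: only the reset-minus-signal difference is stored, so one memory cell serves per frame instead of two, and any offset common to both samples cancels. The voltage values below are illustrative, not measurements from the paper.

```python
def cds(reset_sample, signal_sample):
    """Correlated double sampling: store only the difference of two samples."""
    return reset_sample - signal_sample

# the same per-pixel offset appears in both samples and cancels out
offset = 0.12  # volts, illustrative
reset, signal = 1.50 + offset, 0.90 + offset
print(round(cds(reset, signal), 2))  # 0.6
```

Because the stored quantity is offset-free, the downstream memory capacitors can be smaller for the same effective noise, which is the trade the abstract describes.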
ATLAS FTK a - very complex - custom super computer
NASA Astrophysics Data System (ADS)
Kimura, N.; ATLAS Collaboration
2016-10-01
In the LHC environment of high interaction pile-up, advanced techniques for analysing the data in real time are required in order to maximize the rate of physics processes of interest with respect to background processes. The Fast TracKer (FTK) is a hardware-level track finding implementation designed to deliver full-scan tracks with pT above 1 GeV to the ATLAS trigger system for events passing the Level-1 accept (at a maximum rate of 100 kHz). In order to achieve this performance, a highly parallel system was designed and is currently being commissioned within ATLAS. Starting in 2016 it will provide tracks for the trigger system in a region covering the central part of the ATLAS detector, and will be extended to full detector coverage. The system relies on matching hits coming from the silicon tracking detectors against one billion patterns stored in custom ASIC chips (Associative Memory chip - AM06). In a first stage, coarse resolution hits are matched against the patterns, and the accepted hits undergo track fitting implemented in FPGAs. Tracks with pT > 1 GeV are delivered to the High Level Trigger within about 100 μs. The resolution of the tracks coming from FTK is close to that of offline tracking, which will allow for reliable detection of primary and secondary vertices at trigger level and improved trigger performance for b-jets and tau leptons. This contribution gives an overview of the FTK system and presents the status of its commissioning. Additionally, the expected FTK performance is briefly described.
ERIC Educational Resources Information Center
Brown, Linda
2012-01-01
Math achievement for students in the United States is not as high as in other countries. In response, one state implemented a new standards-based, integrated math curriculum that combines traditional high school math courses and emphasizes student centered instruction. The purpose of this study was to examine the implementation of a standards…
ERIC Educational Resources Information Center
Bradley, Dominique; Crawford, Evan; Dahill-Brown, Sara E.
2015-01-01
Several studies suggest that values-affirmation can serve as a simple, yet powerful, tool for dramatically reducing achievement gaps. Because subtle variations in implementation procedures may explain some of the variation in these findings, it is crucial for researchers to measure the fidelity with which interventions are implemented. The authors…
ERIC Educational Resources Information Center
League for Innovation in the Community Coll., Los Angeles, CA.
Project USHER is designed to help community colleges implement a humanistic management system. This objective is to be achieved by giving each participating college the capability to redesign its own educational system through implementing a planning, programming, budgeting, and evaluation system (PPBE) within the context of participative…
Integrating Genomic Resources with Electronic Health Records using the HL7 Infobutton Standard
Overby, Casey Lynnette; Del Fiol, Guilherme; Rubinstein, Wendy S.; Maglott, Donna R.; Nelson, Tristan H.; Milosavljevic, Aleksandar; Martin, Christa L.; Goehringer, Scott R.; Freimuth, Robert R.; Williams, Marc S.
2016-01-01
Summary. Background: The Clinical Genome Resource (ClinGen) Electronic Health Record (EHR) Workgroup aims to integrate ClinGen resources with EHRs. A promising option to enable this integration is through the Health Level Seven (HL7) Infobutton Standard. EHR systems that are certified according to the US Meaningful Use program provide HL7-compliant infobutton capabilities, which can be leveraged to support clinical decision-making in genomics. Objectives: To integrate genomic knowledge resources using the HL7 Infobutton standard. Two tactics to achieve this objective were: (1) creating an HL7-compliant search interface for ClinGen, and (2) proposing guidance for genomic resources on achieving HL7 Infobutton standard accessibility and compliance. Methods: We built a search interface utilizing OpenInfobutton, an open source reference implementation of the HL7 Infobutton standard. ClinGen resources were assessed for readiness towards HL7 compliance. Finally, based upon our experiences, we provide recommendations for publishers seeking to achieve HL7 compliance. Results: Eight genomic resources and two sub-resources were integrated with the ClinGen search engine via OpenInfobutton and the HL7 Infobutton standard. The resources we assessed have varying levels of readiness towards HL7 compliance. Furthermore, we found that adoption of the standard terminologies used by EHR systems is the main gap to achieving compliance. Conclusion: Genomic resources can be integrated with EHR systems via the HL7 Infobutton standard using OpenInfobutton. Full compliance of genomic resources with the Infobutton standard would further enhance interoperability with EHR systems. PMID:27579472
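An HL7 Infobutton knowledge request is, in its URL-based form, an HTTP GET carrying coded clinical context as query parameters. The sketch below is an assumption-laden illustration: the base URL, concept code, and code-system OID are placeholders, and the parameter names follow the standard's `mainSearchCriteria` convention rather than any specific ClinGen endpoint.

```python
from urllib.parse import urlencode

def infobutton_url(base_url, code, code_system, display_name):
    # Parameter names follow the HL7 Infobutton URL convention;
    # every value here is an illustrative placeholder.
    params = {
        "mainSearchCriteria.v.c": code,           # concept code
        "mainSearchCriteria.v.cs": code_system,   # code system OID
        "mainSearchCriteria.v.dn": display_name,  # display name
        "knowledgeResponseType": "text/html",
    }
    return base_url + "?" + urlencode(params)

url = infobutton_url("https://example.org/infobutton",
                     "12345", "2.16.840.1.113883.6.96", "ExampleConcept")
print("mainSearchCriteria.v.c=12345" in url)  # True
```

A broker such as OpenInfobutton receives a request of this shape from the EHR and fans it out to the registered knowledge resources.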
Liao, C Jason; Quraishi, Jihan A; Jordan, Lorraine M
2015-01-01
The purpose of this study was to determine if there is a relationship between socioeconomic factors related to geography and insurance type and the distribution of anesthesia provider type. Using the 2012 Area Resource File, the correlation analyses illustrate that county median income is a key factor in distinguishing anesthesia provider distribution. Certified registered nurse anesthetists (CRNAs) correlated with lower-income populations, whereas anesthesiologists correlated with higher-income populations. Furthermore, CRNAs correlated more with vulnerable populations such as the Medicaid-eligible population, uninsured population, and the unemployed. Access to health care is multifactorial; however, assuring the population has adequate insurance is one of the hallmark achievements of the Affordable Care Act. Removing barriers to CRNA scope of practice to maximize CRNA services will facilitate meeting the demand by vulnerable populations after full implementation of the Affordable Care Act.
Roessler, Christian G; Kuczewski, Anthony; Stearns, Richard; Ellson, Richard; Olechno, Joseph; Orville, Allen M; Allaire, Marc; Soares, Alexei S; Héroux, Annie
2013-09-01
To take full advantage of advanced data collection techniques and high beam flux at next-generation macromolecular crystallography beamlines, rapid and reliable methods will be needed to mount and align many samples per second. One approach is to use an acoustic ejector to eject crystal-containing droplets onto a solid X-ray transparent surface, which can then be positioned and rotated for data collection. Proof-of-concept experiments were conducted at the National Synchrotron Light Source on thermolysin crystals acoustically ejected onto a polyimide 'conveyor belt'. Small wedges of data were collected on each crystal, and a complete dataset was assembled from a well diffracting subset of these crystals. Future developments and implementation will focus on achieving ejection and translation of single droplets at a rate of over one hundred per second.
The complete "how to" guide for selecting a disease management vendor.
Linden, Ariel; Roberts, Nancy; Keck, Kevin
2003-01-01
Decision-makers in health plans, large medical groups, and self-insured employers face many challenges in selecting and implementing disease management programs. One strategy is the "buy" approach, utilizing one or more of the many vendors to provide disease management services for the purchasing organization. As a relatively new field, the disease management vendor landscape is continually changing, uncovering the many uncertainties about demonstrating outcomes, corporate stability, or successful business models. Given the large investment an organization may make in each disease management program (many cost 1 million dollars or more in annual fees for a moderately sized population), careful consideration must be given in selecting a disease management partner. This paper describes, in detail, the specific steps necessary and the issues to consider in achieving a successful contract with a vendor for full-service disease management.
Wojdyla, Justyna Aleksandra; Kaminski, Jakub W.; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian
2018-01-01
Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods. PMID:29271779
NASA Astrophysics Data System (ADS)
Sultana, Jakeya; Islam, Md. Saiful; Atai, Javid; Islam, Muhammad Rakibul; Abbott, Derek
2017-07-01
We demonstrate a photonic crystal fiber with near-zero flattened dispersion, ultralow effective material loss (EML), and negligible confinement loss over a broad spectral range. The use of the cyclic olefin copolymer Topas with improved core confinement significantly reduces the loss characteristics, and the use of a higher air-filling fraction results in flat dispersion characteristics. Properties such as dispersion, EML, confinement loss, modal effective area, and single-mode operation of the fiber have been investigated using the full-vector finite element method with perfectly matched layer absorbing boundary conditions. The practical implementation of the proposed fiber is achievable with existing fabrication techniques, as only circular-shaped air holes have been used to design the waveguide. Thus, it is expected that the proposed terahertz waveguide can potentially be used for flexible and efficient transmission of terahertz waves.
A full 3D-navigation system in a suitcase.
Freysinger, W; Truppe, M J; Gunkel, A R; Thumfart, W F
2001-01-01
To reduce the impact of contemporary 3D-navigation systems on the environment of typical otorhinolaryngologic operating rooms, we demonstrate that a transfer of navigation software to modern high-power notebook computers is feasible and results in a practicable way to provide positional information to a surgeon intraoperatively. The ARTMA Virtual Patient System has been implemented on a Macintosh PowerBook G3 and, in connection with the Polhemus FASTRAK digitizer, provides intraoperative positional information during endoscopic endonasal surgery. Satisfactory intraoperative navigation has been realized in two- and three-dimensional medical image data sets (i.e., X-ray, ultrasound images, CT, and MR) and live video. This proof-of-concept study demonstrates that acceptable ergonomics and excellent performance of the system can be achieved with contemporary high-end notebook computers. Copyright 2001 Wiley-Liss, Inc.
Walker, Daniel M; Hefner, Jennifer L; Sova, Lindsey N; Hilligoss, Brian; Song, Paula H; McAlearney, Ann Scheck
Accountable care organizations (ACOs) are emerging across the healthcare marketplace and now include Medicare, Medicaid, and private sector payers covering more than 24 million lives. However, little is known about the process of organizational change required to achieve cost savings and quality improvements from the ACO model. This study applies the complex innovation implementation framework to understand the challenges and facilitators associated with the ACO implementation process. We conducted four case studies of private sector ACOs, selected to achieve variation in terms of geography and organizational maturity. Across sites, we used semistructured interviews with 68 key informants to elicit information regarding ACO implementation. Our analysis found challenges and facilitators across all domains in the conceptual framework. Notably, our findings deviated from the framework in two ways. First, findings from the financial resource availability domain revealed both financial and nonfinancial (i.e., labor) resources that contributed to implementation effectiveness. Second, a new domain, patient engagement, emerged as an important factor in implementation effectiveness. We present these deviations in an adapted framework. As the ACO model proliferates, these findings can support implementation efforts, and they highlight the importance of focusing on patients throughout the process. Importantly, this study extends the complex innovation implementation framework to incorporate consumers into the implementation framework, making it more patient centered and aiding future efforts.
NASA Astrophysics Data System (ADS)
Sutherland, D. A.; Jarboe, T. R.; Marklin, G.; Morgan, K. D.; Nelson, B. A.
2013-10-01
A high-beta spheromak reactor system has been designed with an overnight capital cost that is competitive with conventional power sources. This reactor system utilizes recently discovered imposed-dynamo current drive (IDCD) and a molten salt blanket system for first wall cooling, neutron moderation and tritium breeding. Currently available materials and ITER developed cryogenic pumping systems were implemented in this design on the basis of technological feasibility. A tritium breeding ratio of greater than 1.1 has been calculated using a Monte Carlo N-Particle (MCNP5) neutron transport simulation. High-temperature superconducting tapes (YBCO) were used for the equilibrium coil set, substantially reducing the recirculating power fraction when compared to previous spheromak reactor studies. Using zirconium hydride for neutron shielding, a limiting equilibrium coil lifetime of at least thirty full-power years has been achieved. The primary FLiBe loop was coupled to a supercritical carbon dioxide Brayton cycle due to attractive economics and high thermal efficiencies. With these advancements, an electrical output of 1000 MW from a thermal output of 2486 MW was achieved, yielding an overall plant efficiency of approximately 40%. A paper concerning the Dynomak reactor design is currently being reviewed for publication.
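The quoted plant efficiency follows directly from the reported thermal and electrical outputs, as a quick check:

```python
thermal_output_mw = 2486   # reported thermal output (MW)
electric_output_mw = 1000  # reported electrical output (MW)
efficiency = electric_output_mw / thermal_output_mw
print(round(100 * efficiency, 1))  # 40.2
```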
Implementing Best Practices and Validation of Cryopreservation Techniques for Microorganisms
Smith, David; Ryan, Matthew
2012-01-01
Authentic, well preserved living organisms are basic elements for research in the life sciences and biotechnology. They are grown and utilized in laboratories around the world and are key to many research programmes, industrial processes and training courses. They are vouchers for publications and must be available for confirmation of results, further study or reinvestigation when new technologies become available. These biological resources must be maintained without change in biological resource collections (BRCs). In order to achieve best practice in the maintenance and provision of biological materials for industry, research and education, the appropriate standards must be followed. Cryopreservation is often the best preservation method available to achieve these aims, allowing long-term, stable storage of important microorganisms. To promulgate best practice, the Organisation for Economic Co-operation and Development (OECD) published best practice guidelines for BRCs. The OECD best practice consolidated the efforts of the UK National Culture Collections, the European Common Access to Biological Resources and Information (CABRI) project consortium and the World Federation for Culture Collections. The paper discusses quality management options and reviews cryopreservation of fungi, describing how the reproducibility and quality of the technique is maintained in order to retain the full potential of fungi. PMID:22629202
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoover, Mark D.; Myers, David S.; Cash, Leigh J.
The National Council on Radiation Protection and Measurements (NCRP) has established NCRP Scientific Committee 2-6 to develop a report on the current state of knowledge and guidance for radiation safety programs involved with nanotechnology. Nanotechnology is the understanding and control of matter at the nanoscale, at dimensions between approximately 1 and 100 nanometers, where unique phenomena enable novel applications. While the full report is in preparation, this article presents and applies an informatics-based decision-making framework and process through which the radiation protection community can anticipate that nano-enabled applications, processes, nanomaterials, and nanoparticles are likely to become present or are already present in radiation-related activities; recognize specific situations where environmental and worker safety, health, well-being, and productivity may be affected by nano-related activities; evaluate how radiation protection practices may need to be altered to improve protection; control information, interpretations, assumptions, and conclusions to implement scientifically sound decisions and actions; and confirm that desired protection outcomes have been achieved. This generally applicable framework and supporting process can be continuously applied to achieve health and safety at the convergence of nanotechnology and radiation-related activities.
Progress and plan of KSTAR plasma control system upgrade
Hahn, Sang-hee; Kim, Y. J.; Penaflor, B. G.; ...
2016-06-01
The plasma control system (PCS) has been one of the essential systems in annual KSTAR plasma campaigns: starting from a single-process version in 2008, extensive upgrades have been made over the past seven years in order to achieve the major goals of KSTAR performance enhancement. The major implementations are explained in this paper. As a consequence of these successive upgrades, the present KSTAR PCS is able to achieve ~48 s of 500 kA plasma pulses with full real-time shaping controls and real-time NB power controls. It has become a large system capable of dealing with 8 separate categories of algorithms, 26 actuators directly controllable during the shot, and real-time data communication units consisting of more than 180 analog channels and more than 600 digital inputs/outputs through the reflective memory (RFM) network. The next upgrade of the KSTAR PCS is planned for 2015, before the campaign. An overview of the upgrade layout is given in this paper. The real-time system box is planned to use the CERN MRG-Realtime OS, an ITER-compatible standard operating system. New hardware is being developed for a faster real-time streaming system for future installations of actuators and diagnostics.
Hoover, Mark D; Myers, David S; Cash, Leigh J; Guilmette, Raymond A; Kreyling, Wolfgang G; Oberdörster, Günter; Smith, Rachel; Cassata, James R; Boecker, Bruce B; Grissom, Michael P
2015-02-01
The National Council on Radiation Protection and Measurements (NCRP) established NCRP Scientific Committee 2-6 to develop a report on the current state of knowledge and guidance for radiation safety programs involved with nanotechnology. Nanotechnology is the understanding and control of matter at the nanoscale, at dimensions between ∼1 and 100 nm, where unique phenomena enable novel applications. While the full report is in preparation, this paper presents and applies an informatics-based decision-making framework and process through which the radiation protection community can anticipate that nano-enabled applications, processes, nanomaterials, and nanoparticles are likely to become present or are already present in radiation-related activities; recognize specific situations where environmental and worker safety, health, well-being, and productivity may be affected by nano-related activities; evaluate how radiation protection practices may need to be altered to improve protection; control information, interpretations, assumptions, and conclusions to implement scientifically sound decisions and actions; and confirm that desired protection outcomes have been achieved. This generally applicable framework and supporting process can be continuously applied to achieve health and safety at the convergence of nanotechnology and radiation-related activities.
Progress and process improvements for multiple electron-beam direct write
NASA Astrophysics Data System (ADS)
Servin, Isabelle; Pourteau, Marie-Line; Pradelles, Jonathan; Essomba, Philippe; Lattard, Ludovic; Brandt, Pieter; Wieland, Marco
2017-06-01
Massively parallel electron-beam direct write (MP-EBDW) lithography is a cost-effective patterning solution, complementary to optical lithography, for a variety of applications ranging from 200 to 14 nm. This paper presents the latest process/integration results toward achieving the targets for both the 28 and 45 nm nodes. For the 28 nm node, we mainly focus on line-width roughness (LWR) mitigation through the stack, a new resist platform, and the bias design strategy. Line roughness was reduced by using a thicker spin-on-carbon (SOC) hardmask (-14%) or a non-chemically amplified (non-CAR) resist with implementation of a bias writing strategy (-20%). Etch transfer into the trilayer has been demonstrated while preserving pattern fidelity and profiles for both CAR and non-CAR resists. For the 45 nm node, we demonstrate electron-beam process integration within optical CMOS flows. Resists based on the KrF platform show full compatibility with multiple stacks to fit the conventional optical flow used for critical layers. Electron-beam resist performance has been optimized to meet the specifications in terms of resolution, energy latitude, LWR and stack compatibility. The patterning process overview showing the latest achievements is mature enough to enable starting multi-beam technology pre-production.
Preliminary Analysis of Double Shell Tomography Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pascucci, V
2009-01-16
In this project we have collaborated with LLNL scientist Dr. Peer-Timo Bremer while performing our research work on algorithmic solutions for geometric processing, image segmentation and data streaming. The main deliverable has been a 3D viewer for high-resolution imaging data with particular focus on the presentation of orthogonal slices of the double shell tomography dataset. Basic probing capabilities allow querying single voxels in the data to study in detail the information presented to the user and compensate for the intrinsic filtering and imprecision due to visualization based on colormaps. On the algorithmic front we have studied the possibility of using a non-local means filtering algorithm to achieve noise removal from tomography data. In particular we have developed a prototype that implements an accelerated version of the algorithm that may be able to take advantage of the multi-resolution sub-sampling of the ViSUS format. We have achieved promising results. Future plans include the full integration of the non-local means algorithm in the ViSUS framework and testing whether the accelerated method will scale properly from 2D images to 3D tomography data.
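The non-local means filter mentioned above replaces each pixel with a weighted average of pixels whose surrounding patches look similar. The project's prototype and its ViSUS-based acceleration are not described in detail, so the following is a minimal, unaccelerated 2D reference sketch; the parameter names and defaults are illustrative only:

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.1):
    """Minimal non-local means sketch: each pixel becomes a weighted average
    of pixels in a search window, weighted by patch similarity."""
    pad = patch // 2
    padded = np.pad(img, pad, mode='reflect')
    rows, cols = img.shape
    half = search // 2
    out = np.zeros_like(img, dtype=float)
    for i in range(rows):
        for j in range(cols):
            # Reference patch centered on pixel (i, j).
            p_ref = padded[i:i + patch, j:j + patch]
            weights, values = [], []
            for di in range(max(0, i - half), min(rows, i + half + 1)):
                for dj in range(max(0, j - half), min(cols, j + half + 1)):
                    p_cmp = padded[di:di + patch, dj:dj + patch]
                    d2 = np.mean((p_ref - p_cmp) ** 2)   # patch distance
                    weights.append(np.exp(-d2 / (h * h)))
                    values.append(img[di, dj])
            weights = np.array(weights)
            out[i, j] = np.dot(weights, values) / weights.sum()
    return out
```

An accelerated version, as in the abstract, would restrict the patch comparisons using coarser resolution levels rather than this exhaustive double loop.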
Integrated Testing Approaches for the NASA Ares I Crew Launch Vehicle
NASA Technical Reports Server (NTRS)
Taylor, James L.; Cockrell, Charles E.; Tuma, Margaret L.; Askins, Bruce R.; Bland, Jeff D.; Davis, Stephan R.; Patterson, Alan F.; Taylor, Terry L.; Robinson, Kimberly L.
2008-01-01
The Ares I crew launch vehicle is being developed by the U.S. National Aeronautics and Space Administration (NASA) to provide crew and cargo access to the International Space Station (ISS) and, together with the Ares V cargo launch vehicle, serves as a critical component of NASA's future human exploration of the Moon. During the preliminary design phase, NASA defined and began implementing plans for integrated ground and flight testing necessary to achieve the first human launch of Ares I. The individual Ares I flight hardware elements - including the first stage five segment booster (FSB), upper stage, and J-2X upper stage engine - will undergo extensive development, qualification, and certification testing prior to flight. Key integrated system tests include the upper stage Main Propulsion Test Article (MPTA), acceptance tests of the integrated upper stage and upper stage engine assembly, a full-scale integrated vehicle ground vibration test (IVGVT), aerodynamic testing to characterize vehicle performance, and integrated testing of the avionics and software components. The Ares I-X development flight test will provide flight data to validate engineering models for aerodynamic performance, stage separation, structural dynamic performance, and control system functionality. The Ares I-Y flight test will validate the ascent performance of the first stage and stage separation functionality, validate the ability of the upper stage to manage cryogenic propellants to achieve upper stage engine start conditions, and provide a high-altitude demonstration of the launch abort system (LAS) following stage separation. The Orion 1 flight test will be conducted as a full, un-crewed, operational flight test through the entire ascent flight profile prior to the first crewed launch.
Steady state plasma operation in RF dominated regimes on EAST
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, X. J.; Zhao, Y. P.; Gong, X. Z.
Significant progress has recently been made on EAST in the 2014 campaign, including an enhanced CW H&CD system with over 20 MW of heating power (LHCD, ICRH and NBI), more than 70 diagnostics, an ITER-like W-monoblock upper divertor, two inner cryo-pumps and RMP coils, enabling EAST to investigate long-pulse H-mode operation with dominant electron heating and low torque in order to address critical issues for ITER. H-mode plasmas were achieved with the new H&CD system, or with 4.6 GHz LHCD alone, for the first time. Long-pulse high-performance H-mode has been obtained by LHCD alone for up to 28 s at H{sub 98}∼1.2, or by combining ICRH and LHCD; no or only small ELMs were found in RF plasmas, which is essential for steady-state operation in future tokamaks. Plasma operation in low-collisionality regimes was implemented with the new 4.6 GHz LHCD at core Te∼4.5 keV. Non-inductive scenarios with high performance at high bootstrap current fraction have been demonstrated in RF-dominated regimes for long-pulse operation. Nearly full non-inductive current drive discharges have been achieved. In addition, effective heating and decoupling methods for the multi-transmitter ICRF system were developed in this campaign. EAST will be in operation with over 30 MW of CW heating and current drive power (LHCD, ICRH, NBI and ECRH), enhanced diagnostic capabilities and a fully actively-cooled metal wall from 2015. It will therefore allow access to new confinement regimes and their extension towards steady-state operation.
Conception and realization of a semiconductor based 240 GHz full 3D MIMO imaging system
NASA Astrophysics Data System (ADS)
Weisenstein, Christian; Kahl, Matthias; Friederich, Fabian; Haring Bolívar, Peter
2017-02-01
Multiple-input multiple-output (MIMO) imaging systems in the terahertz frequency range have high potential in the field of non-destructive testing (NDT). With such systems it is possible to detect defects in composite materials, for example cracks or delaminations in fiber composites. To inspect mass-produced products it is necessary to examine the objects close to real time on a conveyor without affecting the production cycle time. In this work we present the conception and realization of a 3D MIMO imaging system for in-line investigation of composite materials and structures. To achieve a lateral resolution of 1 mm, in order to detect such small defects in composite materials with a moderate number of elements, precise sensor design is crucial. In our approach we use the effective aperture concept. The designed sparse array consists of 32 transmitters and 30 receivers based on planar semiconductor components. High range resolution is achieved by an operating frequency between 220 GHz and 260 GHz in a stepped-frequency continuous-wave (SFCW) setup. A matched filter approach is used to simulate the reconstructed 3D image through the array. This allows evaluation of the designed array geometry with regard to resolution and side lobe level. In contrast to earlier demonstrations, in which synthetic reconstruction is performed only in a 2D plane, an optics-free full 3D reconstruction has been implemented in our concept. Based on this simulation we designed an array geometry that enables resolving objects with a resolution smaller than 1 mm and a moderate side lobe level.
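Two quantities in the abstract above are easy to verify numerically: the SFCW range resolution follows from the 40 GHz bandwidth, and the effective aperture concept treats the virtual array as the set of transmitter/receiver position sums. A small sketch using the band from the abstract; the 1D element positions are hypothetical, not the actual 32 x 30 sparse array geometry:

```python
import numpy as np

c = 299_792_458.0          # speed of light, m/s
f_lo, f_hi = 220e9, 260e9  # operating band from the abstract
bandwidth = f_hi - f_lo

# SFCW range resolution: delta_R = c / (2 * B)
range_resolution = c / (2 * bandwidth)
print(f"Range resolution: {range_resolution * 1e3:.2f} mm")

# Effective aperture concept: the virtual array of a MIMO system is formed
# by the pairwise sums of transmitter and receiver element positions.
# Hypothetical 1D positions (in mm) for illustration only:
tx = np.array([0, 4, 8, 12])   # transmitter positions
rx = np.array([0, 1, 2, 3])    # receiver positions
virtual = np.unique((tx[:, None] + rx[None, :]).ravel())
print(f"{len(tx)} Tx x {len(rx)} Rx -> {virtual.size} unique virtual elements")
```

With the 40 GHz band this yields a range resolution of about 3.75 mm; the nested-spacing example shows how a few physical elements can populate a dense virtual aperture.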
Technical Data to Justify Full Burnup Credit in Criticality Safety Licensing Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enercon Services, Inc.
2011-03-14
Enercon Services, Inc. (ENERCON) was requested under Task Order No. 2 to identify scientific and technical data needed to benchmark and justify Full Burnup Credit, which adds 16 fission products and 4 minor actinides to Actinide-Only burnup credit. The historical perspective for Full Burnup Credit is discussed, and interviews of organizations participating in burnup credit activities are summarized as a basis for identifying additional data needs and making recommendations. Input from burnup credit participants representing two segments of the commercial nuclear industry is provided. First, the Electric Power Research Institute (EPRI) has been very active in the development of Full Burnup Credit, representing the interests of nuclear utilities in achieving capacity gains for storage and transport casks. EPRI and its utility customers are interested in a swift resolution of the validation issues that are delaying the implementation of Full Burnup Credit [EPRI 2010b]. Second, used nuclear fuel storage and transportation cask vendors favor improving burnup credit beyond Actinide-Only burnup credit, although their discussion of specific burnup credit achievements and data needs was limited, citing business-sensitive and proprietary technical concerns. While cask vendor proprietary items are not specifically identified in this report, the needs of all nuclear industry participants are reflected in the conclusions and recommendations of this report. In addition, Oak Ridge National Laboratory (ORNL) and Sandia National Laboratories (SNL) were interviewed for their input on additional data needs to achieve Full Burnup Credit. ORNL was very open to discussions of Full Burnup Credit, with several telecons and a visit by ENERCON to ORNL. For many years, ORNL has provided extensive support to the NRC regarding burnup credit in all of its forms. Discussions with ORNL focused on potential resolutions to the validation issues for the use of fission products.
SNL was helpful in ENERCON's understanding of the difficult issues related to obtaining and analyzing additional cross section test data to support Full Burnup Credit. A PIRT (Phenomena Identification and Ranking Table) analysis was performed by ENERCON to evaluate the costs and benefits of acquiring different types of nuclear data in support of Full Burnup Credit. A PIRT exercise is a formal expert elicitation process with the final output being the ranking tables. The PIRT analysis (Table 7-4: Results of PIRT Evaluation) showed that the acquisition of additional Actinide-Only experimental data, although beneficial, is associated with high cost and is not necessarily needed. The conclusion was that the existing Radiochemical Assay (RCA) data plus the French Haut Taux de Combustion (HTC) and handbook Laboratory Critical Experiment (LCE) data provide adequate benchmark validation for Actinide-Only Burnup Credit. The PIRT analysis indicated that the costs and schedule to obtain sufficient additional experimental data to support the addition of 16 fission products to Actinide-Only Burnup Credit to produce Full Burnup Credit are quite substantial. ENERCON estimates the cost to be $50M to $100M with a schedule of five or more years. The PIRT analysis highlights another option for fission product burnup credit, which is the application of computer-based uncertainty analyses (S/U, or Sensitivity/Uncertainty, methodologies), confirmed by the limited experimental data that are already available. S/U analyses essentially transform cross section uncertainty information contained in the cross section libraries into a reactivity bias and uncertainty. Recent work by ORNL and EPRI has shown that a methodology to support Full Burnup Credit is possible using a combination of traditional RCA and LCE validation plus S/U validation for fission product isotopics and cross sections.
Further, the most recent cross section data (ENDF/B-VII) can be incorporated into the burnup credit codes at a reasonable cost compared to the acquisition of equivalent experimental data. ENERCON concludes that even with the costs of updating code data libraries, the use of S/U analysis methodologies could be accomplished on a shorter schedule and at a lower cost than the gathering of sufficient experimental data. ENERCON estimates the cost of an updated S/U computer code and data suite at $5M to $10M with a schedule of two to three years. Recent ORNL analyses using the S/U analysis method show that the bias and uncertainty values for fission product cross sections are smaller than previously expected. This result is confirmed by a similar EPRI approach using different data and computer codes. ENERCON also found that some issues regarding the implementation of burnup credit appear to have been successfully resolved, especially the axial burnup profile issue and the depletion parameter issue. These issues were resolved through data gathering activities at the Yucca Mountain Project and ORNL.
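The S/U methodology described above rests on the standard "sandwich rule": sensitivity coefficients propagate cross-section covariance into a reactivity uncertainty. A toy illustration of that propagation with hypothetical numbers (not actual ENDF/B-VII covariance data):

```python
import numpy as np

# Illustrative "sandwich rule" underlying S/U methodologies:
#     var(k) = S @ Cov @ S.T
# where S holds relative sensitivity coefficients (dk/k per dsigma/sigma)
# and Cov is the relative cross-section covariance. All values hypothetical.
S = np.array([0.12, -0.05, 0.03])          # sensitivities for 3 nuclides
cov = np.diag([0.02, 0.04, 0.03]) ** 2     # 2%, 4%, 3% relative uncertainties
var_k = S @ cov @ S.T
print(f"Relative k-eff uncertainty: {np.sqrt(var_k):.4%}")
```

Real S/U codes use full (correlated) covariance libraries over many nuclides, reactions, and energy groups, but the matrix algebra is the same.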
Qualitative analysis of the dynamics of policy design and implementation in hospital funding reform.
Palmer, Karen S; Brown, Adalsteinn D; Evans, Jenna M; Marani, Husayn; Russell, Kirstie K; Martin, Danielle; Ivers, Noah M
2018-01-01
As in many health care systems, some Canadian jurisdictions have begun shifting away from global hospital budgets. Payment for episodes of care has begun to be implemented. Starting in 2012, the Province of Ontario implemented hospital funding reforms comprising three elements: Global Budgets; Health Based Allocation Method (HBAM); and Quality-Based Procedures (QBP). This evaluation focuses on implementation of QBPs, a procedure/diagnosis-specific funding approach involving a pre-set price per episode of care coupled with best practice clinical pathways. We examined whether or not there was consensus in understanding of the program theory underpinning QBPs and how this may have influenced full and effective implementation of this innovative funding model. We undertook a formative evaluation of QBP implementation. We used an embedded case study method and in-depth, one-on-one, semi-structured, telephone interviews with key informants at three levels of the health care system: Designers (those who designed the QBP policy); Adoption Supporters (organizations and individuals supporting adoption of QBPs); and Hospital Implementers (those responsible for QBP implementation in hospitals). Thematic analysis involved an inductive approach, incorporating Framework analysis to generate descriptive and explanatory themes that emerged from the data. 
Five main findings emerged from our research: (1) Unbeknownst to most key informants, there was neither consistency nor clarity over time among QBP designers in their understanding of the original goal(s) for hospital funding reform; (2) Prior to implementation, the intended hospital funding mechanism transitioned from activity-based funding (ABF) to QBPs, but most key informants were either unaware of the transition or believed it was intentional; (3) Perception of the primary goal(s) of the policy reform continues to vary within and across all levels of key informants; (4) Four years into implementation, the QBP funding mechanism remains misunderstood; and (5) Ongoing differences in understanding of QBP goals and funding mechanism have created challenges with implementation and difficulties in measuring success. Policy drift and policy layering affected both the goal and the mechanism of action of hospital funding reform. Lack of early specification in both policy goals and hospital funding mechanism exposed the reform to reactive changes that did not reflect initial intentions. Several challenges further exacerbated implementation of complex hospital funding reforms, including a prolonged implementation schedule, turnover of key staff, and inconsistent messaging over time. These factors altered the trajectory of the hospital funding reforms and created confusion amongst those responsible for implementation. Enacting changes to hospital funding policy through a process that is transparent, collaborative, and intentional may increase the likelihood of achieving intended effects.
Chicago-St. Louis high speed rail plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stead, M.E.
1994-12-31
The Illinois Department of Transportation (IDOT), in cooperation with Amtrak, undertook the Chicago-St. Louis High Speed Rail Financial and Implementation Plan study in order to develop a realistic and achievable blueprint for implementation of high speed rail in the Chicago-St. Louis corridor. This report presents a summary of the Price Waterhouse Project Team's analysis and the Financial and Implementation Plan for implementing high speed rail service in the Chicago-St. Louis corridor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Ravindra; Uluski, Robert; Reilly, James T.
The objective of this survey is to benchmark current practices for DMS implementation to serve as a guide for future system implementations. The survey sought information on current plans to implement DMS, DMS functions of interest, implementation challenges, functional benefits achieved, and other relevant information. These survey results were combined (where possible) with results of similar surveys conducted in the previous four years to observe trends over time.
ERIC Educational Resources Information Center
Harman, Pamela
2014-01-01
Within a suburban school system, an achievement gap exists since not all students are meeting state and national educational benchmarks. Despite the efforts of the school system, the achievement gap is endemic and persistent. To address the achievement gap, the school system instituted a Differentiated Instruction (DI) initiative. However, it was…
Comparing Types of Student Placement and the Effect on Achievement for Students with Disabilities
ERIC Educational Resources Information Center
Mason, Patricia Lynn
2013-01-01
Since implementing No Child Left Behind, schools have improved student achievement while also preparing students for the 21st century. Schools continue to strive for 100% proficiency in all subgroups by 2014, but an achievement gap exists for students with disabilities. This study used a causal comparative research design to test the concept of…
ERIC Educational Resources Information Center
Chappell, Shanan; Arnold, Pamela; Nunnery, John; Grant, Melva
2015-01-01
The purpose of this mixed methods study was to determine the impact of synchronous online tutoring services on struggling middle school students' mathematics achievement. The online tutoring was provided as a response to intervention (RTI) Tier 3 support (intensive, individualized intervention) in schools implementing a school-wide mathematics…
Class Size Effects on Fourth-Grade Mathematics Achievement: Evidence from TIMSS 2011
ERIC Educational Resources Information Center
Li, Wei; Konstantopoulos, Spyros
2016-01-01
Class size reduction policies have been widely implemented around the world in recent years. However, findings about the effects of class size on student achievement have been mixed. This study examines class size effects on fourth-grade mathematics achievement in 14 European countries using data from TIMSS (Trends in International Mathematics and…
ERIC Educational Resources Information Center
Stockard, Jean
2010-01-01
This paper examines changes in the average mathematics achievement of students in the Baltimore City Public School System (BCPSS) from 1998 to 2003, comparing students in schools that implemented Direct Instruction with students in other schools. First-grade students who received Direct Instruction had significantly higher levels of achievement on…
ERIC Educational Resources Information Center
Mastrorilli, Tara M.; Harnett, Susanne; Zhu, Jing
2014-01-01
The "Arts Achieve: Impacting Student Success in the Arts" project involves a partnership between the New York City Department of Education (NYCDOE) and five of the city's premier arts organizations. "Arts Achieve" provides intensive and targeted professional development to arts teachers over a three-year period. The goal of the…
Baeten, Marlies; Dochy, Filip; Struyven, Katrien
2013-09-01
Research in higher education on the effects of student-centred versus lecture-based learning environments generally does not take into account the psychological need support provided in these learning environments. From a self-determination theory perspective, need support is important to study because it has been associated with benefits such as autonomous motivation and achievement. The purpose of the study is to investigate the effects of different learning environments on students' motivation for learning and achievement, while taking into account the perceived need support. First-year student teachers (N = 1,098) studying a child development course completed questionnaires assessing motivation and perceived need support. In addition, a prior knowledge test and case-based assessment were administered. A quasi-experimental pre-test/post-test design was set up consisting of four learning environments: (1) lectures, (2) case-based learning (CBL), (3) alternation of lectures and CBL, and (4) gradual implementation with lectures making way for CBL. Autonomous motivation and achievement were higher in the gradually implemented CBL environment, compared to the CBL environment. Concerning achievement, two additional effects were found: students in the lecture-based learning environment scored higher than students in the CBL environment, and students in the gradually implemented CBL environment scored higher than students in the alternated learning environment. Additionally, perceived need support was positively related to autonomous motivation, and negatively to controlled motivation. The study shows the importance of gradually introducing students to CBL, in terms of their autonomous motivation and achievement. Moreover, the study emphasizes the importance of perceived need support for students' motivation. © 2012 The British Psychological Society.
Academic Achievement in Blacks and Whites: Are the Developmental Processes Similar?
ERIC Educational Resources Information Center
Rowe, David C.; Cleveland, Hobart H.
1996-01-01
Genetic and environmental influences on academic achievement were studied for 314 pairs of white full siblings and 53 pairs of half siblings and 161 pairs of black full siblings and 106 half-sibling pairs (National Longitudinal Survey of Youth). Results support a common heritage view of the growth of academic knowledge. (SLD)
Strömberg, Eric A; Nyberg, Joakim; Hooker, Andrew C
2016-12-01
With the increasing popularity of optimal design in drug development, it is important to understand how the approximations and implementations of the Fisher information matrix (FIM) affect the resulting optimal designs. The aim of this work was to investigate the impact on design performance when using two common approximations to the population model and the full or block-diagonal FIM implementations for optimization of sampling points. Sampling schedules for two example experiments based on population models were optimized using the FO and FOCE approximations and the full and block-diagonal FIM implementations. The number of support points was compared between the designs for each example experiment. The performance of these designs was investigated through simulations and estimations, by computing the bias of the parameters as well as through the use of an empirical D-criterion confidence interval. Simulations were performed when the design was computed with the true parameter values as well as with misspecified parameter values. The FOCE approximation and the full FIM implementation yielded designs with more support points and less clustering of sample points than designs optimized with the FO approximation and the block-diagonal implementation. The D-criterion confidence intervals showed no performance differences between the full and block-diagonal FIM optimal designs when assuming true parameter values. However, the FO-approximated block-diagonal FIM designs had higher bias than the other designs. When assuming parameter misspecification in the design evaluation, the FO full FIM optimal design was superior to the FO block-diagonal FIM design in both of the examples.
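The full versus block-diagonal FIM distinction above amounts to whether the cross terms between parameter blocks are retained before computing the D-criterion, det(FIM). A toy numerical illustration with a hypothetical 3x3 FIM (not taken from the study's example models):

```python
import numpy as np

# Toy comparison of the D-criterion, det(FIM), for a full FIM versus its
# block-diagonal approximation, where the cross terms between two parameter
# blocks (e.g. fixed effects vs. variance components) are set to zero.
# All matrix entries are hypothetical.
full_fim = np.array([
    [4.0, 0.5, 0.2],
    [0.5, 3.0, 0.1],
    [0.2, 0.1, 2.0],
])
block_fim = full_fim.copy()
block_fim[:2, 2] = 0.0   # zero cross terms between the two blocks
block_fim[2, :2] = 0.0

d_full = np.linalg.det(full_fim)
d_block = np.linalg.det(block_fim)
print(f"det full FIM:  {d_full:.3f}")
print(f"det block FIM: {d_block:.3f}")
```

Because the two criteria weight the off-block information differently, designs optimized against one can differ in support points from designs optimized against the other, which is the effect the study measures.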
Downing, Amanda; Mortimer, Molly; Hiers, Jill
2016-03-01
Warfarin is a high alert medication and a challenge to dose and monitor. Pharmacist-driven warfarin management has been shown to decrease the time international normalized ratio (INR) is out of range, which may reduce undesired outcomes. The purpose of this study is to assess the effect of the implementation of a pharmacist-driven warfarin management protocol on the achievement of therapeutic INRs. A warfarin management protocol was developed using evidence-based literature and similar protocols from other institutions. Pharmacists utilized the protocol to provide patient-specific warfarin dosing upon provider referral. To evaluate the protocol's impact, a retrospective chart review pre- and post-implementation was completed for admitted patients receiving warfarin. Three hundred twenty-seven charts were reviewed for pre- and post-implementation data. INRs within therapeutic range increased from 27.8% before protocol implementation to 38.5% after implementation. There was also a reduction in subtherapeutic INRs (55.3% pre to 39% post) and supratherapeutic INRs 5 or above (3.7% pre to 2.6% post). Supratherapeutic INRs between 3 and 5 did increase from 13.2% before protocol implementation to 19.9% in the pharmacist-managed group. In addition to reducing the time to achievement of therapeutic INRs by 0.5 days, implementation of the protocol resulted in an increase in the number of patients with at least one therapeutic INR during admission (35% pre to 40% post). The implementation of a pharmacist-driven warfarin dosing protocol increased therapeutic INRs and decreased the time to therapeutic range, as well as the proportion of subtherapeutic INRs and supratherapeutic INRs 5 or greater. Additional benefits of the protocol include documentation of Joint Commission National Patient Safety Goal compliance, promotion of interdisciplinary collaboration and increased continuity of care. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc.
All rights reserved.
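The pre/post comparison of proportions above lends itself to a standard two-proportion z-test. The sketch below (Python, standard library only) checks the reported 27.8% to 38.5% shift in therapeutic INRs; the per-group denominators are hypothetical, since the abstract does not report them, and the test itself is illustrative rather than the study's stated method.

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value) via normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))             # standard normal CDF
    return z, 2 * (1 - phi)

# Hypothetical counts consistent with the reported 27.8% -> 38.5% shift
# (assumed group sizes of 163 pre and 164 post, not figures from the study).
z, p = two_proportion_z(x1=45, n1=163, x2=63, n2=164)
```

Under these assumed group sizes the shift is statistically significant at the 5% level, which is consistent with the improvement the abstract describes.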
Essays on environmental regulations in electricity markets
NASA Astrophysics Data System (ADS)
Sun, Yanming
Reducing greenhouse gas pollution and promoting energy efficiency in consumers' energy use have recently been major public policy issues. Currently, both the United States and the European Union have set explicit percentage requirements obliging energy generators or consumers to source a certain percentage of their energy production or consumption from renewable sources. To achieve these renewable targets, Tradable Green Certificate (TGC) systems have been introduced in their electricity markets. Moreover, to promote energy conservation and achieve energy efficiency targets, price policies and price changes derived from environmental regulations have played an increasingly important role in reducing electricity consumption. My research studies problems associated with these policy implementations. In Chapter 1, I analyze a competitive electricity market with two countries operating under a common TGC system. Using geometric illustrations, I compare the two countries' welfare when the renewable quota is chosen optimally under the common certificate market with three different situations. The policy recommendation is that when the value of the damage parameter is sufficiently small, full integration with a TGC market is welfare-superior to full integration of an all fossil-fuel based market with an optimal emissions standard. In Chapter 2, by analyzing a stylized theoretical model and numerical examples, I investigate the performance of the optimal renewables policy under full separation and full integration scenarios for two countries' electricity markets operating under TGC systems. In my third chapter, I look at the responsiveness of residential electricity consumption to electricity price increases in the U.S. and the differing effect of a price increase on electricity use across states of different income levels. 
My analysis reveals that raising the energy price in the short run will not give consumers much incentive to adjust their appliances and make energy-conservation investments to reduce electricity use, while in the long run consumers are more likely to lower their electricity consumption when facing the higher electricity prices induced by regulation policies. In addition, for states with higher per capita GDP, raising the electricity price may be more effective in ensuring a cut in electricity consumption.
ERIC Educational Resources Information Center
Kaimal, Girija; Jordan, Will J.
2016-01-01
Context: Policymakers have increasingly advocated for incentive-based approaches for improving urban schools. Purpose of the study: Few studies have examined the implementation of incentive based approaches in the urban charter school context. This paper presents research findings from a 4-year longitudinal study of the implementation of a…
ERIC Educational Resources Information Center
Peppers, Gloria J.
2014-01-01
The purpose of this study was to explore teachers' perceptions prior to the implementation of professional learning communities (PLCs) and after the implementation of PLCs in a large suburban high school. The goal was to provide information that focused on (a) retention and achievement of students, (b) retention of teachers, and (c) teachers'…
ERIC Educational Resources Information Center
Watt, Michael G.
2018-01-01
The purpose of this study was to examine and compare key elements of the actions that states and territories are taking to implement the Australian Curriculum, and what innovative processes and products they are using to facilitate implementation. A rubric adapted from a diagnostic tool, developed by Achieve and the U.S. Education Delivery…
SEPARATIONS AND WASTE FORMS CAMPAIGN IMPLEMENTATION PLAN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vienna, John D.; Todd, Terry A.; Peterson, Mary E.
2012-11-26
This Separations and Waste Forms Campaign Implementation Plan provides summary-level detail describing how the Campaign will achieve the objectives set forth by the Fuel Cycle Research and Development (FCRD) Program. This implementation plan will be maintained as a living document and will be updated as needed in response to changes or progress in separations and waste forms research and in FCRD Program priorities.
Ecological policy in oil-gas complexes, HSE MS implementation in oil and gas company
NASA Astrophysics Data System (ADS)
Kochetkova, O. P.; Glyzina, T. S.; Vazim, A. A.; Tugutova, S. S.
2016-09-01
The paper considers the following issues: HSE MS international standard implementation in oil and gas industry, taking into account international practices; implementation of standards in oil and gas companies; policy in the field of environmental protection and occupational health and safety; achievement of planned indicators and targets in environmental protection and occupational health and safety.
Härter, Martin; Bermejo, Isaac; Ollenschläger, Günter; Schneider, Frank; Gaebel, Wolfgang; Hegerl, Ulrich; Niebling, Wilhelm; Berger, Mathias
2006-04-01
Depressive disorders are of great medical and political significance. There is clear potential in achieving better guideline orientation and better collaboration between different types of care. Throughout the 1990s, educational initiatives were started for implementing guidelines. Evidence-based guidelines on depression have been formulated in many countries. This article presents an action programme for structural, educational, and research-related measures to implement evidence-based care of depressive disorders in the German health system. The starting points of the programme are the 'Guidelines Critical Appraisal Reports' of the 'Guideline Clearing House' and measures from the 'Competence Network on Depression and Suicidality' (CNDS) funded by the Federal Ministry of Education and Research. The article gives an overview of the steps achieved as recommended by the Guidelines Critical Appraisal Reports and the ongoing transfer process into the German health care system. The action programme shows that comprehensive interventions to develop and introduce evidence-based guidelines for depression can achieve benefits in the care of depression, e.g. in recognition, management, and clinical outcome. It was possible to implement the German Action Programme in selected care settings, and initial evaluation results suggest some improvements. The action programme provides preliminary work, materials, and results for developing a future 'Disease Management Programme' (DMP) for depression.
State of the art of aerobic granulation in continuous flow bioreactors.
Kent, Timothy R; Bott, Charles B; Wang, Zhi-Wu
In the wake of the success of aerobic granulation in sequential batch reactors (SBRs) for treating wastewater, attention is beginning to turn to continuous flow applications. This is a necessary step given the advantages of continuous flow treatment processes and the fact that the majority of full-scale wastewater treatment plants across the world are operated with aeration tanks and clarifiers in a continuous flow mode. As in SBRs, applying a selection pressure, based on differences in either settling velocity or the size of the biomass, is essential for successful granulation in continuous flow reactors (CFRs). CFRs employed for aerobic granulation come in multiple configurations, each with its own means of achieving such a selection pressure. Other factors, such as bioaugmentation and hydraulic shear force, also contribute to aerobic granulation to some extent. Besides the formation of aerobic granules, the long-term stability of aerobic granules is also a critical issue to be addressed. Inorganic precipitation, special inocula, and various operational optimization strategies have been used to improve granule long-term structural integrity. Accumulated studies reviewed in this work demonstrate that aerobic granulation in CFRs is capable of removing a wide spectrum of contaminants and achieving properties generally comparable to those in SBRs. Despite the notable research progress made toward successful aerobic granulation in lab-scale CFRs, to the best of our knowledge, there are only three full-scale tests of the technique, two being seeded with anammox-supported aerobic granules and the other with conventional aerobic granules; two other process alternatives are currently in development. Application of settling- or size-based selection pressures and feast/famine conditions is especially difficult to implement in these and similar mainstream systems. 
Future research efforts need to focus on the optimization of the granule-to-floc ratio, enhancement of granule activity, improvement of long-term granule stability, and a better understanding of aerobic granulation mechanisms in CFRs, especially in full-scale applications. Copyright © 2018 Elsevier Inc. All rights reserved.
GPU Lossless Hyperspectral Data Compression System for Space Applications
NASA Technical Reports Server (NTRS)
Keymeulen, Didier; Aranki, Nazeeh; Hopson, Ben; Kiely, Aaron; Klimesh, Matthew; Benkrid, Khaled
2012-01-01
On-board lossless hyperspectral data compression reduces data volume in order to meet NASA and DoD limited downlink capabilities. At JPL, a novel, adaptive and predictive technique for lossless compression of hyperspectral data, named the Fast Lossless (FL) algorithm, was recently developed. This technique uses an adaptive filtering method and achieves state-of-the-art performance in both compression effectiveness and low complexity. Because of its outstanding performance and suitability for real-time onboard hardware implementation, the FL compressor is being formalized as the emerging CCSDS Standard for Lossless Multispectral & Hyperspectral image compression. The FL compressor is well-suited for parallel hardware implementation. A GPU hardware implementation was developed for FL targeting the current state-of-the-art GPUs from NVIDIA. The GPU implementation on an NVIDIA GeForce GTX 580 achieves a throughput performance of 583.08 Mbits/sec (44.85 MSamples/sec) and an acceleration of at least 6 times over a software implementation running on a 3.47 GHz single-core Intel Xeon processor. This paper describes the design and implementation of the FL algorithm on the GPU. The massively parallel implementation will in the future provide a fast and practical real-time solution for airborne and space applications.
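The FL algorithm itself is adaptive and considerably more sophisticated, but the core idea of predictive lossless compression (predict each sample from a neighbouring band, then encode only a mapped residual) can be sketched in a few lines. The fixed previous-band predictor and zigzag mapping below are illustrative simplifications, not the FL design.

```python
def zig(r):
    """Map a signed residual to a non-negative integer: 0,-1,1,-2,2 -> 0,1,2,3,4."""
    return 2 * r if r >= 0 else -2 * r - 1

def unzig(n):
    """Inverse of zig()."""
    return n // 2 if n % 2 == 0 else -(n + 1) // 2

def compress(bands):
    """Predict each sample from the co-located sample in the previous band.
    A real coder would entropy-code the small mapped residuals."""
    residuals, prev = [], [0] * len(bands[0])
    for band in bands:
        residuals.append([zig(s - p) for s, p in zip(band, prev)])
        prev = band
    return residuals

def decompress(residuals):
    """Rebuild each band from the previous band plus the unmapped residuals."""
    bands, prev = [], [0] * len(residuals[0])
    for res in residuals:
        band = [p + unzig(r) for r, p in zip(res, prev)]
        bands.append(band)
        prev = band
    return bands
```

Because spectrally adjacent bands are highly correlated, the residuals cluster near zero and entropy-code well, while the round trip remains exactly lossless.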
High-Speed Soft-Decision Decoding of Two Reed-Muller Codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Uehara, Gregory T.
1996-01-01
In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 233, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 Mbits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections. 
First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating and present some of the key architectural approaches being used to implement the system at high speed. Second, we will describe details of the 8-trellis diagram we found to best meet the trade-offs between chip and overall system complexity. The chosen approach implements the trellis for the (64, 40, 8) RM subcode with 32 independent sub-trellises. And third, we will describe results of our feasibility study on the implementation of such an IC chip in CMOS technology to implement one of these sub-trellises.
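The add-compare-select recursion at the heart of such a Viterbi decoder can be illustrated on a toy code. The sketch below decodes a rate-1/2, memory-2 convolutional code (generators 7 and 5 octal) with a soft-decision correlation metric, a far smaller trellis than the 32-sub-trellis (64, 40, 8) design described above, but the same principle.

```python
def conv_encode(bits):
    """Rate-1/2 convolutional encoder, generators 7 (111) and 5 (101)."""
    s1 = s2 = 0
    out = []
    for u in bits:
        out += [u ^ s1 ^ s2, u ^ s2]
        s1, s2 = u, s1
    return out

def viterbi_soft(rx):
    """Soft-decision Viterbi: rx holds noisy antipodal samples (+1 for bit 0,
    -1 for bit 1); the survivor path maximizes correlation with the codeword."""
    n = len(rx) // 2
    metric = {0: 0.0}           # start in the all-zero state
    paths = {0: []}
    for t in range(n):
        r1, r2 = rx[2 * t], rx[2 * t + 1]
        new_metric, new_paths = {}, {}
        for s, m in metric.items():
            s1, s2 = (s >> 1) & 1, s & 1
            for u in (0, 1):
                c1, c2 = u ^ s1 ^ s2, u ^ s2
                # branch metric: correlate with antipodal mapping 0->+1, 1->-1
                cand = m + (1 - 2 * c1) * r1 + (1 - 2 * c2) * r2
                ns = (u << 1) | s1
                if ns not in new_metric or cand > new_metric[ns]:
                    new_metric[ns] = cand
                    new_paths[ns] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    return paths[max(metric, key=metric.get)]
```

With a free distance of 5, this toy code recovers the message even when one channel sample arrives with its sign flipped, which is the benefit soft-decision decoding buys over hard-decision slicing.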
NASA Astrophysics Data System (ADS)
Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara
2014-08-01
A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gain in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of curriculum enactment, p < .001. Teacher enactment factors, including adherence to implementing the critical components of the curriculum or the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide support that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.
Refusal to enrol in Ghana's National Health Insurance Scheme: is affordability the problem?
Kusi, Anthony; Enemark, Ulrika; Hansen, Kristian S; Asante, Felix A
2015-01-17
Access to health insurance is expected to have a positive effect in improving access to healthcare and to offer financial risk protection to households. Ghana began the implementation of a National Health Insurance Scheme (NHIS) in 2004 as a way to ensure equitable access to basic healthcare for all residents. After a decade of implementation, national coverage is just about 34% of the national population. Affordability of the NHIS contribution is often cited by households as a major barrier to enrolment in the NHIS, yet this claim has not been rigorously analysed. In light of the global interest in achieving universal health insurance coverage, this study examines the extent to which affordability of the NHIS contribution is a barrier to full insurance for households and a burden on their resources. The study uses data from a cross-sectional household survey involving 2,430 households from three districts in Ghana conducted between January and April 2011. Affordability of the NHIS contribution is analysed using the household budget-based approach, based on the normative definition of affordability. The burden of NHIS contributions to households is assessed by relating the expected annual NHIS contribution to household non-food expenditure and total consumption expenditure. Households which cannot afford full insurance were identified. Results show that 66% of uninsured households and 70% of partially insured households could afford full insurance for their members. Enrolling all household members in the NHIS would account for 5.9% of household non-food expenditure or 2.0% of total expenditure, but the share was higher for households in the first (11.4%) and second (7.0%) socio-economic quintiles. All the households identified as unable to afford full insurance (29%) were in the two lower socio-economic quintiles and had large household sizes. Non-financial factors relating to attributes of the insurer and health-system problems also affect enrolment in the NHIS. 
Affordability of full insurance would be a burden on households with low socio-economic status and large household sizes. Innovative measures are needed to encourage households that can afford it to enrol. Policy should aim at abolishing the registration fee for children, pricing insurance according to the socio-economic status of households, and addressing the non-financial factors that inhibit enrolment, in order to increase NHIS coverage.
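The budget-based affordability rule described above can be sketched as a simple ratio test. The flat per-member premium and the 10% non-food-expenditure threshold below are illustrative assumptions, not the study's actual Ghanaian parameters.

```python
def nhis_burden(contrib_per_member, members, nonfood_exp, threshold=0.10):
    """Share of annual non-food expenditure consumed by enrolling the whole
    household, and whether it falls under a normative affordability threshold.
    The threshold and flat premium are illustrative, not the study's values."""
    share = contrib_per_member * members / nonfood_exp
    return share, share <= threshold

# Two hypothetical households (currency units arbitrary): a small household
# with ample non-food spending, and a large one with little slack.
small_household = nhis_burden(contrib_per_member=20, members=3, nonfood_exp=3000)
large_household = nhis_burden(contrib_per_member=20, members=8, nonfood_exp=900)
```

The pattern matches the study's finding: the burden concentrates in large, low-expenditure households, for whom the same premium crosses the affordability line.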
Finding pathways to national-scale land-sector sustainability.
Gao, Lei; Bryan, Brett A
2017-04-12
The 17 Sustainable Development Goals (SDGs) and 169 targets under Agenda 2030 of the United Nations map a coherent global sustainability ambition at a level of detail general enough to garner consensus amongst nations. However, achieving the global agenda will depend heavily on successful national-scale implementation, which requires the development of effective science-driven targets tailored to specific national contexts and supported by strong national governance. Here we assess the feasibility of achieving multiple SDG targets at the national scale for the Australian land-sector. We scaled targets to three levels of ambition and two timeframes, then quantitatively explored the option space for target achievement under 648 plausible future environmental, socio-economic, technological and policy pathways using the Land-Use Trade-Offs (LUTO) integrated land systems model. We show that target achievement is very sensitive to global efforts to abate emissions, domestic land-use policy, productivity growth rate, and land-use change adoption behaviour and capacity constraints. Weaker target-setting ambition resulted in higher achievement but poorer sustainability outcomes. Accelerating land-use dynamics after 2030 changed the targets achieved by 2050, warranting a longer-term view and greater flexibility in sustainability implementation. Simultaneous achievement of multiple targets is rare owing to the complexity of sustainability target implementation and the pervasive trade-offs in resource-constrained land systems. Given that hard choices are needed, the land-sector must first address the essential food/fibre production, biodiversity and land degradation components of sustainability via specific policy pathways. It may also contribute to emissions abatement, water and energy targets by capitalizing on co-benefits. 
However, achieving targets relevant to the land-sector will also require substantial contributions from other sectors such as clean energy, food systems and water resource management. Nations require globally coordinated, national-scale, comprehensive, integrated, multi-sectoral analyses to support national target-setting that prioritizes efficient and effective sustainability interventions across societies, economies and environments.
Cassim, Naseem; Coetzee, Lindi M; Schnippel, Kathryn; Glencross, Deborah K
2014-01-01
An integrated tiered service delivery model (ITSDM) has been proposed to provide 'full-coverage' of CD4 services throughout South Africa. Five tiers are described, defined by testing volumes and the number of referring health-facilities. These include: (1) Tier-1/decentralized point-of-care (POC) service in a single site; (2) Tier-2/POC-hub servicing 8-10 health-clinics and processing 30-40 samples/day; (3) Tier-3/community laboratories servicing ∼50 health-clinics and processing up to 150 samples/day; and high-volume centralized laboratories (Tier-4 and Tier-5) processing up to 300 or more than 600 samples/day and serving >100 or >200 health-clinics, respectively. The objective of this study was to establish the costs of the existing service and of ITSDM Tiers 1, 2 and 3 in a remote, under-serviced district in South Africa. Historical health-facility workload volumes from the Pixley-ka-Seme district, the total volumes of CD4 tests performed by the adjacent district referral CD4 laboratories, the locations of all referring clinics, and related laboratory-to-result turn-around time (LTR-TAT) data were extracted from the NHLS Corporate-Data-Warehouse for the period April 2012 to March 2013. Tiers were costed separately (as a cost-per-result), including equipment, staffing, reagent and test-consumable costs. A one-way sensitivity analysis allowed for changes in reagent price, test volumes and personnel time. The lowest cost-per-result was noted for the existing laboratory-based Tiers 4 and 5 ($6.24 and $5.37, respectively), but with a related increased LTR-TAT of >24-48 hours. Full service coverage with a TAT of <6 hours could be achieved by placing twenty-seven Tier-1/POC sites or eight Tier-2/POC-hubs, at a cost-per-result of $32.32 and $15.88, respectively. A single district Tier-3 laboratory also ensured 'full service coverage' and a <24-hour LTR-TAT for the district at $7.42 per test. 
Implementing a single Tier-3/community laboratory to extend and improve delivery of services in Pixley-ka-Seme, with an estimated local ∼ 12-24-hour LTR-TAT, is ∼ $2 more than existing referred services per-test, but 2-4 fold cheaper than implementing eight Tier-2/POC-hubs or providing twenty-seven Tier-1/POCT CD4 services.
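The trade-off the study quantifies, a cheaper per-test cost at central laboratories versus faster turn-around at the point of care, can be laid out with the per-result costs reported above. The annual district test volume used here is a hypothetical figure for illustration only.

```python
# Cost-per-result (USD) as reported in the abstract
cost_per_result = {
    "Tier-1 POC":       32.32,
    "Tier-2 POC-hub":   15.88,
    "Tier-3 community":  7.42,
    "Tier-4 central":    6.24,
    "Tier-5 central":    5.37,
}

def annual_cost(tier, tests_per_year):
    """Annual testing cost for one tier at a given district volume."""
    return cost_per_result[tier] * tests_per_year

volume = 20_000  # hypothetical district volume (tests/year), not from the study
tier3_premium = (annual_cost("Tier-3 community", volume)
                 - annual_cost("Tier-4 central", volume))
```

At this assumed volume, the district-level Tier-3 option costs roughly $23,600/year more than referral to a Tier-4 laboratory, which is the price of cutting LTR-TAT from >24-48 hours to under 24, and it remains far cheaper than the POC options.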
A comparison between temporal and subband minimum variance adaptive beamforming
NASA Astrophysics Data System (ADS)
Diamantis, Konstantinos; Voxen, Iben H.; Greenaway, Alan H.; Anderson, Tom; Jensen, Jørgen A.; Sboros, Vassilis
2014-03-01
This paper compares the performance of temporal and subband Minimum Variance (MV) beamformers for medical ultrasound imaging. Both adaptive methods provide an optimized set of apodization weights but are implemented in the time and frequency domains, respectively. Their performance is evaluated with simulated synthetic aperture data obtained from Field II and is quantified by the Full-Width-Half-Maximum (FWHM), the Peak-Side-Lobe level (PSL) and the contrast level. From a point phantom, a full sequence of 128 emissions, with one transducer element transmitting and all 128 elements receiving each time, provides a FWHM of 0.03 mm (0.14λ) for both implementations at a depth of 40 mm. This value is more than 20 times lower than the one achieved by conventional beamforming. The corresponding values of PSL are -58 dB and -63 dB for the time and frequency domain MV beamformers, while a value no lower than -50 dB can be obtained from either Boxcar or Hanning weights. Interestingly, a single emission with central element #64 as the transmitting aperture provides results comparable to the full sequence. The values of FWHM are 0.04 mm and 0.03 mm and those of PSL are -42 dB and -46 dB for the temporal and subband approaches. From a cyst phantom and for 128 emissions, the contrast level is calculated at -54 dB and -63 dB respectively at the same depth, with the initial shape of the cyst being preserved in contrast to conventional beamforming. The difference between the two adaptive beamformers is less significant in the case of a single emission, with the contrast level being estimated at -42 dB for the time domain and -43 dB for the frequency domain implementation. For the estimation of a single MV weight of a low resolution image formed by a single emission, 0.44 × 10^9 calculations per second are required for the temporal approach. The same numbers for the subband approach are 0.62 × 10^9 for the point and 1.33 × 10^9 for the cyst phantom. 
The comparison demonstrates similar resolution but slightly lower side-lobes and higher contrast for the subband approach at the expense of increased computation time.
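Both implementations ultimately compute the Capon solution w = R^-1 a / (a^T R^-1 a) for each image point. The sketch below (real-valued, four-element toy array, direct Gaussian elimination, standard library only) shows the weight computation and its distortionless constraint w·a = 1; it is an illustration of the principle, not either of the paper's time- or frequency-domain implementations.

```python
def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def mv_weights(R, a):
    """Minimum-variance (Capon) apodization: w = R^-1 a / (a^T R^-1 a)."""
    Ria = solve(R, a)
    denom = sum(ai * ri for ai, ri in zip(a, Ria))
    return [ri / denom for ri in Ria]

# 4-element aperture: unit noise floor plus one strong off-axis interferer v.
v = [1.0, -1.0, 1.0, -1.0]                      # interference signature
R = [[10.0 * v[i] * v[j] + (1.0 if i == j else 0.0) for j in range(4)]
     for i in range(4)]
a = [1.0, 1.0, 1.0, 1.0]                        # steering vector after focusing delays
w = mv_weights(R, a)
```

In this contrived case the steering vector is orthogonal to the interferer, so the MV weights pass the focused signal with unit gain while placing a null on v; data-dependent weighting of exactly this kind is what yields the narrower FWHM and lower side-lobes reported above.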
Harnessing Implementation Science to Increase the Impact of Health Disparity Research
Chinman, Matthew; Woodward, Eva N.; Curran, Geoffrey M.; Hausmann, Leslie R. M.
2017-01-01
Background: Health disparities are differences in health or health care between groups based on social, economic, and/or environmental disadvantage. Disparity research often follows three steps: detecting (Phase 1), understanding (Phase 2), and reducing (Phase 3) disparities. While disparities have narrowed over time, many remain. Objectives: We argue that implementation science could enhance disparities research by broadening the scope of Phase 2 studies and offering rigorous methods to test disparity-reducing implementation strategies in Phase 3 studies. Methods: We briefly review the focus of Phase 2 and Phase 3 disparities research. We then provide a decision tree and case examples to illustrate how implementation science frameworks and research designs could further enhance disparity research. Results: Most health disparities research emphasizes patient and provider factors as predominant mechanisms underlying disparities. Applying implementation science frameworks like the Consolidated Framework for Implementation Research could help disparities research widen its scope in Phase 2 studies and, in turn, develop broader disparities-reducing implementation strategies in Phase 3 studies. Many Phase 3 studies of disparity-reducing implementation strategies are similar to case studies, whose designs are not able to fully test causality. Implementation science research designs offer rigorous methods that could accelerate the pace at which equity is achieved in real world practice. Conclusions: Disparities can be considered a “special case” of implementation challenges—when evidence-based clinical interventions are delivered to, and received by, vulnerable populations at lower rates. Bringing together health disparities research and implementation science could advance equity more than either could achieve on their own. PMID:28806362
School Choice and Educational Inequality in South Korea
Byun, Soo-yong; Kim, Kyung-keun; Park, Hyunjoon
2014-01-01
This study examined the choice debate in South Korea, which centers on the residentially based school assignment policy called the High School Equalization Policy (HSEP). Using a nationally representative sample of South Korean 11th graders, the study further explored the role of the HSEP in educational equality by investigating how HSEP implementation was related to the separation of low and high socioeconomic status (SES) students between schools and how the socioeconomic composition of a school was related to student achievement. Results showed that the odds that low SES students were separated into low SES schools were smaller in the regions of HSEP implementation, where students were randomly assigned to a school based on place of residence, than in the regions of non-HSEP implementation, where students were allowed to choose a school. Results also showed that student achievement significantly depended on the socioeconomic composition of the school students attended in the regions of non-HSEP implementation, whereas this was not the case in the regions of HSEP implementation. We discussed the implications of these findings for the potential impact of school choice policies on educational inequality. PMID:24834021
Mechanically verified hardware implementing an 8-bit parallel IO Byzantine agreement processor
NASA Technical Reports Server (NTRS)
Moore, J. Strother
1992-01-01
Consider a network of four processors that use the Oral Messages (Byzantine Generals) Algorithm of Pease, Shostak, and Lamport to achieve agreement in the presence of faults. Bevier and Young have published a functional description of a single processor that, when interconnected appropriately with three identical others, implements this network under the assumption that the four processors step in synchrony. By formalizing the original Pease et al. work, Bevier and Young mechanically proved that such a network achieves fault tolerance. We develop, formalize, and discuss a hardware design that has been mechanically proven to implement their processor. In particular, we formally define mapping functions from the abstract state space of the Bevier-Young processor to a concrete state space of a hardware module and state a theorem that expresses the claim that the hardware correctly implements the processor. We briefly discuss the Brock-Hunt Formal Hardware Description Language which permits designs both to be proved correct with the Boyer-Moore theorem prover and to be expressed in a commercially supported hardware description language for additional electrical analysis and layout. We briefly describe our implementation.
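The four-processor agreement scheme described above can be illustrated with a toy, unverified Python sketch of the Oral Messages algorithm OM(1), tolerating one traitor. The message plumbing and the simple "flip the bit" fault model are our own simplification for illustration, not the Bevier-Young formalization:

```python
from collections import Counter

def om1(commander_value, traitor=None):
    """OM(1): processor 0 is the commander, processors 1-3 are lieutenants.
    `traitor`, if set, names one processor that relays a flipped bit."""
    def send(sender, value):
        # A traitorous sender relays an inverted bit (one simple fault model).
        return 1 - value if sender == traitor else value

    # Round 1: the commander sends its value to each lieutenant.
    received = {lt: send(0, commander_value) for lt in (1, 2, 3)}

    # Round 2: each lieutenant relays what it received to the other two,
    # then decides by majority over its own value and the relayed values.
    decisions = {}
    for lt in (1, 2, 3):
        votes = [received[lt]]
        votes += [send(other, received[other])
                  for other in (1, 2, 3) if other != lt]
        decisions[lt] = Counter(votes).most_common(1)[0][0]
    return decisions
```

With a single traitorous lieutenant, the loyal lieutenants still decide on the commander's value; with a (simply modeled) traitorous commander, all lieutenants still agree with each other, which is the interactive-consistency property the mechanical proof establishes in full generality.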
Factors that enable and hinder the implementation of projects in the alcohol and other drug field.
MacLean, Sarah; Berends, Lynda; Hunter, Barbara; Roberts, Bridget; Mugavin, Janette
2012-02-01
Few studies systematically explore elements of successful project implementation across a range of alcohol and other drug (AOD) activities. This paper provides an evidence base to inform project implementation in the AOD field. We accessed records for 127 completed projects funded by the Alcohol, Education and Rehabilitation Foundation from 2002 to 2008. An adapted realist synthesis methodology enabled us to develop categories of enablers and barriers to successful project implementation, and to identify factors statistically associated with successful project implementation, defined as meeting all funding objectives. Thematic analysis of eight case study projects allowed detailed exploration of findings. Nine enabler and 10 barrier categories were identified. Those most frequently reported as both barriers and enablers concerned partnerships with external agencies and communities, staffing and project design. Achieving supportive relationships with partner agencies and communities, employing skilled staff and implementing consumer or participant input mechanisms were statistically associated with successful project implementation. The framework described here will support development of evidence-based project funding guidelines and project performance indicators. The study provides evidence that investing project hours and resources to develop robust relationships with project partners and communities, implementing mechanisms for consumer or participant input and attracting skilled staff are legitimate and important activities, not just in themselves but because they potentially influence achievement of project funding objectives. © 2012 The Authors. ANZJPH © 2012 Public Health Association of Australia.
TotalReCaller: improved accuracy and performance via integrated alignment and base-calling.
Menges, Fabian; Narzisi, Giuseppe; Mishra, Bud
2011-09-01
Currently, re-sequencing approaches use multiple modules serially to interpret raw sequencing data from next-generation sequencing platforms, while remaining oblivious to the genomic information until the final alignment step. Such approaches fail to exploit the full information from both raw sequencing data and the reference genome that can yield better quality sequence reads, SNP-calls, variant detection, as well as an alignment at the best possible location in the reference genome. Thus, there is a need for novel reference-guided bioinformatics algorithms for interpreting analog signals representing sequences of the bases ({A, C, G, T}), while simultaneously aligning possible sequence reads to a source reference genome whenever available. Here, we propose a new base-calling algorithm, TotalReCaller, to achieve improved performance. A linear error model for the raw intensity data and Burrows-Wheeler transform (BWT) based alignment are combined utilizing a Bayesian score function, which is then globally optimized over all possible genomic locations using an efficient branch-and-bound approach. The algorithm has been implemented in software and in hardware [a field-programmable gate array (FPGA)] to achieve real-time performance. Empirical results on real high-throughput Illumina data were used to evaluate TotalReCaller's performance relative to its peers (Bustard, BayesCall, Ibis and Rolexa) based on several criteria, particularly those important in clinical and scientific applications. Namely, it was evaluated for (i) its base-calling speed and throughput, (ii) its read accuracy and (iii) its specificity and sensitivity in variant calling. A software implementation of TotalReCaller, as well as additional information, is available at http://bioinformatics.nyu.edu/wordpress/projects/totalrecaller/. Contact: fabian.menges@nyu.edu.
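The core idea of reference-guided base calling, scoring each candidate base jointly by a likelihood from the raw channel intensities and a prior from the aligned reference base, can be caricatured in a single-position toy. The numbers, the `ref_weight` parameter, and the flat prior are illustrative assumptions of ours; TotalReCaller's actual model couples a linear intensity error model with BWT alignment and branch-and-bound search over genomic locations:

```python
import numpy as np

BASES = "ACGT"

def call_base(intensities, ref_base, ref_weight=0.4):
    """Toy Bayesian base call at one position.
    intensities: raw signal per channel (A, C, G, T); ref_base: the base
    the reference genome shows at the aligned position."""
    like = np.asarray(intensities, dtype=float)
    like = like / like.sum()                     # likelihood from raw channels
    prior = np.full(4, (1.0 - ref_weight) / 3)   # prior from the reference
    prior[BASES.index(ref_base)] = ref_weight
    posterior = like * prior                     # unnormalized Bayesian score
    return BASES[int(np.argmax(posterior))]
```

With a flat prior (`ref_weight=0.25`) the call is driven by intensities alone; a stronger reference prior can resolve an ambiguous channel reading in favor of the reference base, which is the benefit the integrated approach exploits.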
Klumb, Evandro Mendes; Silva, Clovis Artur Almeida; Lanna, Cristina Costa Duarte; Sato, Emilia Inoue; Borba, Eduardo Ferreira; Brenol, João Carlos Tavares; de Albuquerque, Elisa Martins das Neves; Monticielo, Odirlei Andre; Costallat, Lilian Tereza Lavras; Latorre, Luiz Carlos; Sauma, Maria de Fátima Lobato da Cunha; Bonfá, Eloisa Silva Dutra de Oliveira; Ribeiro, Francinne Machado
2015-01-01
To develop recommendations for the diagnosis, management and treatment of lupus nephritis in Brazil. Extensive literature review with a selection of papers based on the strength of scientific evidence and opinion of the Commission on Systemic Lupus Erythematosus members, Brazilian Society of Rheumatology. 1) Renal biopsy should be performed whenever possible, if the procedure is indicated; when it is not possible, treatment should be guided by inference of the histologic class. 2) Ideally, measures and precautions should be implemented before starting treatment, with emphasis on attention to the risk of infection. 3) Risks and benefits of treatment should be shared with the patient and his/her family. 4) The use of hydroxychloroquine (preferably) or chloroquine diphosphate is recommended for all patients (unless contraindicated) during induction and maintenance phases. 5) The evaluation of the effectiveness of treatment should be made with objective criteria of response (complete remission/partial remission/refractoriness). 6) ACE inhibitors and/or ARBs are recommended as antiproteinuric agents for all patients (unless contraindicated). 7) The identification of clinical and/or laboratory signs suggestive of proliferative or membranous glomerulonephritis should prompt immediate implementation of specific therapy, including steroids and an immunosuppressive agent, even when histological confirmation is not possible. 8) Immunosuppressives must be used for at least 36 months, but these medications can be kept for longer periods. They should be discontinued only when the patient achieves and maintains a sustained and complete remission. 9) Lupus nephritis should be considered refractory when full or partial remission is not achieved after 12 months of appropriate treatment, in which case a new renal biopsy should be considered to help identify the cause of refractoriness and guide the therapeutic decision.
Copyright © 2014 Elsevier Editora Ltda. All rights reserved.
Autonomous control system reconfiguration for spacecraft with non-redundant actuators
NASA Astrophysics Data System (ADS)
Grossman, Walter
1995-05-01
The Small Satellite Technology Initiative (SSTI) 'CLARK' spacecraft is required to be single-failure tolerant, i.e., no failure of any single component or subsystem shall result in complete mission loss. Fault tolerance is usually achieved by implementing redundant subsystems. Fault-tolerant systems are therefore heavier and cost more to build and launch than non-redundant, non-fault-tolerant spacecraft. The SSTI CLARK satellite Attitude Determination and Control System (ADACS) achieves single-fault tolerance without redundancy. The attitude determination system uses a Kalman filter which is inherently robust to loss of any single attitude sensor. The attitude control system uses three orthogonal reaction wheels for attitude control and three magnetic dipoles for momentum control. The nominal six-actuator control system functions by projecting the attitude correction torque onto the reaction wheels while a slower momentum-management outer loop removes the excess momentum in the direction normal to the local B field. The actuators are not redundant, so the nominal control law cannot be implemented in the event of a loss of a single actuator (dipole or reaction wheel). The spacecraft dynamical state (attitude, angular rate, and momentum) is controllable from any five-element subset of the six actuators. With loss of an actuator the instantaneous control authority may not span \(\mathbb{R}^3\), but the controllability Gramian \(\int_0^t \Phi(t,\tau)B(\tau)B'(\tau)\Phi'(t,\tau)\,d\tau\) retains full rank. Upon detection of an actuator failure the control torque is decomposed onto the remaining active axes. The attitude control torque is effected and the over-orbit momentum is controlled. The resulting control system performance approaches that of the nominal system.
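The controllability-after-failure claim can be illustrated numerically on a hypothetical toy model (not the CLARK flight dynamics): three-axis rigid-body attitude as decoupled double integrators, three reaction wheels, and three dipoles whose torque is the cross product of the dipole axis with an assumed fixed local B-field direction. For this time-invariant toy, the Kalman rank test plays the role of the Gramian full-rank condition:

```python
import numpy as np

# State x = [theta; omega]: three decoupled double integrators (toy model).
A = np.block([[np.zeros((3, 3)), np.eye(3)],
              [np.zeros((3, 3)), np.zeros((3, 3))]])

b_hat   = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)   # assumed B-field direction
wheels  = np.eye(3)                                # reaction-wheel torque axes
dipoles = np.column_stack([np.cross(e, b_hat) for e in np.eye(3)])
T = np.hstack([wheels, dipoles])                   # 3 x 6 actuator-to-torque map
B = np.vstack([np.zeros((3, 6)), T])               # torques drive omega-dot

def controllable(B_act):
    """Kalman rank test: rank [B, AB, ..., A^(n-1)B] == n."""
    n = A.shape[0]
    C = np.hstack([np.linalg.matrix_power(A, k) @ B_act for k in range(n)])
    return int(np.linalg.matrix_rank(C)) == n
```

Deleting any single column of `B` (any one wheel or dipole) leaves the rank test satisfied in this toy, mirroring the paper's claim that the state remains controllable from any five-actuator subset even though the instantaneous torque authority may momentarily lose a dimension.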
Sgaier, Sema K; Baer, James; Rutz, Daniel C; Njeuhmeli, Emmanuel; Seifert-Ahanda, Kim; Basinga, Paulin; Parkyn, Rosie; Laube, Catharine
2015-01-01
By the end of 2014, an estimated 8.5 million men had undergone voluntary medical male circumcision (VMMC) for HIV prevention in 14 priority countries in eastern and southern Africa, representing more than 40% of the global target. However, demand, especially among men most at risk for HIV infection, remains a barrier to realizing the program's full scale and potential impact. We analyzed current demand generation interventions for VMMC by reviewing the available literature and reporting on field visits to programs in 7 priority countries. We present our findings and recommendations using a framework with 4 components: insight development; intervention design; implementation and coordination to achieve scale; and measurement, learning, and evaluation. Most program strategies lacked comprehensive insight development; formative research usually comprised general acceptability studies. Demand generation interventions varied across the countries, from advocacy with community leaders and community mobilization to use of interpersonal communication, mid- and mass media, and new technologies. Some shortcomings in intervention design included using general instead of tailored messaging, focusing solely on the HIV preventive benefits of VMMC, and rolling out individual interventions to address specific barriers rather than a holistic package. Interventions have often been scaled-up without first being evaluated for effectiveness and cost-effectiveness. We recommend national programs create coordinated demand generation interventions, based on insights from multiple disciplines, tailored to the needs and aspirations of defined subsets of the target population, rather than focused exclusively on HIV prevention goals. Programs should implement a comprehensive intervention package with multiple messages and channels, strengthened through continuous monitoring. 
These insights may be broadly applicable to other programs where voluntary behavior change is essential to achieving public health benefits. PMID:26085019
NASA Technical Reports Server (NTRS)
Robinson, Julie A.; Harm, Deborah L.
2009-01-01
As the International Space Station (ISS) nears completion, and full international utilization is achieved, we are at a scientific crossroads. ISS is the premier location for research aimed at understanding the effects of microgravity on the human body. For applications to future human exploration, it is key for validation, quantification, and mitigation of a wide variety of spaceflight risks to health and human performance. Understanding and mitigating these risks is the focus of NASA's Human Research Program. However, NASA's approach to defining human research objectives is only one of many approaches within the ISS international partnership (including Roscosmos, the European Space Agency, the Canadian Space Agency, and the Japan Aerospace Exploration Agency). Each of these agencies selects and implements its own ISS research, with independent but related objectives for human and life sciences research. Because the science itself is also international and collaborative, investigations that are led by one ISS partner also often include cooperating scientists from around the world. The operation of the ISS generates significant additional data that are not directly linked to specific investigations. Such data come from medical monitoring of crew members, life support and radiation monitoring, and from the systems that have been implemented to protect the health of the crew (such as exercise hardware). We provide examples of these international synergies in human research on ISS and highlight key early accomplishments that derive from these broad interfaces. Taken as a whole, the combination of diverse research objectives, operational data, international sharing of research resources on ISS, and scientific collaboration provides a robust research approach and capability that no one partner could achieve alone.
Parks, Michael J; Kingsbury, John H; Boyle, Raymond G; Choi, Kelvin
2017-10-01
Tobacco use is a leading behavioral risk factor for morbidity and mortality, and the tobacco epidemic disproportionately affects low-socioeconomic status (SES) populations. Taxation is effective for reducing cigarette use, and it is an effective population-based policy for reducing SES-related tobacco disparities. However, progress in implementing cigarette excise taxes has stalled across the United States, and there is a dearth of research on the full spectrum of behavioral shifts that result from taxes, particularly among low-SES populations. This project documents the impact of Minnesota's $1.75 cigarette tax increase implemented in 2013. Data come from the 2014 Minnesota Adult Tobacco Survey. Descriptive analyses and Latent Class Analysis (LCA) were used to provide a typology of the tax impact. From the LCA, six classes were identified, and 42% of respondents were classified as reporting action-oriented behavioral change related to the tax; 8% reported sustained smoking abstinence. We found differential behavior change across levels of SES. Low-SES and medium/high-SES individuals were equally likely to report complete tobacco cessation, but the prevalence of daily smokers who reported action-oriented behavior without sustained cessation was nearly double for low-SES individuals. Smokers report a range of behavioral changes in response to cigarette taxes, with differences across SES. The majority of smokers, and particularly low-SES smokers, report behavioral steps toward quitting or achieving sustained tobacco cessation in response to cigarette taxes. Complementary population-based programs geared toward assisting individuals, especially low-SES individuals, to achieve continuous tobacco cessation could increase the reach and effectiveness of cigarette taxes. Copyright © 2017 Elsevier Ltd. All rights reserved.
New parity, same old attitude towards psychotherapy?
Clemens, Norman A
2010-03-01
Full parity of health insurance benefits for treatment of mental illness, including substance use disorders, is a major achievement. However, the newly-published regulations implementing the legislation strongly endorse aggressive managed care as a way of containing costs for the new equality of coverage. Reductions in "very long episodes of out-patient care," hospitalization, and provider fees, along with increased utilization, are singled out as achievements of managed care. Medical appropriateness as defined by expert medical panels is to be the basis of authorizing care, though clinicians are familiar with a history of insurance companies' application of "medical necessity" to their own advantage. The regulations do not single out psychotherapy for attention, but long-term psychotherapy geared to the needs of each patient appears to be at risk. The author recommends that the mental health professions strongly advocate for the growing evidence base for psychotherapy including long-term therapy for complex mental disorders; respect for the structure and process of psychotherapy individualized to patients' needs; awareness of the costs of aggressive managed care in terms of money, time, administrative burden, and interference with the therapy; and recognition of the extensive training and experience required to provide psychotherapy as well as the stresses and demands of the work. Parity in out-of-network benefits could lead to aggressive management of care given by non-network practitioners. Since a large percentage of psychiatrists and other mental health professionals stay out of networks, implementation of parity for out-of-network providers will have to be done in a way that respects the conditions under which they would be willing and able to provide services, especially psychotherapy, to insured patients. The shortage of psychiatrists makes this an important access issue for the insured population in need of care.
NASA Astrophysics Data System (ADS)
Martin, Lynn A.
The purpose of this study was to examine the relationship between teachers' self-reported preparedness for teaching science content and their instructional practices, and the science achievement of eighth grade science students in the United States as demonstrated by TIMSS 2007. Six hundred eighty-seven eighth grade science teachers in the United States, representing 7,377 students, responded to the TIMSS 2007 questionnaire about their instructional preparedness and their instructional practices. Quantitative data were reported. Through correlation analysis, the researcher found statistically significant positive relationships between eighth grade science teachers' main area of study and their self-reported beliefs about their preparedness to teach that same content area. A second correlation analysis found a statistically significant negative relationship between teachers' self-reported use of inquiry-based instruction and preparedness to teach chemistry, physics and earth science. A third correlation analysis found a statistically significant positive relationship between physics preparedness and student science achievement. Finally, a correlation analysis found a statistically significant positive relationship between science teachers' self-reported implementation of inquiry-based instructional practices and student achievement. These findings support the conclusion that teachers who feel prepared to teach science content and who implement more inquiry-based instruction and less didactic instruction produce higher-achieving science students. As science teachers obtain the appropriate knowledge in science content and pedagogy, they will feel prepared and will implement inquiry-based instruction in science classrooms.
Case Study on the Journey of an Elementary School Labeled as a Persistently Low-Achieving School
ERIC Educational Resources Information Center
Duncan, Annette
2012-01-01
This study examined an elementary school, located in an urban school district, which was labeled as a Persistently Low-Achieving School (PLAS) by the federal government in 2009 in order to determine how the school planned to change leadership and staff; increase student achievement; and implement new approaches for changes in school climate. The…
ERIC Educational Resources Information Center
Brooks, Sherri L.
2013-01-01
The purpose of this correlational study was to determine if there was a relationship between professional learning community (PLC), personal teacher efficacy (PTE), and student achievement. The study examined teacher perception of PLC implementation and PET as it related to student achievement at the high school level on the Virginia End-of Course…
ERIC Educational Resources Information Center
Emmett, Joshua; McGee, Dean
2013-01-01
The purpose of this case study was to discover the critical attributes of a student achievement program, known as "Think Gold," implemented at one urban comprehensive high school as part of the improvement process. Student achievement on state assessments improved during the period under study. The study draws upon perspectives on…
ERIC Educational Resources Information Center
What Works Clearinghouse, 2008
2008-01-01
This review examined a study designed to evaluate whether the Achievement Challenge Pilot Project, a performance-pay program for teachers, improved the academic achievement of elementary school students. Study authors reported higher student test score gains for students in schools that implemented the performance-pay program than for students in…
ERIC Educational Resources Information Center
Lynch, Virginia N.
2013-01-01
Use of external vendors to implement school reform and address student achievement in urban secondary schools has not been studied. This quasi-experimental longitudinal study focused on changes in student achievement among urban 9th grade students during the 2010-2011 school year. The theoretical framework was the transformational model of using…
Hearing protector fit testing with off-shore oil-rig inspectors in Louisiana and Texas.
Murphy, William J; Themann, Christa L; Murata, Taichi K
2016-11-01
This field study aimed to assess the noise reduction of hearing protection for individual workers, demonstrate the effectiveness of training on the level of protection achieved, and measure the time required to implement hearing protector fit testing in the workplace. The National Institute for Occupational Safety and Health (NIOSH) conducted field studies in Louisiana and Texas to test the performance of HPD Well-Fit. Fit tests were performed on 126 inspectors and engineers working in the offshore oil industry. Workers were fit tested with the goal of achieving a 25-dB personal attenuation rating (PAR). Less than half of the workers were achieving sufficient protection from their hearing protectors prior to NIOSH intervention and training; following re-fitting and re-training, over 85% of the workers achieved sufficient protection. Typical test times were 6-12 minutes. Fit testing of the workers' earplugs identified those workers who were and were not achieving the desired level of protection. Recommendations for other hearing protection solutions were made for workers who could not achieve the target PAR. The study demonstrates the need for individual hearing protector fit testing and addresses some of the barriers to implementation.
Bandwidth efficient coding for satellite communications
NASA Technical Reports Server (NTRS)
Lin, Shu; Costello, Daniel J., Jr.; Miller, Warner H.; Morakis, James C.; Poland, William B., Jr.
1992-01-01
An error control coding scheme was devised to achieve large coding gain and high reliability by using coded modulation with reduced decoding complexity. To achieve a 3 to 5 dB coding gain with moderate reliability, the decoding complexity is quite modest. In fact, to achieve a 3 dB coding gain, the decoding complexity is quite simple, no matter whether trellis coded modulation or block coded modulation is used. However, to achieve coding gains exceeding 5 dB, the decoding complexity increases drastically, and the implementation of the decoder becomes very expensive and impractical. The use of coded modulation in conjunction with concatenated (or cascaded) coding is proposed. A good short bandwidth-efficient modulation code is used as the inner code and a relatively powerful Reed-Solomon code is used as the outer code. With properly chosen inner and outer codes, a concatenated coded modulation scheme not only can achieve large coding gains and high reliability with good bandwidth efficiency but also can be practically implemented. This combination of coded modulation and concatenated coding offers a way of achieving the best of three worlds: high reliability and coding gain, bandwidth efficiency, and manageable decoding complexity.
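The concatenation principle, an inner code cleaning up channel errors and an outer code mopping up the inner decoder's residual mistakes, can be sketched with toy stand-ins chosen purely for brevity: a Hamming(7,4) outer code and a (3,1) repetition inner code, not the modulation and Reed-Solomon codes studied in the paper:

```python
import numpy as np

# Hamming(7,4) generator and parity-check matrices (systematic form).
G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])

def encode(data4):
    outer = data4 @ G % 2          # outer: 4 data bits -> 7-bit codeword
    return np.repeat(outer, 3)     # inner: send each coded bit 3 times

def decode(rx):
    # Inner decode: majority vote over each transmitted triple.
    inner = (rx.reshape(-1, 3).sum(axis=1) >= 2).astype(int)
    # Outer decode: syndrome locates a single residual bit error.
    syndrome = H @ inner % 2
    if syndrome.any():
        pos = np.where((H.T == syndrome).all(axis=1))[0][0]
        inner[pos] ^= 1
    return inner[:4]               # data bits (code is systematic)
```

The inner repetition code corrects one flip per triple on its own; when a triple suffers two flips and the inner decoder gets that bit wrong, the outer Hamming code still recovers the data, which is the error-cleanup division of labor that makes well-chosen concatenated schemes both powerful and cheap to decode.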
Implementation of the ATLAS trigger within the multi-threaded software framework AthenaMT
NASA Astrophysics Data System (ADS)
Wynne, Ben; ATLAS Collaboration
2017-10-01
We present an implementation of the ATLAS High Level Trigger, HLT, that provides parallel execution of trigger algorithms within the ATLAS multithreaded software framework, AthenaMT. This development will enable the ATLAS HLT to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the HLT input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that each execute algorithms sequentially for different events. AthenaMT will provide a fully multi-threaded environment that will additionally enable concurrent execution of algorithms within an event. This has the potential to significantly reduce the memory footprint on future manycore devices. An additional benefit of the HLT implementation within AthenaMT is that it facilitates the integration of offline code into the HLT. The trigger must retain high rejection in the face of increasing numbers of pileup collisions. This will be achieved by greater use of offline algorithms that are designed to maximize the discrimination of signal from background. Therefore a unification of the HLT and offline reconstruction software environment is required. This has been achieved while at the same time retaining important HLT-specific optimisations that minimize the computation performed to reach a trigger decision. Such optimizations include early event rejection and reconstruction within restricted geometrical regions. We report on an HLT prototype in which the need for HLT-specific components has been reduced to a minimum. 
Promising results have been obtained with a prototype that includes the key elements of trigger functionality including regional reconstruction and early event rejection. We report on the first experience of migrating trigger selections to this new framework and present the next steps towards a full implementation of the ATLAS trigger.
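The intra-event concurrency that AthenaMT enables can be caricatured in a few lines of Python. The algorithm names, thresholds, and event dictionaries below are hypothetical stand-ins (real HLT algorithms are C++ components scheduled by the framework); the sketch only shows the shape of the idea: independent reconstruction steps for one event running concurrently, with the event accepted only if every selection passes:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical selection steps standing in for trigger algorithms.
def track_selection(event):
    return event["hits"] > 3          # stand-in for track reconstruction

def calo_selection(event):
    return event["energy"] > 20.0     # stand-in for calorimeter reconstruction

def process_event(event, pool):
    # Independent algorithms for the same event run concurrently;
    # accept the event only if all selections pass.
    futures = [pool.submit(alg, event)
               for alg in (track_selection, calo_selection)]
    return all(f.result() for f in futures)

with ThreadPoolExecutor(max_workers=4) as pool:
    events = [{"hits": 5, "energy": 50.0}, {"hits": 1, "energy": 50.0}]
    accepted = [e for e in events if process_event(e, pool)]
```

A production framework adds what this sketch omits: early rejection (abandoning remaining work as soon as one required selection fails) and reconstruction restricted to regions of interest, both of which minimize the computation spent per trigger decision.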
Code of Federal Regulations, 2011 CFR
2011-07-01
... IMPLEMENTATION OF RECIPROCITY OF FACILITIES Guidelines for the Implementation and Oversight of the Policy on Reciprocity of use and Inspections of Facilities § 148.10 General. (a) Redundant, overlapping, and duplicative... excessive protection and unnecessary expenditure of funds. Lack of reciprocity has also impeded achievement...
Code of Federal Regulations, 2010 CFR
2010-07-01
... IMPLEMENTATION OF RECIPROCITY OF FACILITIES Guidelines for the Implementation and Oversight of the Policy on Reciprocity of use and Inspections of Facilities § 148.10 General. (a) Redundant, overlapping, and duplicative... excessive protection and unnecessary expenditure of funds. Lack of reciprocity has also impeded achievement...
Code of Federal Regulations, 2014 CFR
2014-07-01
... IMPLEMENTATION OF RECIPROCITY OF FACILITIES Guidelines for the Implementation and Oversight of the Policy on Reciprocity of use and Inspections of Facilities § 148.10 General. (a) Redundant, overlapping, and duplicative... excessive protection and unnecessary expenditure of funds. Lack of reciprocity has also impeded achievement...
Code of Federal Regulations, 2012 CFR
2012-07-01
... IMPLEMENTATION OF RECIPROCITY OF FACILITIES Guidelines for the Implementation and Oversight of the Policy on Reciprocity of use and Inspections of Facilities § 148.10 General. (a) Redundant, overlapping, and duplicative... excessive protection and unnecessary expenditure of funds. Lack of reciprocity has also impeded achievement...
Code of Federal Regulations, 2013 CFR
2013-07-01
... IMPLEMENTATION OF RECIPROCITY OF FACILITIES Guidelines for the Implementation and Oversight of the Policy on Reciprocity of use and Inspections of Facilities § 148.10 General. (a) Redundant, overlapping, and duplicative... excessive protection and unnecessary expenditure of funds. Lack of reciprocity has also impeded achievement...
Filling the Black Box of Implementation for Health-Promoting Schools
ERIC Educational Resources Information Center
Rowling, Louise; Samdal, Oddrun
2011-01-01
Purpose: Achieving organisational learning and greater specificity for implementation action for health-promoting schools requires detailed understanding of the necessary components. They include: preparing and planning for school development, policy and institutional anchoring, professional development and learning, leadership and management…
NASA Technical Reports Server (NTRS)
Lin, Shu (Principal Investigator); Uehara, Gregory T.; Nakamura, Eric; Chu, Cecilia W. P.
1996-01-01
The (64, 40, 8) subcode of the third-order Reed-Muller (RM) code for high-speed satellite communications is proposed. The RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. The progress made toward achieving the goal of implementing a decoder system based upon this code is summarized. The development of the prototype sub-trellis integrated circuit (IC), with particular focus on the design methodology, is addressed.
A Comparative Study on the Architecture Internet of Things and its’ Implementation method
NASA Astrophysics Data System (ADS)
Xiao, Zhiliang
2017-08-01
With the rapid development of science and technology, the Internet-based Internet of Things (IoT) has emerged and achieved good results. To build a complete IoT system and realize practical IoT designs, the indicators of candidate IoT network architectures need to be compared; on that basis, the ways in which the IoT is connected can be examined in greater depth, so as to unify the IoT architecture with its implementation methods. This paper analyzes two types of IoT systems, makes a brief comparative study of their important indicators, and then, starting from the concept and architecture of the IoT, introduces the connection and implementation methods of the Internet of Things.
24-Hour Relativistic Bit Commitment.
Verbanis, Ephanielle; Martin, Anthony; Houlmann, Raphaël; Boso, Gianluca; Bussières, Félix; Zbinden, Hugo
2016-09-30
Bit commitment is a fundamental cryptographic primitive in which a party wishes to commit a secret bit to another party. Perfect security between mistrustful parties is unfortunately impossible to achieve through the asynchronous exchange of classical and quantum messages. Perfect security can nonetheless be achieved if each party splits into two agents exchanging classical information at times and locations satisfying strict relativistic constraints. A relativistic multiround protocol to achieve this was previously proposed and used to implement a 2-millisecond commitment time. Much longer durations were initially thought to be insecure, but recent theoretical progress showed that this is not so. In this Letter, we report on the implementation of a 24-hour bit commitment solely based on timed high-speed optical communication and fast data processing, with all agents located within the city of Geneva. This duration is more than 6 orders of magnitude longer than before, and we argue that it could be extended to one year and allow much more flexibility on the locations of the agents. Our implementation offers a practical and viable solution for use in applications such as digital signatures, secure voting and honesty-preserving auctions.
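For readers unfamiliar with the primitive itself, a minimal classical commitment sketch follows. Note the contrast with the paper: this hash-based construction is only computationally binding and hiding, which is exactly the kind of assumption the relativistic protocol avoids by relying on timing and location constraints instead:

```python
import hashlib
import os

def commit(bit):
    """Commit phase: publish a digest that (computationally) binds to `bit`."""
    nonce = os.urandom(32)                        # randomness hides the bit
    digest = hashlib.sha256(nonce + bytes([bit])).hexdigest()
    return digest, (nonce, bit)                   # send digest; keep opening

def reveal(digest, opening):
    """Reveal phase: check the opening against the published digest."""
    nonce, bit = opening
    if hashlib.sha256(nonce + bytes([bit])).hexdigest() == digest:
        return bit
    return None                                   # inconsistent opening
```

The committer cannot later open the digest to the other bit without finding a hash collision, and the receiver learns nothing about the bit from the digest alone; the relativistic scheme achieves analogous binding and hiding unconditionally, at the cost of the agent-splitting and timing infrastructure the paper describes.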
Achievement of learning outcome after implemented physical modules based on problem based learning
NASA Astrophysics Data System (ADS)
Isna, R.; Masykuri, M.; Sukarmin
2018-03-01
Implementation of Problem Based Learning (PBL) modules can develop students' skills in solving problems of daily life and prepare them for higher levels of education. The purpose of this research is to assess the achievement of learning outcomes after implementing a PBL-based physics module on Newton's law of gravity. The research used an experimental method with a posttest-only group design. The achievement of student learning outcomes was analyzed using a t test in SPSS 18. The results show that the average student learning outcome after applying the PBL-based physics module reached the minimum mastery criterion. In addition, students' scientific attitudes improved at each meeting. The presentation activities included in the learning stages also trained students' speaking skills and broadened their knowledge. Given some shortcomings observed during the study, it is suggested that the problems raised in learning should be close to students' lives, so that students are more active and enthusiastic in learning physics.
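The analysis described above (a t test of posttest scores against a minimum mastery criterion) was run in SPSS 18; the same one-sample test can be sketched in plain Python. The scores and the criterion of 70 below are hypothetical, for illustration only:

```python
import math
import statistics

# Hypothetical posttest scores for one class (the study's data are not given)
posttest_scores = [78, 82, 75, 88, 71, 80, 85, 77, 90, 74]
criterion = 70  # assumed minimum mastery criterion (KKM)

n = len(posttest_scores)
mean = statistics.mean(posttest_scores)
sd = statistics.stdev(posttest_scores)  # sample standard deviation (n - 1)

# One-sample t statistic: does the class mean exceed the criterion?
t_stat = (mean - criterion) / (sd / math.sqrt(n))
print(f"mean = {mean:.1f}, t = {t_stat:.2f} (df = {n - 1})")
```

A positive t statistic large enough to exceed the critical value for n - 1 degrees of freedom supports the conclusion that the average outcome is above the mastery criterion.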
Applying industrial symbiosis to chemical industry: A literature review
NASA Astrophysics Data System (ADS)
Cui, Hua; Liu, Changhao
2017-08-01
The chemical industry plays an important role in the development of the global economy and human society. However, the negative effects of chemical production cannot be ignored: it often causes serious resource consumption and environmental pollution, so achieving sustainable development is essential for the industry. Industrial symbiosis (IS), a key topic in industrial ecology and the circular economy, has been identified as a creative path toward sustainability. Based on an extensive search of the literature linking industrial symbiosis with the chemical industry, this paper reviews three aspects: (1) the economic and environmental benefits achieved by the chemical industry through implementing industrial symbiosis, (2) chemical eco-industrial parks, and (3) safety issues in the chemical industry. An outlook is also provided. The paper concludes that (1) the chemical industry can achieve both economic and environmental benefits by implementing industrial symbiosis, (2) establishing eco-industrial parks is essential for the chemical industry to implement and improve industrial symbiosis, and (3) there is a close relationship between IS and the safety of the chemical industry.
ERIC Educational Resources Information Center
Brenton, Beatrice White; Gilmore, Doug
1976-01-01
An operational index of discrepancy to assist in identifying learning disabilities was derived using the Full Scale IQ, Wechsler Intelligence Scale for Children, and relevant subtest scores on the Peabody Individual Achievement Test. Considerable caution should be exercised when classifying children, especially females, as learning disabled.…
André, Beate; Sjøvold, Endre
2017-07-14
To successfully achieve change in healthcare, a balance between technology and "peopleware", the human resources, is necessary. However, the human aspect of the change implementation process has received less attention than the technological issues. The aim was to explore the factors that characterize the work culture of a hospital unit that successfully implemented change, compared with the work culture of a hospital unit where implementation was unsuccessful. The Systematizing Person-Group Relations method was used to gather and analyze data, exploring what dominates behavior in a particular work environment and identifying challenges, limitations, and opportunities. The method applies six dimensions, each representing a different behavior in a work culture: Synergy, Withdrawal, Opposition, Dependence, Control, and Nurture. We compared two units at the same hospital, one that successfully implemented change and one that did not. There were statistically significant differences between healthcare personnel at the successful and unsuccessful units, in both the Synergy and Control dimensions, which represent important positive qualities in a work culture. The results of this study show that healthcare personnel at a unit with successful implementation of change have a working environment with many positive qualities. This indicates that a work environment with a strong focus on goal achievement and task orientation can handle the challenges of implementing change.
Adams, Vanessa M.; Pressey, Robert L.
2014-01-01
Land use change is the most significant driver linked to global species extinctions. In Northern Australia, the landscape is still relatively intact, with very low levels of clearing. However, a re-energized political discourse around creating a northern food bowl means that currently intact ecosystems in northern Australia could be under imminent threat from increased land clearing and water extraction. These impacts are likely to be concentrated in a few regions with suitable soils and water supplies. The Daly River Catchment in the Northern Territory is an important catchment for both conservation and development. Land use in the Daly catchment has been subject to clearing guidelines that are largely untested in terms of their eventual implications for the spatial configuration of conservation and development. Given that the guidelines are not legislated, they might also be removed or revised by subsequent Territory Governments, including the recently elected one. We examine the uncertainties around the spatial implications of full implementation of the Daly clearing guidelines and their potential effects on equity of opportunity across land tenures and land uses. We also examine how removal of the guidelines could affect conservation in the catchment. We conclude that the guidelines are important in supporting development in the catchment while still achieving conservation goals, and we recommend ways of implementing the guidelines to make best use of available land resources for intensified production. PMID:24798486
Implementation of Complex Signal Processing Algorithms for Position-Sensitive Microcalorimeters
NASA Technical Reports Server (NTRS)
Smith, Stephen J.
2008-01-01
We have recently reported a theoretical digital signal-processing algorithm for improved energy and position resolution in position-sensitive, transition-edge sensor (PoST) X-ray detectors [Smith et al., Nucl. Instr. and Meth. A 556 (2006) 237]. PoSTs consist of one or more transition-edge sensors (TESs) on a large continuous or pixellated X-ray absorber and are under development as an alternative to arrays of single-pixel TESs. PoSTs provide a means to increase the field of view for the fewest number of read-out channels. In this contribution we extend the theoretical correlated energy-position optimal filter (CEPOF) algorithm (originally developed for 2-TES continuous-absorber PoSTs) to investigate its practical implementation on multi-pixel single-TES PoSTs, or Hydras. We use numerically simulated data for a nine-absorber device, including realistic detector noise, to demonstrate an iterative scheme that converges on the correct photon absorption position and energy without any a priori assumptions. The position sensitivity of the CEPOF implemented on simulated data agrees very well with the theoretically predicted resolution. We discuss practical issues such as the impact of the random arrival phase of the measured data on the performance of the CEPOF. The CEPOF algorithm demonstrates that full-width-at-half-maximum energy resolution of < 8 eV, coupled with position sensitivity down to a few hundred eV, should be achievable for a fully optimized device.
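The CEPOF algorithm jointly estimates photon energy and absorption position; the template-fitting idea underneath it can be illustrated with a single-pulse optimal filter under white noise. Everything below (pulse shape, time constants, energy value) is hypothetical and greatly simplified relative to the detector model in the paper:

```python
import math

def pulse_template(n: int, tau_rise: float, tau_fall: float) -> list[float]:
    """Idealized TES pulse shape: difference of two exponentials,
    normalized to unit peak height. Time constants are in samples."""
    p = [math.exp(-i / tau_fall) - math.exp(-i / tau_rise) for i in range(n)]
    peak = max(p)
    return [pi / peak for pi in p]

def fit_amplitude(data: list[float], template: list[float]) -> float:
    """Least-squares amplitude of the template in the data.

    For white noise this dot-product estimator is the optimal filter;
    colored noise (as in the paper) requires whitening first.
    """
    num = sum(d * s for d, s in zip(data, template))
    den = sum(s * s for s in template)
    return num / den

tmpl = pulse_template(512, tau_rise=5.0, tau_fall=50.0)
true_energy = 6.0  # keV, hypothetical photon energy
data = [true_energy * s for s in tmpl]  # noiseless pulse for illustration
print(fit_amplitude(data, tmpl))  # recovers 6.0 on noiseless data
```

In a position-sensitive device the same fit is repeated against the distinct pulse shapes expected for each absorber position, and the best-fitting template identifies where the photon landed; the iterative scheme in the paper refines energy and position together.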